Apr 24 22:27:42.476863 ip-10-0-129-176 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 22:27:42.476877 ip-10-0-129-176 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 22:27:42.476887 ip-10-0-129-176 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 22:27:42.477190 ip-10-0-129-176 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 22:27:52.510979 ip-10-0-129-176 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 22:27:52.510993 ip-10-0-129-176 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot a504aca6670b4dfdb054f3bbc646f0d4 --
Apr 24 22:30:13.825142 ip-10-0-129-176 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 22:30:14.274052 ip-10-0-129-176 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 22:30:14.274052 ip-10-0-129-176 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 22:30:14.274052 ip-10-0-129-176 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 22:30:14.274052 ip-10-0-129-176 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 22:30:14.274052 ip-10-0-129-176 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 22:30:14.277069 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.276983    2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 22:30:14.282074 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282058    2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 22:30:14.282074 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282073    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 22:30:14.282135 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282078    2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 22:30:14.282135 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282081    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 22:30:14.282135 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282084    2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 22:30:14.282135 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282088    2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 22:30:14.282135 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282091    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 22:30:14.282135 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282094    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 22:30:14.282135 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282097    2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 22:30:14.282135 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282100    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 22:30:14.282135 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282102    2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 22:30:14.282135 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282105    2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 22:30:14.282135 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282108    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 22:30:14.282135 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282110    2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 22:30:14.282135 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282113    2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 22:30:14.282135 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282116    2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 22:30:14.282135 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282119    2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 22:30:14.282135 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282121    2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 22:30:14.282135 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282124    2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 22:30:14.282135 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282128    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 22:30:14.282135 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282130    2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 22:30:14.282135 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282133    2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 22:30:14.282661 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282136    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 22:30:14.282661 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282138    2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:30:14.282661 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282141    2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 22:30:14.282661 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282144    2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 22:30:14.282661 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282147    2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 22:30:14.282661 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282150    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 22:30:14.282661 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282153    2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 22:30:14.282661 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282155    2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 22:30:14.282661 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282158    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 22:30:14.282661 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282160    2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 22:30:14.282661 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282164    2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 22:30:14.282661 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282168    2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 22:30:14.282661 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282171    2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 22:30:14.282661 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282174    2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 22:30:14.282661 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282176    2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 22:30:14.282661 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282180    2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 22:30:14.282661 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282202    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 22:30:14.282661 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282206    2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 22:30:14.282661 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282209    2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 22:30:14.283381 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282212    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 22:30:14.283381 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282215    2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 22:30:14.283381 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282218    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 22:30:14.283381 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282221    2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 22:30:14.283381 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282223    2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 22:30:14.283381 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282226    2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 22:30:14.283381 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282229    2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 22:30:14.283381 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282231    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 22:30:14.283381 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282236    2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 22:30:14.283381 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282238    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:30:14.283381 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282241    2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 22:30:14.283381 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282244    2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 22:30:14.283381 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282247    2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 22:30:14.283381 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282250    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 22:30:14.283381 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282253    2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 22:30:14.283381 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282256    2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 22:30:14.283381 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282259    2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 22:30:14.283381 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282262    2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 22:30:14.283381 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282265    2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:30:14.283381 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282268    2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 22:30:14.283927 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282271    2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 22:30:14.283927 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282273    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 22:30:14.283927 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282276    2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 22:30:14.283927 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282279    2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 22:30:14.283927 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282281    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 22:30:14.283927 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282284    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 22:30:14.283927 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282286    2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:30:14.283927 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282289    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 22:30:14.283927 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282291    2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 22:30:14.283927 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282294    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 22:30:14.283927 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282296    2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 22:30:14.283927 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282299    2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 22:30:14.283927 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282301    2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 22:30:14.283927 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282304    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 22:30:14.283927 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282306    2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 22:30:14.283927 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282309    2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:30:14.283927 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282311    2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:30:14.283927 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282314    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 22:30:14.283927 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282316    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 22:30:14.284412 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282319    2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 22:30:14.284412 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282322    2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 22:30:14.284412 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282324    2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 22:30:14.284412 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282327    2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 22:30:14.284412 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282330    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 22:30:14.284412 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.282334    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 22:30:14.284812 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284796    2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 22:30:14.284812 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284812    2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:30:14.284867 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284815    2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 22:30:14.284867 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284819    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 22:30:14.284867 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284823    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 22:30:14.284867 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284826    2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 22:30:14.284867 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284829    2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 22:30:14.284867 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284832    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 22:30:14.284867 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284834    2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 22:30:14.284867 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284837    2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 22:30:14.284867 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284840    2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 22:30:14.284867 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284843    2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 22:30:14.284867 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284846    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 22:30:14.284867 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284848    2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 22:30:14.284867 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284851    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 22:30:14.284867 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284853    2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:30:14.284867 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284856    2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 22:30:14.284867 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284859    2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 22:30:14.284867 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284861    2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 22:30:14.284867 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284864    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 22:30:14.284867 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284867    2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 22:30:14.284867 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284869    2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 22:30:14.285373 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284885    2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 22:30:14.285373 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284889    2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 22:30:14.285373 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284892    2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 22:30:14.285373 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284895    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 22:30:14.285373 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284897    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 22:30:14.285373 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284900    2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 22:30:14.285373 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284903    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:30:14.285373 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284905    2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 22:30:14.285373 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284908    2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 22:30:14.285373 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284911    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 22:30:14.285373 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284913    2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 22:30:14.285373 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284916    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 22:30:14.285373 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284919    2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 22:30:14.285373 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284921    2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 22:30:14.285373 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284924    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 22:30:14.285373 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284928    2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 22:30:14.285373 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284930    2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 22:30:14.285373 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284933    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 22:30:14.285373 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284935    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 22:30:14.285373 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284938    2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 22:30:14.285927 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284941    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 22:30:14.285927 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284943    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 22:30:14.285927 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284946    2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 22:30:14.285927 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284948    2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 22:30:14.285927 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284951    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 22:30:14.285927 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284954    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 22:30:14.285927 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284956    2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 22:30:14.285927 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284959    2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 22:30:14.285927 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284961    2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 22:30:14.285927 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284964    2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 22:30:14.285927 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284966    2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 22:30:14.285927 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284969    2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 22:30:14.285927 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284971    2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 22:30:14.285927 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284975    2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:30:14.285927 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284978    2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 22:30:14.285927 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284980    2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 22:30:14.285927 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284983    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 22:30:14.285927 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284985    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 22:30:14.285927 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284988    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 22:30:14.286386 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284990    2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 22:30:14.286386 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284994    2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 22:30:14.286386 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.284998    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 22:30:14.286386 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.285001    2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 22:30:14.286386 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.285003    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 22:30:14.286386 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.285007    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 22:30:14.286386 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.285012    2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 22:30:14.286386 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.285016    2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 22:30:14.286386 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.285019    2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 22:30:14.286386 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.285022    2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:30:14.286386 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.285025    2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 22:30:14.286386 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.285027    2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 22:30:14.286386 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.285031    2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 22:30:14.286386 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.285033    2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 22:30:14.286386 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.285036    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 22:30:14.286386 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.285039    2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 22:30:14.286386 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.285041    2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 22:30:14.286386 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.285045    2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 22:30:14.286386 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.285048    2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 22:30:14.286862 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.285051    2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 22:30:14.286862 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.285053    2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:30:14.286862 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.285056    2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 22:30:14.286862 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.285059    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 22:30:14.286862 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.285061    2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 22:30:14.286862 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.285064    2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 22:30:14.286862 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285697    2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 22:30:14.286862 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285708    2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 22:30:14.286862 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285714    2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 22:30:14.286862 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285719    2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 22:30:14.286862 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285723    2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 22:30:14.286862 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285727    2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 22:30:14.286862 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285731    2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 22:30:14.286862 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285736    2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 22:30:14.286862 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285740    2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 22:30:14.286862 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285743    2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 22:30:14.286862 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285746    2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 22:30:14.286862 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285750    2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 22:30:14.286862 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285753    2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 22:30:14.286862 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285756    2574 flags.go:64] FLAG: --cgroup-root=""
Apr 24 22:30:14.286862 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285759    2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 22:30:14.286862 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285762    2574 flags.go:64] FLAG: --client-ca-file=""
Apr 24 22:30:14.286862 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285765    2574 flags.go:64] FLAG: --cloud-config=""
Apr 24 22:30:14.287449 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285768    2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 22:30:14.287449 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285771    2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 22:30:14.287449 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285775    2574 flags.go:64] FLAG: --cluster-domain=""
Apr 24 22:30:14.287449 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285778    2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 22:30:14.287449 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285781    2574 flags.go:64] FLAG: --config-dir=""
Apr 24 22:30:14.287449 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285784    2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 22:30:14.287449 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285788    2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 22:30:14.287449 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285791    2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 22:30:14.287449 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285796    2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 22:30:14.287449 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285799    2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 22:30:14.287449 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285803    2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 22:30:14.287449 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285806    2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 22:30:14.287449 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285809    2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 22:30:14.287449 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285812    2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 22:30:14.287449 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285815    2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 22:30:14.287449 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285818    2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 22:30:14.287449 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285823    2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 22:30:14.287449 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285826    2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 22:30:14.287449 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285829    2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 22:30:14.287449 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285832    2574 flags.go:64] FLAG:
--enable-load-reader="false" Apr 24 22:30:14.287449 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285835 2574 flags.go:64] FLAG: --enable-server="true" Apr 24 22:30:14.287449 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285837 2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 24 22:30:14.287449 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285842 2574 flags.go:64] FLAG: --event-burst="100" Apr 24 22:30:14.287449 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285845 2574 flags.go:64] FLAG: --event-qps="50" Apr 24 22:30:14.287449 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285848 2574 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 24 22:30:14.288076 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285851 2574 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 24 22:30:14.288076 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285854 2574 flags.go:64] FLAG: --eviction-hard="" Apr 24 22:30:14.288076 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285858 2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 24 22:30:14.288076 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285861 2574 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 24 22:30:14.288076 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285864 2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 24 22:30:14.288076 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285867 2574 flags.go:64] FLAG: --eviction-soft="" Apr 24 22:30:14.288076 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285870 2574 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 24 22:30:14.288076 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285885 2574 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 24 22:30:14.288076 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285888 2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 24 22:30:14.288076 ip-10-0-129-176 
kubenswrapper[2574]: I0424 22:30:14.285891 2574 flags.go:64] FLAG: --experimental-mounter-path="" Apr 24 22:30:14.288076 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285894 2574 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 24 22:30:14.288076 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285897 2574 flags.go:64] FLAG: --fail-swap-on="true" Apr 24 22:30:14.288076 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285900 2574 flags.go:64] FLAG: --feature-gates="" Apr 24 22:30:14.288076 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285903 2574 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 22:30:14.288076 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285907 2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 22:30:14.288076 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285910 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 22:30:14.288076 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285914 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 22:30:14.288076 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285917 2574 flags.go:64] FLAG: --healthz-port="10248" Apr 24 22:30:14.288076 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285921 2574 flags.go:64] FLAG: --help="false" Apr 24 22:30:14.288076 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285924 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-129-176.ec2.internal" Apr 24 22:30:14.288076 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285927 2574 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 22:30:14.288076 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285930 2574 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 22:30:14.288076 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285933 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 22:30:14.288654 ip-10-0-129-176 kubenswrapper[2574]: I0424 
22:30:14.285937 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 22:30:14.288654 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285941 2574 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 22:30:14.288654 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285944 2574 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 22:30:14.288654 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285947 2574 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 22:30:14.288654 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285950 2574 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 22:30:14.288654 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285953 2574 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 22:30:14.288654 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285956 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 22:30:14.288654 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285959 2574 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 22:30:14.288654 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285962 2574 flags.go:64] FLAG: --kube-reserved="" Apr 24 22:30:14.288654 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285965 2574 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 22:30:14.288654 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285968 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 22:30:14.288654 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285971 2574 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 22:30:14.288654 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285974 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 22:30:14.288654 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285977 2574 flags.go:64] FLAG: --lock-file="" Apr 24 22:30:14.288654 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285980 
2574 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 22:30:14.288654 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285982 2574 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 22:30:14.288654 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285985 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 22:30:14.288654 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285991 2574 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 22:30:14.288654 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285994 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 22:30:14.288654 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.285997 2574 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 22:30:14.288654 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286000 2574 flags.go:64] FLAG: --logging-format="text" Apr 24 22:30:14.288654 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286003 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 22:30:14.288654 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286007 2574 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 22:30:14.288654 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286010 2574 flags.go:64] FLAG: --manifest-url="" Apr 24 22:30:14.289241 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286013 2574 flags.go:64] FLAG: --manifest-url-header="" Apr 24 22:30:14.289241 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286018 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 22:30:14.289241 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286021 2574 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 22:30:14.289241 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286026 2574 flags.go:64] FLAG: --max-pods="110" Apr 24 22:30:14.289241 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286029 2574 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 22:30:14.289241 
ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286032 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 22:30:14.289241 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286036 2574 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 22:30:14.289241 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286039 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 22:30:14.289241 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286042 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 22:30:14.289241 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286045 2574 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 22:30:14.289241 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286047 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 22:30:14.289241 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286055 2574 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 22:30:14.289241 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286058 2574 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 22:30:14.289241 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286061 2574 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 22:30:14.289241 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286064 2574 flags.go:64] FLAG: --pod-cidr="" Apr 24 22:30:14.289241 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286066 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 22:30:14.289241 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286072 2574 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 22:30:14.289241 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286075 2574 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 22:30:14.289241 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286079 2574 flags.go:64] 
FLAG: --pods-per-core="0" Apr 24 22:30:14.289241 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286081 2574 flags.go:64] FLAG: --port="10250" Apr 24 22:30:14.289241 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286084 2574 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 22:30:14.289241 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286087 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0440feccb14e558f1" Apr 24 22:30:14.289241 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286091 2574 flags.go:64] FLAG: --qos-reserved="" Apr 24 22:30:14.289241 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286094 2574 flags.go:64] FLAG: --read-only-port="10255" Apr 24 22:30:14.289823 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286097 2574 flags.go:64] FLAG: --register-node="true" Apr 24 22:30:14.289823 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286100 2574 flags.go:64] FLAG: --register-schedulable="true" Apr 24 22:30:14.289823 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286103 2574 flags.go:64] FLAG: --register-with-taints="" Apr 24 22:30:14.289823 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286106 2574 flags.go:64] FLAG: --registry-burst="10" Apr 24 22:30:14.289823 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286109 2574 flags.go:64] FLAG: --registry-qps="5" Apr 24 22:30:14.289823 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286112 2574 flags.go:64] FLAG: --reserved-cpus="" Apr 24 22:30:14.289823 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286114 2574 flags.go:64] FLAG: --reserved-memory="" Apr 24 22:30:14.289823 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286118 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 22:30:14.289823 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286121 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 22:30:14.289823 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286124 2574 flags.go:64] FLAG: 
--rotate-certificates="false" Apr 24 22:30:14.289823 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286127 2574 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 22:30:14.289823 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286131 2574 flags.go:64] FLAG: --runonce="false" Apr 24 22:30:14.289823 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286134 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 22:30:14.289823 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286137 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 22:30:14.289823 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286140 2574 flags.go:64] FLAG: --seccomp-default="false" Apr 24 22:30:14.289823 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286143 2574 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 22:30:14.289823 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286146 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 22:30:14.289823 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286149 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 22:30:14.289823 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286152 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 22:30:14.289823 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286156 2574 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 22:30:14.289823 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286159 2574 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 22:30:14.289823 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286162 2574 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 22:30:14.289823 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286165 2574 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 22:30:14.289823 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286167 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 
22:30:14.289823 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286171 2574 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 22:30:14.289823 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286174 2574 flags.go:64] FLAG: --system-cgroups="" Apr 24 22:30:14.290470 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286177 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 22:30:14.290470 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286182 2574 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 22:30:14.290470 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286185 2574 flags.go:64] FLAG: --tls-cert-file="" Apr 24 22:30:14.290470 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286188 2574 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 22:30:14.290470 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286193 2574 flags.go:64] FLAG: --tls-min-version="" Apr 24 22:30:14.290470 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286195 2574 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 22:30:14.290470 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286198 2574 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 22:30:14.290470 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286201 2574 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 22:30:14.290470 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286204 2574 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 22:30:14.290470 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286207 2574 flags.go:64] FLAG: --v="2" Apr 24 22:30:14.290470 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286211 2574 flags.go:64] FLAG: --version="false" Apr 24 22:30:14.290470 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286215 2574 flags.go:64] FLAG: --vmodule="" Apr 24 22:30:14.290470 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286219 2574 flags.go:64] FLAG: 
--volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 22:30:14.290470 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.286222 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 22:30:14.290470 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286309 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 22:30:14.290470 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286314 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 22:30:14.290470 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286319 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 22:30:14.290470 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286322 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 22:30:14.290470 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286326 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 22:30:14.290470 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286329 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 22:30:14.290470 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286332 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 24 22:30:14.290470 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286334 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 22:30:14.291034 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286337 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 22:30:14.291034 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286340 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 22:30:14.291034 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286342 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 22:30:14.291034 
ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286345 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 22:30:14.291034 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286348 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 22:30:14.291034 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286350 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 22:30:14.291034 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286353 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 22:30:14.291034 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286356 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 22:30:14.291034 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286359 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 22:30:14.291034 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286361 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 22:30:14.291034 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286364 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 22:30:14.291034 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286367 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 22:30:14.291034 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286370 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 22:30:14.291034 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286372 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 22:30:14.291034 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286375 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 22:30:14.291034 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286377 2574 feature_gate.go:328] 
unrecognized feature gate: ShortCertRotation Apr 24 22:30:14.291034 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286380 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 22:30:14.291034 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286383 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 22:30:14.291034 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286386 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 22:30:14.291555 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286390 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 22:30:14.291555 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286394 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 22:30:14.291555 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286397 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 22:30:14.291555 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286400 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 22:30:14.291555 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286404 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 22:30:14.291555 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286407 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 22:30:14.291555 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286410 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 22:30:14.291555 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286413 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 22:30:14.291555 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286416 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 
22:30:14.291555 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286419 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 22:30:14.291555 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286422 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 22:30:14.291555 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286425 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 22:30:14.291555 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286428 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 22:30:14.291555 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286431 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 22:30:14.291555 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286433 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 22:30:14.291555 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286436 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 22:30:14.291555 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286438 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 22:30:14.291555 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286441 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 22:30:14.291555 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286443 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 22:30:14.292122 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286446 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 22:30:14.292122 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286449 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 22:30:14.292122 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286452 2574 
feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 22:30:14.292122 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286455 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 22:30:14.292122 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286457 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 22:30:14.292122 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286460 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 22:30:14.292122 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286462 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 22:30:14.292122 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286465 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 22:30:14.292122 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286467 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 22:30:14.292122 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286470 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 22:30:14.292122 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286472 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 22:30:14.292122 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286475 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 22:30:14.292122 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286478 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 22:30:14.292122 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286480 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 22:30:14.292122 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286483 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 22:30:14.292122 
ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286485 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 22:30:14.292122 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286488 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 22:30:14.292122 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286494 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 22:30:14.292122 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286497 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 22:30:14.292122 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286500 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 22:30:14.292709 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286502 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 22:30:14.292709 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286505 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 22:30:14.292709 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286508 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 22:30:14.292709 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286510 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 22:30:14.292709 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286514 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 22:30:14.292709 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286517 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 22:30:14.292709 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286521 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 22:30:14.292709 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286524 2574 
feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 22:30:14.292709 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286526 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 22:30:14.292709 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286529 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 22:30:14.292709 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286531 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 22:30:14.292709 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286534 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 22:30:14.292709 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286537 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 22:30:14.292709 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286539 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 22:30:14.292709 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286542 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 22:30:14.292709 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286545 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 22:30:14.292709 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286547 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 22:30:14.292709 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286550 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 22:30:14.292709 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286552 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 22:30:14.292709 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.286555 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 22:30:14.293572 ip-10-0-129-176 
kubenswrapper[2574]: I0424 22:30:14.287487 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 22:30:14.294970 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.294949 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 24 22:30:14.294970 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.294970 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 24 22:30:14.295117 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295061 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 22:30:14.295117 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295069 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 22:30:14.295117 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295073 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 22:30:14.295117 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295078 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 22:30:14.295117 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295086 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 22:30:14.295117 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295093 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 22:30:14.295117 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295098 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 22:30:14.295117 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295103 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 22:30:14.295117 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295107 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 22:30:14.295117 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295112 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 22:30:14.295117 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295116 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 22:30:14.295117 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295121 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 22:30:14.295117 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295125 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 22:30:14.295684 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295131 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 22:30:14.295684 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295136 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 22:30:14.295684 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295140 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 22:30:14.295684 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295145 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 22:30:14.295684 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295150 2574 feature_gate.go:328] unrecognized feature 
gate: AzureDedicatedHosts Apr 24 22:30:14.295684 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295154 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 22:30:14.295684 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295158 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 22:30:14.295684 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295162 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 22:30:14.295684 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295166 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 22:30:14.295684 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295171 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 22:30:14.295684 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295175 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 22:30:14.295684 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295179 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 22:30:14.295684 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295184 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 22:30:14.295684 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295190 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 22:30:14.295684 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295194 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 22:30:14.295684 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295198 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 22:30:14.295684 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295202 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 22:30:14.295684 ip-10-0-129-176 kubenswrapper[2574]: W0424 
22:30:14.295207 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 22:30:14.295684 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295211 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 22:30:14.295684 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295215 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 22:30:14.296407 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295219 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 22:30:14.296407 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295223 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 22:30:14.296407 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295227 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 22:30:14.296407 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295231 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 22:30:14.296407 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295235 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 22:30:14.296407 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295239 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 22:30:14.296407 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295243 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 22:30:14.296407 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295248 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 22:30:14.296407 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295252 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 22:30:14.296407 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295256 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 22:30:14.296407 ip-10-0-129-176 
kubenswrapper[2574]: W0424 22:30:14.295261 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 22:30:14.296407 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295267 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 22:30:14.296407 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295274 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 22:30:14.296407 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295282 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 22:30:14.296407 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295287 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 22:30:14.296407 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295291 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 22:30:14.296407 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295295 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 22:30:14.296407 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295299 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 22:30:14.296407 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295303 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 22:30:14.296981 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295308 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 22:30:14.296981 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295312 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 22:30:14.296981 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295316 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 22:30:14.296981 ip-10-0-129-176 kubenswrapper[2574]: W0424 
22:30:14.295320 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 22:30:14.296981 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295324 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 22:30:14.296981 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295329 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 22:30:14.296981 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295333 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 22:30:14.296981 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295337 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 22:30:14.296981 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295342 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 22:30:14.296981 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295346 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 22:30:14.296981 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295350 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 22:30:14.296981 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295354 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 22:30:14.296981 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295359 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 22:30:14.296981 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295363 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 22:30:14.296981 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295367 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 22:30:14.296981 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295371 2574 feature_gate.go:328] unrecognized feature 
gate: ManagedBootImagesAWS Apr 24 22:30:14.296981 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295375 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 22:30:14.296981 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295380 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 22:30:14.296981 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295384 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 22:30:14.297621 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295388 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 22:30:14.297621 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295392 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 22:30:14.297621 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295397 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 22:30:14.297621 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295401 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 22:30:14.297621 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295405 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 22:30:14.297621 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295409 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 22:30:14.297621 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295413 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 22:30:14.297621 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295418 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 24 22:30:14.297621 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295424 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 22:30:14.297621 ip-10-0-129-176 kubenswrapper[2574]: W0424 
22:30:14.295428 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 22:30:14.297621 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295432 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 22:30:14.297621 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295436 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 22:30:14.297621 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295440 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 22:30:14.297621 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295444 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 22:30:14.297621 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295448 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 22:30:14.298249 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.295456 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 22:30:14.298249 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295608 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 22:30:14.298249 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295616 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 22:30:14.298249 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295622 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 
24 22:30:14.298249 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295627 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 22:30:14.298249 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295632 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 22:30:14.298249 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295636 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 22:30:14.298249 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295640 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 22:30:14.298249 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295644 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 22:30:14.298249 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295648 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 22:30:14.298249 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295652 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 22:30:14.298249 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295657 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 22:30:14.298249 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295661 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 22:30:14.298249 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295665 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 22:30:14.298249 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295669 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 22:30:14.298923 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295674 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 22:30:14.298923 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295678 2574 
feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 22:30:14.298923 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295682 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 22:30:14.298923 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295686 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 22:30:14.298923 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295690 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 22:30:14.298923 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295694 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 22:30:14.298923 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295698 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 22:30:14.298923 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295703 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 22:30:14.298923 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295707 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 22:30:14.298923 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295712 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 22:30:14.298923 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295716 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 22:30:14.298923 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295721 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 22:30:14.298923 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295726 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 22:30:14.298923 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295730 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 22:30:14.298923 
ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295734 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 22:30:14.298923 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295739 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 22:30:14.298923 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295742 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 22:30:14.298923 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295746 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 22:30:14.298923 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295753 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 22:30:14.299436 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295758 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 22:30:14.299436 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295763 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 22:30:14.299436 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295767 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 22:30:14.299436 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295772 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 22:30:14.299436 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295776 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 22:30:14.299436 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295780 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 22:30:14.299436 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295784 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 22:30:14.299436 ip-10-0-129-176 kubenswrapper[2574]: W0424 
22:30:14.295788 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 22:30:14.299436 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295793 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 22:30:14.299436 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295797 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 22:30:14.299436 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295801 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 22:30:14.299436 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295806 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 22:30:14.299436 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295810 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 24 22:30:14.299436 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295814 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 22:30:14.299436 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295817 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 22:30:14.299436 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295822 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 22:30:14.299436 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295826 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 22:30:14.299436 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295830 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 22:30:14.299436 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295833 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 22:30:14.299436 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295839 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
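[Editor's note] The long runs of `feature_gate.go:328] unrecognized feature gate:` warnings above repeat the same gate names once per configuration pass. A sketch for deduplicating and counting them from a capture like this one (the heredoc sample stands in for real `journalctl -u kubelet` output):

```shell
#!/bin/sh
# Sample lines in the same shape as the kubelet journal output above;
# in practice you would feed this from `journalctl -u kubelet` instead.
cat > /tmp/kubelet-sample.log <<'EOF'
W0424 22:30:14.295806 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
W0424 22:30:14.295810 2574 feature_gate.go:328] unrecognized feature gate: Example
W0424 22:30:14.296001 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
EOF

# The gate name is the last whitespace-separated field; print it,
# then count each distinct name, most frequent first.
awk '/unrecognized feature gate:/ {print $NF}' /tmp/kubelet-sample.log \
  | sort | uniq -c | sort -rn
```

On the sample above this prints `GCPCustomAPIEndpoints` with count 2 and `Example` with count 1; these gates are typically OpenShift-level feature gates that the kubelet's own gate registry does not know, which is why it warns rather than fails.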
Apr 24 22:30:14.300081 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295844 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 22:30:14.300081 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295848 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 22:30:14.300081 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295853 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 22:30:14.300081 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295857 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 22:30:14.300081 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295861 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 22:30:14.300081 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295866 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 22:30:14.300081 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295889 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 22:30:14.300081 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295894 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 22:30:14.300081 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295898 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 22:30:14.300081 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295902 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 22:30:14.300081 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295906 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 22:30:14.300081 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295910 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 22:30:14.300081 ip-10-0-129-176 
kubenswrapper[2574]: W0424 22:30:14.295914 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 22:30:14.300081 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295918 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 22:30:14.300081 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295923 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 22:30:14.300081 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295927 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 22:30:14.300081 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295932 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 22:30:14.300081 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295936 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 22:30:14.300081 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295940 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 22:30:14.300081 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295944 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 22:30:14.300573 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295948 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 22:30:14.300573 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295952 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 22:30:14.300573 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295956 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 22:30:14.300573 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295960 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 22:30:14.300573 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295964 2574 feature_gate.go:328] unrecognized feature gate: 
KMSEncryptionProvider Apr 24 22:30:14.300573 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295968 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 22:30:14.300573 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295972 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 22:30:14.300573 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295976 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 22:30:14.300573 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295980 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 22:30:14.300573 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295984 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 22:30:14.300573 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295988 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 22:30:14.300573 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295992 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 22:30:14.300573 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:14.295997 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 22:30:14.300573 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.296005 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 22:30:14.300573 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.296818 2574 server.go:962] "Client rotation is on, will bootstrap in 
background" Apr 24 22:30:14.300999 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.299570 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 24 22:30:14.300999 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.300924 2574 server.go:1019] "Starting client certificate rotation" Apr 24 22:30:14.301061 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.301024 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 22:30:14.301091 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.301064 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 22:30:14.327146 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.327129 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 22:30:14.331601 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.331583 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 22:30:14.349147 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.349123 2574 log.go:25] "Validated CRI v1 runtime API" Apr 24 22:30:14.355082 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.355067 2574 log.go:25] "Validated CRI v1 image API" Apr 24 22:30:14.356304 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.356287 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 24 22:30:14.358776 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.358758 2574 fs.go:135] Filesystem UUIDs: map[06bfb089-ff8b-44dd-93d2-763e6464c179:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 f08bc7ca-4649-41b9-be5e-b2ef31637e96:/dev/nvme0n1p4] Apr 24 22:30:14.358833 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.358776 2574 fs.go:136] Filesystem partitions: 
map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 24 22:30:14.364745 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.364623 2574 manager.go:217] Machine: {Timestamp:2026-04-24 22:30:14.362549136 +0000 UTC m=+0.412405004 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3124536 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec20ef40d245181d27e3aa1fba42913a SystemUUID:ec20ef40-d245-181d-27e3-aa1fba42913a BootID:a504aca6-670b-4dfd-b054-f3bbc646f0d4 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:12:f6:c5:cf:a7 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:12:f6:c5:cf:a7 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:fe:fb:28:ce:17:c1 Speed:0 Mtu:1500}] 
Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 24 22:30:14.364745 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.364735 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 24 22:30:14.364859 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.364815 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 24 22:30:14.367999 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.367973 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 24 22:30:14.368163 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.368001 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-176.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 22:30:14.368264 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.368172 2574 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 22:30:14.368264 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.368181 2574 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 22:30:14.368264 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.368200 2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 22:30:14.369188 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.369177 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 22:30:14.370149 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.370139 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 24 22:30:14.370257 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.370248 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 22:30:14.370484 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.370468 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 22:30:14.373284 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.373274 2574 kubelet.go:491] "Attempting to sync node with API server" Apr 24 22:30:14.373322 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.373289 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 22:30:14.373322 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.373300 2574 file.go:69] "Watching path" 
path="/etc/kubernetes/manifests" Apr 24 22:30:14.373322 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.373309 2574 kubelet.go:397] "Adding apiserver pod source" Apr 24 22:30:14.373322 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.373318 2574 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 24 22:30:14.374523 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.374511 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 22:30:14.374572 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.374530 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 22:30:14.377899 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.377868 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 22:30:14.380283 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.380270 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 22:30:14.381752 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.381741 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 22:30:14.381791 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.381758 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 22:30:14.381791 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.381764 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 22:30:14.381791 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.381770 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 22:30:14.381791 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.381775 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 
22:30:14.381791 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.381781 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 22:30:14.381791 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.381788 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 22:30:14.381976 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.381794 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 22:30:14.381976 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.381802 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 22:30:14.381976 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.381809 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 22:30:14.381976 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.381825 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 22:30:14.381976 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.381834 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 22:30:14.383834 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.383822 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 22:30:14.383890 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.383835 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 22:30:14.387324 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.387311 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 22:30:14.387442 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.387346 2574 server.go:1295] "Started kubelet" Apr 24 22:30:14.387488 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.387455 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 22:30:14.388271 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.387480 2574 ratelimit.go:55] "Setting rate limiting for endpoint" 
service="podresources" qps=100 burstTokens=10 Apr 24 22:30:14.388333 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.388298 2574 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 22:30:14.388844 ip-10-0-129-176 systemd[1]: Started Kubernetes Kubelet. Apr 24 22:30:14.390397 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.390382 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 22:30:14.390919 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.390898 2574 server.go:317] "Adding debug handlers to kubelet server" Apr 24 22:30:14.394452 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:14.394429 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-176.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 22:30:14.394542 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.394506 2574 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-176.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 22:30:14.394542 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:14.394512 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 22:30:14.395530 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.395516 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 22:30:14.395618 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.395599 2574 
certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 22:30:14.397540 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.397514 2574 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 22:30:14.397540 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.397536 2574 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 22:30:14.397682 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.397512 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 22:30:14.397682 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:14.397549 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-176.ec2.internal\" not found" Apr 24 22:30:14.397772 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.397695 2574 reconstruct.go:97] "Volume reconstruction finished" Apr 24 22:30:14.397772 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.397704 2574 reconciler.go:26] "Reconciler: start to sync state" Apr 24 22:30:14.397951 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.397936 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 22:30:14.398008 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.397955 2574 factory.go:55] Registering systemd factory Apr 24 22:30:14.398008 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.397964 2574 factory.go:223] Registration of the systemd container factory successfully Apr 24 22:30:14.398161 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.398147 2574 factory.go:153] Registering CRI-O factory Apr 24 22:30:14.398199 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.398166 2574 factory.go:223] Registration of the crio container factory successfully Apr 24 22:30:14.398199 
ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.398185 2574 factory.go:103] Registering Raw factory Apr 24 22:30:14.398199 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.398195 2574 manager.go:1196] Started watching for new ooms in manager Apr 24 22:30:14.398742 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.398727 2574 manager.go:319] Starting recovery of all containers Apr 24 22:30:14.401148 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:14.401115 2574 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 22:30:14.407253 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:14.407082 2574 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-129-176.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 24 22:30:14.407376 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:14.407354 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 22:30:14.408323 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:14.407285 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-176.ec2.internal.18a96b94b3a030a3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-176.ec2.internal,UID:ip-10-0-129-176.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-176.ec2.internal,},FirstTimestamp:2026-04-24 22:30:14.387323043 +0000 UTC m=+0.437178910,LastTimestamp:2026-04-24 22:30:14.387323043 +0000 UTC m=+0.437178910,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-176.ec2.internal,}" Apr 24 22:30:14.409193 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.409179 2574 manager.go:324] Recovery completed Apr 24 22:30:14.410582 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:14.410493 2574 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 24 22:30:14.413964 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.413948 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 22:30:14.416331 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.416309 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-176.ec2.internal" event="NodeHasSufficientMemory" Apr 24 22:30:14.416399 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.416342 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-176.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 22:30:14.416399 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.416355 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-176.ec2.internal" event="NodeHasSufficientPID" Apr 24 22:30:14.416759 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.416745 2574 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 
22:30:14.416759 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.416758 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 22:30:14.416842 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.416776 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 24 22:30:14.418673 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.418658 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-bzwrf" Apr 24 22:30:14.419222 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.419211 2574 policy_none.go:49] "None policy: Start" Apr 24 22:30:14.419259 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.419226 2574 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 22:30:14.419259 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.419236 2574 state_mem.go:35] "Initializing new in-memory state store" Apr 24 22:30:14.419590 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:14.419522 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-176.ec2.internal.18a96b94b55aca59 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-176.ec2.internal,UID:ip-10-0-129-176.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-129-176.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-129-176.ec2.internal,},FirstTimestamp:2026-04-24 22:30:14.416329305 +0000 UTC m=+0.466185173,LastTimestamp:2026-04-24 22:30:14.416329305 +0000 UTC m=+0.466185173,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-176.ec2.internal,}" 
Apr 24 22:30:14.425890 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.425859 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-bzwrf" Apr 24 22:30:14.432732 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:14.432673 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-176.ec2.internal.18a96b94b55b172f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-176.ec2.internal,UID:ip-10-0-129-176.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-129-176.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-129-176.ec2.internal,},FirstTimestamp:2026-04-24 22:30:14.416348975 +0000 UTC m=+0.466204845,LastTimestamp:2026-04-24 22:30:14.416348975 +0000 UTC m=+0.466204845,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-176.ec2.internal,}" Apr 24 22:30:14.462985 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.461423 2574 manager.go:341] "Starting Device Plugin manager" Apr 24 22:30:14.462985 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:14.461448 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 22:30:14.462985 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.461456 2574 server.go:85] "Starting device plugin registration server" Apr 24 22:30:14.462985 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.461643 2574 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 22:30:14.462985 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.461655 
2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 22:30:14.462985 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.461773 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 22:30:14.462985 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.461832 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 22:30:14.462985 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.461838 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 22:30:14.462985 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:14.462341 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 22:30:14.462985 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:14.462378 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-176.ec2.internal\" not found" Apr 24 22:30:14.529147 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.529074 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 22:30:14.530332 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.530310 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 22:30:14.530414 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.530343 2574 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 22:30:14.530414 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.530365 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 24 22:30:14.530414 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.530377 2574 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 22:30:14.530522 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:14.530415 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 22:30:14.533489 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.533473 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 22:30:14.562456 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.562440 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 22:30:14.563245 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.563230 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-176.ec2.internal" event="NodeHasSufficientMemory" Apr 24 22:30:14.563313 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.563257 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-176.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 22:30:14.563313 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.563268 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-176.ec2.internal" event="NodeHasSufficientPID" Apr 24 22:30:14.563313 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.563290 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-176.ec2.internal" Apr 24 22:30:14.575597 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.575578 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-176.ec2.internal" Apr 24 22:30:14.575653 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:14.575602 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-176.ec2.internal\": node \"ip-10-0-129-176.ec2.internal\" not found" Apr 24 
22:30:14.630485 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.630464 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-176.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-176.ec2.internal"] Apr 24 22:30:14.630552 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.630533 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 22:30:14.631164 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:14.631146 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-176.ec2.internal\" not found" Apr 24 22:30:14.631341 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.631327 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-176.ec2.internal" event="NodeHasSufficientMemory" Apr 24 22:30:14.631395 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.631354 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-176.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 22:30:14.631395 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.631369 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-176.ec2.internal" event="NodeHasSufficientPID" Apr 24 22:30:14.632558 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.632546 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 22:30:14.632703 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.632690 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-176.ec2.internal" Apr 24 22:30:14.632749 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.632717 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 22:30:14.634121 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.634105 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-176.ec2.internal" event="NodeHasSufficientMemory" Apr 24 22:30:14.634217 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.634125 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-176.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 22:30:14.634217 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.634134 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-176.ec2.internal" event="NodeHasSufficientMemory" Apr 24 22:30:14.634217 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.634158 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-176.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 22:30:14.634217 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.634172 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-176.ec2.internal" event="NodeHasSufficientPID" Apr 24 22:30:14.634217 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.634138 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-176.ec2.internal" event="NodeHasSufficientPID" Apr 24 22:30:14.635508 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.635492 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-176.ec2.internal" Apr 24 22:30:14.635592 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.635520 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 22:30:14.636438 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.636407 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-176.ec2.internal" event="NodeHasSufficientMemory" Apr 24 22:30:14.636517 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.636442 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-176.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 22:30:14.636517 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.636457 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-176.ec2.internal" event="NodeHasSufficientPID" Apr 24 22:30:14.656420 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:14.656402 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-176.ec2.internal\" not found" node="ip-10-0-129-176.ec2.internal" Apr 24 22:30:14.660535 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:14.660521 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-176.ec2.internal\" not found" node="ip-10-0-129-176.ec2.internal" Apr 24 22:30:14.699016 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.698995 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/868c258382b869546819753aefbb6e79-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-176.ec2.internal\" (UID: \"868c258382b869546819753aefbb6e79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-176.ec2.internal" Apr 24 22:30:14.699079 ip-10-0-129-176 
kubenswrapper[2574]: I0424 22:30:14.699021 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/868c258382b869546819753aefbb6e79-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-176.ec2.internal\" (UID: \"868c258382b869546819753aefbb6e79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-176.ec2.internal"
Apr 24 22:30:14.699079 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.699041 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b3281f0d425c69d03b02f901cc1387c8-config\") pod \"kube-apiserver-proxy-ip-10-0-129-176.ec2.internal\" (UID: \"b3281f0d425c69d03b02f901cc1387c8\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-176.ec2.internal"
Apr 24 22:30:14.731445 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:14.731425 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-176.ec2.internal\" not found"
Apr 24 22:30:14.799893 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.799820 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/868c258382b869546819753aefbb6e79-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-176.ec2.internal\" (UID: \"868c258382b869546819753aefbb6e79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-176.ec2.internal"
Apr 24 22:30:14.799893 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.799849 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/868c258382b869546819753aefbb6e79-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-176.ec2.internal\" (UID: \"868c258382b869546819753aefbb6e79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-176.ec2.internal"
Apr 24 22:30:14.799893 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.799866 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b3281f0d425c69d03b02f901cc1387c8-config\") pod \"kube-apiserver-proxy-ip-10-0-129-176.ec2.internal\" (UID: \"b3281f0d425c69d03b02f901cc1387c8\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-176.ec2.internal"
Apr 24 22:30:14.800056 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.799924 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b3281f0d425c69d03b02f901cc1387c8-config\") pod \"kube-apiserver-proxy-ip-10-0-129-176.ec2.internal\" (UID: \"b3281f0d425c69d03b02f901cc1387c8\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-176.ec2.internal"
Apr 24 22:30:14.800056 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.799931 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/868c258382b869546819753aefbb6e79-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-176.ec2.internal\" (UID: \"868c258382b869546819753aefbb6e79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-176.ec2.internal"
Apr 24 22:30:14.800056 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.799924 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/868c258382b869546819753aefbb6e79-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-176.ec2.internal\" (UID: \"868c258382b869546819753aefbb6e79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-176.ec2.internal"
Apr 24 22:30:14.831522 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:14.831497 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-176.ec2.internal\" not found"
Apr 24 22:30:14.932145 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:14.932109 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-176.ec2.internal\" not found"
Apr 24 22:30:14.958292 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.958271 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-176.ec2.internal"
Apr 24 22:30:14.962865 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:14.962847 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-176.ec2.internal"
Apr 24 22:30:15.033062 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:15.033034 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-176.ec2.internal\" not found"
Apr 24 22:30:15.133610 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:15.133554 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-176.ec2.internal\" not found"
Apr 24 22:30:15.234099 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:15.234076 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-176.ec2.internal\" not found"
Apr 24 22:30:15.300644 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:15.300625 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 22:30:15.301193 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:15.300750 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 22:30:15.335076 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:15.335054 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-176.ec2.internal\" not found"
Apr 24 22:30:15.396742 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:15.396688 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 22:30:15.427269 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:15.427241 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 22:25:14 +0000 UTC" deadline="2027-12-11 08:21:49.341759973 +0000 UTC"
Apr 24 22:30:15.427269 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:15.427266 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14289h51m33.914497265s"
Apr 24 22:30:15.435375 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:15.435357 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-176.ec2.internal\" not found"
Apr 24 22:30:15.436091 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:15.436066 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 22:30:15.474236 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:15.474209 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod868c258382b869546819753aefbb6e79.slice/crio-ceb03ba7d24e375d56f676a2d399123acd2869b68546cb1bfc72fad81f16af61 WatchSource:0}: Error finding container ceb03ba7d24e375d56f676a2d399123acd2869b68546cb1bfc72fad81f16af61: Status 404 returned error can't find the container with id ceb03ba7d24e375d56f676a2d399123acd2869b68546cb1bfc72fad81f16af61
Apr 24 22:30:15.474744 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:15.474718 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3281f0d425c69d03b02f901cc1387c8.slice/crio-93ab5611f8c22ec67a242602f8683771fc063ae78d35aaaae97ace16017ac269 WatchSource:0}: Error finding container 93ab5611f8c22ec67a242602f8683771fc063ae78d35aaaae97ace16017ac269: Status 404 returned error can't find the container with id 93ab5611f8c22ec67a242602f8683771fc063ae78d35aaaae97ace16017ac269
Apr 24 22:30:15.479100 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:15.479083 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 22:30:15.493472 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:15.493452 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-w9qzr"
Apr 24 22:30:15.513490 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:15.513467 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-w9qzr"
Apr 24 22:30:15.529274 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:15.529257 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 22:30:15.533031 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:15.532989 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-176.ec2.internal" event={"ID":"868c258382b869546819753aefbb6e79","Type":"ContainerStarted","Data":"ceb03ba7d24e375d56f676a2d399123acd2869b68546cb1bfc72fad81f16af61"}
Apr 24 22:30:15.533859 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:15.533840 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-176.ec2.internal" event={"ID":"b3281f0d425c69d03b02f901cc1387c8","Type":"ContainerStarted","Data":"93ab5611f8c22ec67a242602f8683771fc063ae78d35aaaae97ace16017ac269"}
Apr 24 22:30:15.535985 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:15.535969 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-176.ec2.internal\" not found"
Apr 24 22:30:15.595278 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:15.595256 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 22:30:15.596234 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:15.596220 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-176.ec2.internal"
Apr 24 22:30:15.620395 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:15.620377 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 22:30:15.622296 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:15.622280 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-176.ec2.internal"
Apr 24 22:30:15.639433 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:15.639415 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 22:30:15.687134 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:15.687090 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 22:30:16.374515 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.374487 2574 apiserver.go:52] "Watching apiserver"
Apr 24 22:30:16.382099 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.382074 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 24 22:30:16.384047 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.384008 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-129-176.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h7b8","openshift-image-registry/node-ca-tb9tr","openshift-network-diagnostics/network-check-target-lh47p","openshift-network-operator/iptables-alerter-nzrnj","openshift-ovn-kubernetes/ovnkube-node-xsvks","kube-system/konnectivity-agent-vqh8g","openshift-cluster-node-tuning-operator/tuned-tlfts","openshift-dns/node-resolver-t25xz","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-176.ec2.internal","openshift-multus/multus-additional-cni-plugins-2mz4f","openshift-multus/multus-jdw5l","openshift-multus/network-metrics-daemon-n8ntw"]
Apr 24 22:30:16.386485 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.386463 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vqh8g"
Apr 24 22:30:16.388606 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.388584 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h7b8"
Apr 24 22:30:16.390715 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.390695 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 24 22:30:16.390834 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.390743 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 24 22:30:16.390834 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.390700 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-tb9tr"
Apr 24 22:30:16.391064 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.391045 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-zdncl\""
Apr 24 22:30:16.392799 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.392774 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lh47p"
Apr 24 22:30:16.392946 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:16.392923 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lh47p" podUID="bc853d31-7ceb-415d-a71b-f18ee833a50c"
Apr 24 22:30:16.393691 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.393676 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 24 22:30:16.393779 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.393731 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 24 22:30:16.393850 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.393826 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 24 22:30:16.394017 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.394003 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-js2l7\""
Apr 24 22:30:16.396296 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.396272 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 24 22:30:16.397152 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.396321 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-ppgwm\""
Apr 24 22:30:16.397152 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.397049 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 24 22:30:16.397336 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.397184 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-nzrnj"
Apr 24 22:30:16.397336 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.397332 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xsvks"
Apr 24 22:30:16.397800 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.397772 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 24 22:30:16.400499 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.400476 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tlfts"
Apr 24 22:30:16.400589 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.400576 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-kfp54\""
Apr 24 22:30:16.400674 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.400491 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 24 22:30:16.400936 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.400916 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 24 22:30:16.401471 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.401452 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 24 22:30:16.401950 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.401931 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 24 22:30:16.402820 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.402802 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-t25xz"
Apr 24 22:30:16.403042 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.402919 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 24 22:30:16.403292 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.403275 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 24 22:30:16.404498 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.404481 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 24 22:30:16.404498 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.404496 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 24 22:30:16.404718 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.404699 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 24 22:30:16.404817 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.404717 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-dzwh4\""
Apr 24 22:30:16.405177 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.405159 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2mz4f"
Apr 24 22:30:16.407438 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.407413 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 24 22:30:16.407557 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.407465 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jdw5l"
Apr 24 22:30:16.407557 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.407536 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 24 22:30:16.407670 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.407659 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-f8tbk\""
Apr 24 22:30:16.408414 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.408394 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn9hf\" (UniqueName: \"kubernetes.io/projected/b188c294-c06f-4cc9-ab29-5edd0333288d-kube-api-access-mn9hf\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks"
Apr 24 22:30:16.408506 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.408422 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-run-openvswitch\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks"
Apr 24 22:30:16.408506 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.408446 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bc479082-3849-4902-830a-4a785973b983-iptables-alerter-script\") pod \"iptables-alerter-nzrnj\" (UID: \"bc479082-3849-4902-830a-4a785973b983\") " pod="openshift-network-operator/iptables-alerter-nzrnj"
Apr 24 22:30:16.408506 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.408471 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-host-slash\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks"
Apr 24 22:30:16.408506 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.408494 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-run-systemd\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks"
Apr 24 22:30:16.408762 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.408535 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-host-cni-bin\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks"
Apr 24 22:30:16.408762 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.408568 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b188c294-c06f-4cc9-ab29-5edd0333288d-env-overrides\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks"
Apr 24 22:30:16.408762 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.408600 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-host-kubelet\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks"
Apr 24 22:30:16.408762 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.408643 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/04b95921-75f8-4261-8992-f655aedb0790-etc-kubernetes\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts"
Apr 24 22:30:16.408762 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.408678 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/04b95921-75f8-4261-8992-f655aedb0790-etc-sysctl-d\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts"
Apr 24 22:30:16.408762 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.408711 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/04b95921-75f8-4261-8992-f655aedb0790-etc-sysctl-conf\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts"
Apr 24 22:30:16.408762 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.408733 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/04b95921-75f8-4261-8992-f655aedb0790-host\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts"
Apr 24 22:30:16.408762 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.408756 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d4161e94-f0bf-458d-9264-fe2fc28cab32-sys-fs\") pod \"aws-ebs-csi-driver-node-2h7b8\" (UID: \"d4161e94-f0bf-458d-9264-fe2fc28cab32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h7b8"
Apr 24 22:30:16.409129 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.408781 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/10e80548-a374-4b56-8428-91554c6f203b-konnectivity-ca\") pod \"konnectivity-agent-vqh8g\" (UID: \"10e80548-a374-4b56-8428-91554c6f203b\") " pod="kube-system/konnectivity-agent-vqh8g"
Apr 24 22:30:16.409129 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.408807 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-host-cni-netd\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks"
Apr 24 22:30:16.409129 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.408840 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks"
Apr 24 22:30:16.409129 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.408892 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/04b95921-75f8-4261-8992-f655aedb0790-etc-modprobe-d\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts"
Apr 24 22:30:16.409129 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.408922 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d4161e94-f0bf-458d-9264-fe2fc28cab32-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2h7b8\" (UID: \"d4161e94-f0bf-458d-9264-fe2fc28cab32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h7b8"
Apr 24 22:30:16.409129 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.408945 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd68n\" (UniqueName: \"kubernetes.io/projected/d4161e94-f0bf-458d-9264-fe2fc28cab32-kube-api-access-fd68n\") pod \"aws-ebs-csi-driver-node-2h7b8\" (UID: \"d4161e94-f0bf-458d-9264-fe2fc28cab32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h7b8"
Apr 24 22:30:16.409129 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.408964 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ed31fcf6-85e1-43d1-86ff-16bb6763e3e6-host\") pod \"node-ca-tb9tr\" (UID: \"ed31fcf6-85e1-43d1-86ff-16bb6763e3e6\") " pod="openshift-image-registry/node-ca-tb9tr"
Apr 24 22:30:16.409129 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.408979 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4cst\" (UniqueName: \"kubernetes.io/projected/bc853d31-7ceb-415d-a71b-f18ee833a50c-kube-api-access-m4cst\") pod \"network-check-target-lh47p\" (UID: \"bc853d31-7ceb-415d-a71b-f18ee833a50c\") " pod="openshift-network-diagnostics/network-check-target-lh47p"
Apr 24 22:30:16.409129 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.408994 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b188c294-c06f-4cc9-ab29-5edd0333288d-ovn-node-metrics-cert\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks"
Apr 24 22:30:16.409129 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.409023 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/04b95921-75f8-4261-8992-f655aedb0790-etc-systemd\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts"
Apr 24 22:30:16.409129 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.409040 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/04b95921-75f8-4261-8992-f655aedb0790-sys\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts"
Apr 24 22:30:16.409129 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.409055 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ed31fcf6-85e1-43d1-86ff-16bb6763e3e6-serviceca\") pod \"node-ca-tb9tr\" (UID: \"ed31fcf6-85e1-43d1-86ff-16bb6763e3e6\") " pod="openshift-image-registry/node-ca-tb9tr"
Apr 24 22:30:16.409129 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.409073 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/04b95921-75f8-4261-8992-f655aedb0790-etc-sysconfig\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts"
Apr 24 22:30:16.409129 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.409091 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-etc-openvswitch\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks"
Apr 24 22:30:16.409129 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.409120 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-host-run-ovn-kubernetes\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks"
Apr 24 22:30:16.409913 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.409155 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b188c294-c06f-4cc9-ab29-5edd0333288d-ovnkube-config\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks"
Apr 24 22:30:16.409913 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.409196 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b188c294-c06f-4cc9-ab29-5edd0333288d-ovnkube-script-lib\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks"
Apr 24 22:30:16.409913 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.409227 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/04b95921-75f8-4261-8992-f655aedb0790-run\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts"
Apr 24 22:30:16.409913 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.409252 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rlgj\" (UniqueName: \"kubernetes.io/projected/04b95921-75f8-4261-8992-f655aedb0790-kube-api-access-8rlgj\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts"
Apr 24 22:30:16.409913 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.409277 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bc479082-3849-4902-830a-4a785973b983-host-slash\") pod \"iptables-alerter-nzrnj\" (UID: \"bc479082-3849-4902-830a-4a785973b983\") " pod="openshift-network-operator/iptables-alerter-nzrnj"
Apr 24 22:30:16.409913 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.409300 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9kw8\" (UniqueName: \"kubernetes.io/projected/bc479082-3849-4902-830a-4a785973b983-kube-api-access-k9kw8\") pod \"iptables-alerter-nzrnj\" (UID: \"bc479082-3849-4902-830a-4a785973b983\") " pod="openshift-network-operator/iptables-alerter-nzrnj"
Apr 24 22:30:16.409913 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.409326 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/04b95921-75f8-4261-8992-f655aedb0790-lib-modules\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts"
Apr 24 22:30:16.409913 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.409348 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/04b95921-75f8-4261-8992-f655aedb0790-var-lib-kubelet\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts"
Apr 24 22:30:16.409913 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.409371 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/04b95921-75f8-4261-8992-f655aedb0790-tmp\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts"
Apr 24 22:30:16.409913 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.409408 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d4161e94-f0bf-458d-9264-fe2fc28cab32-socket-dir\") pod \"aws-ebs-csi-driver-node-2h7b8\" (UID: \"d4161e94-f0bf-458d-9264-fe2fc28cab32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h7b8"
Apr 24 22:30:16.409913 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.409432 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d4161e94-f0bf-458d-9264-fe2fc28cab32-registration-dir\") pod \"aws-ebs-csi-driver-node-2h7b8\" (UID: \"d4161e94-f0bf-458d-9264-fe2fc28cab32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h7b8"
Apr 24 22:30:16.409913 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.409456 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/10e80548-a374-4b56-8428-91554c6f203b-agent-certs\") pod \"konnectivity-agent-vqh8g\" (UID: \"10e80548-a374-4b56-8428-91554c6f203b\") " pod="kube-system/konnectivity-agent-vqh8g"
Apr 24 22:30:16.409913 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.409478 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-systemd-units\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks"
Apr 24 22:30:16.409913 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.409505 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-host-run-netns\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks"
Apr 24 22:30:16.409913 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.409528 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-var-lib-openvswitch\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks"
Apr 24 22:30:16.409913 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.409565 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-log-socket\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks"
Apr 24 22:30:16.410522 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.409593 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/04b95921-75f8-4261-8992-f655aedb0790-etc-tuned\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts"
Apr 24 22:30:16.410522 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.409618 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName:
\"kubernetes.io/host-path/d4161e94-f0bf-458d-9264-fe2fc28cab32-device-dir\") pod \"aws-ebs-csi-driver-node-2h7b8\" (UID: \"d4161e94-f0bf-458d-9264-fe2fc28cab32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h7b8" Apr 24 22:30:16.410522 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.409640 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d4161e94-f0bf-458d-9264-fe2fc28cab32-etc-selinux\") pod \"aws-ebs-csi-driver-node-2h7b8\" (UID: \"d4161e94-f0bf-458d-9264-fe2fc28cab32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h7b8" Apr 24 22:30:16.410522 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.409662 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-run-ovn\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.410522 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.409683 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-node-log\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.410522 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.409706 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w2fh\" (UniqueName: \"kubernetes.io/projected/ed31fcf6-85e1-43d1-86ff-16bb6763e3e6-kube-api-access-4w2fh\") pod \"node-ca-tb9tr\" (UID: \"ed31fcf6-85e1-43d1-86ff-16bb6763e3e6\") " pod="openshift-image-registry/node-ca-tb9tr" Apr 24 22:30:16.410522 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.409785 
2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8ntw" Apr 24 22:30:16.410522 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:16.409842 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n8ntw" podUID="fd77a426-e63b-4027-97b7-e9893fd72601" Apr 24 22:30:16.410862 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.410533 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 22:30:16.410862 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.410564 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-87hfw\"" Apr 24 22:30:16.410862 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.410855 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 22:30:16.411020 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.410897 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 22:30:16.411072 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.411051 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 22:30:16.411155 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.411134 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 22:30:16.411227 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.411207 2574 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 22:30:16.411319 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.411254 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 22:30:16.411392 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.411375 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-44l7b\"" Apr 24 22:30:16.411452 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.411433 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 22:30:16.417493 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.417473 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-n8x2b\"" Apr 24 22:30:16.498928 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.498899 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 22:30:16.510432 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.510405 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/04b95921-75f8-4261-8992-f655aedb0790-etc-modprobe-d\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts" Apr 24 22:30:16.510560 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.510445 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fd68n\" (UniqueName: \"kubernetes.io/projected/d4161e94-f0bf-458d-9264-fe2fc28cab32-kube-api-access-fd68n\") pod \"aws-ebs-csi-driver-node-2h7b8\" (UID: \"d4161e94-f0bf-458d-9264-fe2fc28cab32\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h7b8" Apr 24 22:30:16.510560 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.510475 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-cni-binary-copy\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l" Apr 24 22:30:16.510560 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.510498 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-host-run-netns\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l" Apr 24 22:30:16.510560 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.510522 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4fb2f48b-e735-4e72-896c-724bf453529c-os-release\") pod \"multus-additional-cni-plugins-2mz4f\" (UID: \"4fb2f48b-e735-4e72-896c-724bf453529c\") " pod="openshift-multus/multus-additional-cni-plugins-2mz4f" Apr 24 22:30:16.510560 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.510547 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b188c294-c06f-4cc9-ab29-5edd0333288d-ovn-node-metrics-cert\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.510796 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.510556 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/04b95921-75f8-4261-8992-f655aedb0790-etc-modprobe-d\") pod 
\"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts" Apr 24 22:30:16.510796 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.510569 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/04b95921-75f8-4261-8992-f655aedb0790-etc-systemd\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts" Apr 24 22:30:16.510796 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.510630 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/04b95921-75f8-4261-8992-f655aedb0790-sys\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts" Apr 24 22:30:16.510796 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.510644 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/04b95921-75f8-4261-8992-f655aedb0790-etc-systemd\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts" Apr 24 22:30:16.510796 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.510739 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4fb2f48b-e735-4e72-896c-724bf453529c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2mz4f\" (UID: \"4fb2f48b-e735-4e72-896c-724bf453529c\") " pod="openshift-multus/multus-additional-cni-plugins-2mz4f" Apr 24 22:30:16.510796 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.510753 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/04b95921-75f8-4261-8992-f655aedb0790-sys\") pod 
\"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts" Apr 24 22:30:16.510796 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.510794 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-host-run-ovn-kubernetes\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.511135 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.510824 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b188c294-c06f-4cc9-ab29-5edd0333288d-ovnkube-script-lib\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.511135 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.510843 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-host-run-ovn-kubernetes\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.511135 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.510898 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/04b95921-75f8-4261-8992-f655aedb0790-run\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts" Apr 24 22:30:16.511135 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.510926 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/d4161e94-f0bf-458d-9264-fe2fc28cab32-socket-dir\") pod \"aws-ebs-csi-driver-node-2h7b8\" (UID: \"d4161e94-f0bf-458d-9264-fe2fc28cab32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h7b8" Apr 24 22:30:16.511135 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.510922 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 22:30:16.511135 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.510953 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/04b95921-75f8-4261-8992-f655aedb0790-run\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts" Apr 24 22:30:16.511135 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.510956 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-os-release\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l" Apr 24 22:30:16.511135 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.510997 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k9kw8\" (UniqueName: \"kubernetes.io/projected/bc479082-3849-4902-830a-4a785973b983-kube-api-access-k9kw8\") pod \"iptables-alerter-nzrnj\" (UID: \"bc479082-3849-4902-830a-4a785973b983\") " pod="openshift-network-operator/iptables-alerter-nzrnj" Apr 24 22:30:16.511135 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.511022 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/04b95921-75f8-4261-8992-f655aedb0790-tmp\") pod \"tuned-tlfts\" (UID: 
\"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts" Apr 24 22:30:16.511135 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.511038 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d4161e94-f0bf-458d-9264-fe2fc28cab32-socket-dir\") pod \"aws-ebs-csi-driver-node-2h7b8\" (UID: \"d4161e94-f0bf-458d-9264-fe2fc28cab32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h7b8" Apr 24 22:30:16.511135 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.511048 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d4161e94-f0bf-458d-9264-fe2fc28cab32-registration-dir\") pod \"aws-ebs-csi-driver-node-2h7b8\" (UID: \"d4161e94-f0bf-458d-9264-fe2fc28cab32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h7b8" Apr 24 22:30:16.511135 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.511074 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/10e80548-a374-4b56-8428-91554c6f203b-agent-certs\") pod \"konnectivity-agent-vqh8g\" (UID: \"10e80548-a374-4b56-8428-91554c6f203b\") " pod="kube-system/konnectivity-agent-vqh8g" Apr 24 22:30:16.511135 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.511103 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/31be179c-c441-48a4-8779-593458646c77-tmp-dir\") pod \"node-resolver-t25xz\" (UID: \"31be179c-c441-48a4-8779-593458646c77\") " pod="openshift-dns/node-resolver-t25xz" Apr 24 22:30:16.511135 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.511130 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-host-var-lib-cni-multus\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l" Apr 24 22:30:16.511759 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.511151 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d4161e94-f0bf-458d-9264-fe2fc28cab32-registration-dir\") pod \"aws-ebs-csi-driver-node-2h7b8\" (UID: \"d4161e94-f0bf-458d-9264-fe2fc28cab32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h7b8" Apr 24 22:30:16.511759 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.511157 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-systemd-units\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.511759 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.511198 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-host-run-netns\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.511759 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.511212 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-systemd-units\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.511759 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.511223 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" 
(UniqueName: \"kubernetes.io/empty-dir/04b95921-75f8-4261-8992-f655aedb0790-etc-tuned\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts" Apr 24 22:30:16.511759 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.511252 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64r6t\" (UniqueName: \"kubernetes.io/projected/31be179c-c441-48a4-8779-593458646c77-kube-api-access-64r6t\") pod \"node-resolver-t25xz\" (UID: \"31be179c-c441-48a4-8779-593458646c77\") " pod="openshift-dns/node-resolver-t25xz" Apr 24 22:30:16.511759 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.511280 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-system-cni-dir\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l" Apr 24 22:30:16.511759 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.511288 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-host-run-netns\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.511759 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.511305 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mn9hf\" (UniqueName: \"kubernetes.io/projected/b188c294-c06f-4cc9-ab29-5edd0333288d-kube-api-access-mn9hf\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.511759 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.511330 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-run-openvswitch\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.511759 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.511357 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bc479082-3849-4902-830a-4a785973b983-iptables-alerter-script\") pod \"iptables-alerter-nzrnj\" (UID: \"bc479082-3849-4902-830a-4a785973b983\") " pod="openshift-network-operator/iptables-alerter-nzrnj" Apr 24 22:30:16.511759 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.511385 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-run-systemd\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.511759 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.511410 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b188c294-c06f-4cc9-ab29-5edd0333288d-env-overrides\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.511759 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.511435 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/04b95921-75f8-4261-8992-f655aedb0790-etc-sysctl-d\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts" Apr 24 22:30:16.511759 ip-10-0-129-176 kubenswrapper[2574]: I0424 
22:30:16.511460 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4fb2f48b-e735-4e72-896c-724bf453529c-cnibin\") pod \"multus-additional-cni-plugins-2mz4f\" (UID: \"4fb2f48b-e735-4e72-896c-724bf453529c\") " pod="openshift-multus/multus-additional-cni-plugins-2mz4f" Apr 24 22:30:16.511759 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.511487 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-host-var-lib-kubelet\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l" Apr 24 22:30:16.511759 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.511509 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-run-systemd\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.512572 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.511514 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-host-run-multus-certs\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l" Apr 24 22:30:16.512572 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.511533 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b188c294-c06f-4cc9-ab29-5edd0333288d-ovnkube-script-lib\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 
22:30:16.512572 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.511548 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-run-openvswitch\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.512572 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.511574 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4fb2f48b-e735-4e72-896c-724bf453529c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2mz4f\" (UID: \"4fb2f48b-e735-4e72-896c-724bf453529c\") " pod="openshift-multus/multus-additional-cni-plugins-2mz4f" Apr 24 22:30:16.512572 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.511605 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-host-cni-netd\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.512572 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.511632 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d4161e94-f0bf-458d-9264-fe2fc28cab32-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2h7b8\" (UID: \"d4161e94-f0bf-458d-9264-fe2fc28cab32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h7b8" Apr 24 22:30:16.512572 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.511676 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-host-cni-netd\") pod \"ovnkube-node-xsvks\" (UID: 
\"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.512572 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.511706 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ed31fcf6-85e1-43d1-86ff-16bb6763e3e6-host\") pod \"node-ca-tb9tr\" (UID: \"ed31fcf6-85e1-43d1-86ff-16bb6763e3e6\") " pod="openshift-image-registry/node-ca-tb9tr" Apr 24 22:30:16.512572 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.511736 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4fb2f48b-e735-4e72-896c-724bf453529c-system-cni-dir\") pod \"multus-additional-cni-plugins-2mz4f\" (UID: \"4fb2f48b-e735-4e72-896c-724bf453529c\") " pod="openshift-multus/multus-additional-cni-plugins-2mz4f" Apr 24 22:30:16.512572 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.511766 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ed31fcf6-85e1-43d1-86ff-16bb6763e3e6-serviceca\") pod \"node-ca-tb9tr\" (UID: \"ed31fcf6-85e1-43d1-86ff-16bb6763e3e6\") " pod="openshift-image-registry/node-ca-tb9tr" Apr 24 22:30:16.512572 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.511818 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-multus-daemon-config\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l" Apr 24 22:30:16.512572 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.511861 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxdhg\" (UniqueName: 
\"kubernetes.io/projected/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-kube-api-access-jxdhg\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l" Apr 24 22:30:16.512572 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.511907 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b188c294-c06f-4cc9-ab29-5edd0333288d-env-overrides\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.512572 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.511905 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4fb2f48b-e735-4e72-896c-724bf453529c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2mz4f\" (UID: \"4fb2f48b-e735-4e72-896c-724bf453529c\") " pod="openshift-multus/multus-additional-cni-plugins-2mz4f" Apr 24 22:30:16.512572 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.511943 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/04b95921-75f8-4261-8992-f655aedb0790-etc-sysconfig\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts" Apr 24 22:30:16.512572 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.511961 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d4161e94-f0bf-458d-9264-fe2fc28cab32-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2h7b8\" (UID: \"d4161e94-f0bf-458d-9264-fe2fc28cab32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h7b8" Apr 24 22:30:16.512572 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512005 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-etc-openvswitch\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.513386 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512011 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ed31fcf6-85e1-43d1-86ff-16bb6763e3e6-host\") pod \"node-ca-tb9tr\" (UID: \"ed31fcf6-85e1-43d1-86ff-16bb6763e3e6\") " pod="openshift-image-registry/node-ca-tb9tr" Apr 24 22:30:16.513386 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512030 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/04b95921-75f8-4261-8992-f655aedb0790-etc-sysctl-d\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts" Apr 24 22:30:16.513386 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512037 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b188c294-c06f-4cc9-ab29-5edd0333288d-ovnkube-config\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.513386 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512066 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8rlgj\" (UniqueName: \"kubernetes.io/projected/04b95921-75f8-4261-8992-f655aedb0790-kube-api-access-8rlgj\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts" Apr 24 22:30:16.513386 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512092 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4w2fh\" (UniqueName: \"kubernetes.io/projected/ed31fcf6-85e1-43d1-86ff-16bb6763e3e6-kube-api-access-4w2fh\") pod \"node-ca-tb9tr\" (UID: \"ed31fcf6-85e1-43d1-86ff-16bb6763e3e6\") " pod="openshift-image-registry/node-ca-tb9tr" Apr 24 22:30:16.513386 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512100 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/04b95921-75f8-4261-8992-f655aedb0790-etc-sysconfig\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts" Apr 24 22:30:16.513386 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512118 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/31be179c-c441-48a4-8779-593458646c77-hosts-file\") pod \"node-resolver-t25xz\" (UID: \"31be179c-c441-48a4-8779-593458646c77\") " pod="openshift-dns/node-resolver-t25xz" Apr 24 22:30:16.513386 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512144 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4fb2f48b-e735-4e72-896c-724bf453529c-cni-binary-copy\") pod \"multus-additional-cni-plugins-2mz4f\" (UID: \"4fb2f48b-e735-4e72-896c-724bf453529c\") " pod="openshift-multus/multus-additional-cni-plugins-2mz4f" Apr 24 22:30:16.513386 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512089 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bc479082-3849-4902-830a-4a785973b983-iptables-alerter-script\") pod \"iptables-alerter-nzrnj\" (UID: \"bc479082-3849-4902-830a-4a785973b983\") " pod="openshift-network-operator/iptables-alerter-nzrnj" Apr 24 22:30:16.513386 
ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512174 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bc479082-3849-4902-830a-4a785973b983-host-slash\") pod \"iptables-alerter-nzrnj\" (UID: \"bc479082-3849-4902-830a-4a785973b983\") " pod="openshift-network-operator/iptables-alerter-nzrnj" Apr 24 22:30:16.513386 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512208 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/04b95921-75f8-4261-8992-f655aedb0790-lib-modules\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts" Apr 24 22:30:16.513386 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512215 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bc479082-3849-4902-830a-4a785973b983-host-slash\") pod \"iptables-alerter-nzrnj\" (UID: \"bc479082-3849-4902-830a-4a785973b983\") " pod="openshift-network-operator/iptables-alerter-nzrnj" Apr 24 22:30:16.513386 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512240 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/04b95921-75f8-4261-8992-f655aedb0790-var-lib-kubelet\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts" Apr 24 22:30:16.513386 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512268 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-cnibin\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l" Apr 24 22:30:16.513386 
ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512291 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-host-var-lib-cni-bin\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l" Apr 24 22:30:16.513386 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512316 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-var-lib-openvswitch\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.513386 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512340 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-log-socket\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.514169 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512364 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d4161e94-f0bf-458d-9264-fe2fc28cab32-device-dir\") pod \"aws-ebs-csi-driver-node-2h7b8\" (UID: \"d4161e94-f0bf-458d-9264-fe2fc28cab32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h7b8" Apr 24 22:30:16.514169 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512383 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ed31fcf6-85e1-43d1-86ff-16bb6763e3e6-serviceca\") pod \"node-ca-tb9tr\" (UID: \"ed31fcf6-85e1-43d1-86ff-16bb6763e3e6\") " pod="openshift-image-registry/node-ca-tb9tr" Apr 
24 22:30:16.514169 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512387 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d4161e94-f0bf-458d-9264-fe2fc28cab32-etc-selinux\") pod \"aws-ebs-csi-driver-node-2h7b8\" (UID: \"d4161e94-f0bf-458d-9264-fe2fc28cab32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h7b8" Apr 24 22:30:16.514169 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512147 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-etc-openvswitch\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.514169 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512426 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-host-run-k8s-cni-cncf-io\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l" Apr 24 22:30:16.514169 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512464 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d4161e94-f0bf-458d-9264-fe2fc28cab32-etc-selinux\") pod \"aws-ebs-csi-driver-node-2h7b8\" (UID: \"d4161e94-f0bf-458d-9264-fe2fc28cab32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h7b8" Apr 24 22:30:16.514169 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512497 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/04b95921-75f8-4261-8992-f655aedb0790-var-lib-kubelet\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " 
pod="openshift-cluster-node-tuning-operator/tuned-tlfts" Apr 24 22:30:16.514169 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512530 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/04b95921-75f8-4261-8992-f655aedb0790-lib-modules\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts" Apr 24 22:30:16.514169 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512543 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-var-lib-openvswitch\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.514169 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512547 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b188c294-c06f-4cc9-ab29-5edd0333288d-ovnkube-config\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.514169 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512552 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-log-socket\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.514169 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512581 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-etc-kubernetes\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " 
pod="openshift-multus/multus-jdw5l" Apr 24 22:30:16.514169 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512616 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b7qm\" (UniqueName: \"kubernetes.io/projected/4fb2f48b-e735-4e72-896c-724bf453529c-kube-api-access-4b7qm\") pod \"multus-additional-cni-plugins-2mz4f\" (UID: \"4fb2f48b-e735-4e72-896c-724bf453529c\") " pod="openshift-multus/multus-additional-cni-plugins-2mz4f" Apr 24 22:30:16.514169 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512588 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d4161e94-f0bf-458d-9264-fe2fc28cab32-device-dir\") pod \"aws-ebs-csi-driver-node-2h7b8\" (UID: \"d4161e94-f0bf-458d-9264-fe2fc28cab32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h7b8" Apr 24 22:30:16.514169 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512647 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-run-ovn\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.514169 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512684 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-node-log\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.514169 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512715 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-multus-socket-dir-parent\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l" Apr 24 22:30:16.514837 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512718 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-run-ovn\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.514837 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512751 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-host-slash\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.514837 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512769 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-node-log\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.514837 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512796 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-host-slash\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.514837 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512824 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-host-cni-bin\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.514837 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512854 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-multus-cni-dir\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l" Apr 24 22:30:16.514837 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512856 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-host-cni-bin\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.514837 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512891 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-hostroot\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l" Apr 24 22:30:16.514837 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512923 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fd77a426-e63b-4027-97b7-e9893fd72601-metrics-certs\") pod \"network-metrics-daemon-n8ntw\" (UID: \"fd77a426-e63b-4027-97b7-e9893fd72601\") " pod="openshift-multus/network-metrics-daemon-n8ntw" Apr 24 22:30:16.514837 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.512957 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-m598x\" (UniqueName: \"kubernetes.io/projected/fd77a426-e63b-4027-97b7-e9893fd72601-kube-api-access-m598x\") pod \"network-metrics-daemon-n8ntw\" (UID: \"fd77a426-e63b-4027-97b7-e9893fd72601\") " pod="openshift-multus/network-metrics-daemon-n8ntw" Apr 24 22:30:16.514837 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.513001 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-host-kubelet\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.514837 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.513045 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/04b95921-75f8-4261-8992-f655aedb0790-etc-kubernetes\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts" Apr 24 22:30:16.514837 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.513064 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-host-kubelet\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.514837 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.513093 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/04b95921-75f8-4261-8992-f655aedb0790-etc-sysctl-conf\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts" Apr 24 22:30:16.514837 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.513109 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/04b95921-75f8-4261-8992-f655aedb0790-host\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts" Apr 24 22:30:16.514837 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.513130 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d4161e94-f0bf-458d-9264-fe2fc28cab32-sys-fs\") pod \"aws-ebs-csi-driver-node-2h7b8\" (UID: \"d4161e94-f0bf-458d-9264-fe2fc28cab32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h7b8" Apr 24 22:30:16.514837 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.513155 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/10e80548-a374-4b56-8428-91554c6f203b-konnectivity-ca\") pod \"konnectivity-agent-vqh8g\" (UID: \"10e80548-a374-4b56-8428-91554c6f203b\") " pod="kube-system/konnectivity-agent-vqh8g" Apr 24 22:30:16.515316 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.513180 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m4cst\" (UniqueName: \"kubernetes.io/projected/bc853d31-7ceb-415d-a71b-f18ee833a50c-kube-api-access-m4cst\") pod \"network-check-target-lh47p\" (UID: \"bc853d31-7ceb-415d-a71b-f18ee833a50c\") " pod="openshift-network-diagnostics/network-check-target-lh47p" Apr 24 22:30:16.515316 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.513187 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/04b95921-75f8-4261-8992-f655aedb0790-host\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts" Apr 24 22:30:16.515316 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.513213 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/04b95921-75f8-4261-8992-f655aedb0790-etc-sysctl-conf\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts" Apr 24 22:30:16.515316 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.513223 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-multus-conf-dir\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l" Apr 24 22:30:16.515316 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.513262 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.515316 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.513267 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/04b95921-75f8-4261-8992-f655aedb0790-etc-kubernetes\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts" Apr 24 22:30:16.515316 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.513350 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b188c294-c06f-4cc9-ab29-5edd0333288d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.515316 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.513425 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d4161e94-f0bf-458d-9264-fe2fc28cab32-sys-fs\") pod \"aws-ebs-csi-driver-node-2h7b8\" (UID: \"d4161e94-f0bf-458d-9264-fe2fc28cab32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h7b8" Apr 24 22:30:16.515316 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.513619 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/10e80548-a374-4b56-8428-91554c6f203b-konnectivity-ca\") pod \"konnectivity-agent-vqh8g\" (UID: \"10e80548-a374-4b56-8428-91554c6f203b\") " pod="kube-system/konnectivity-agent-vqh8g" Apr 24 22:30:16.515316 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.514208 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 22:25:15 +0000 UTC" deadline="2027-09-26 20:20:27.646707892 +0000 UTC" Apr 24 22:30:16.515316 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.514228 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12477h50m11.132482452s" Apr 24 22:30:16.515316 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.514439 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b188c294-c06f-4cc9-ab29-5edd0333288d-ovn-node-metrics-cert\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.515316 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.514489 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/04b95921-75f8-4261-8992-f655aedb0790-etc-tuned\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts" Apr 24 22:30:16.515316 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.514522 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/04b95921-75f8-4261-8992-f655aedb0790-tmp\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts" Apr 24 22:30:16.515316 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.514630 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/10e80548-a374-4b56-8428-91554c6f203b-agent-certs\") pod \"konnectivity-agent-vqh8g\" (UID: \"10e80548-a374-4b56-8428-91554c6f203b\") " pod="kube-system/konnectivity-agent-vqh8g" Apr 24 22:30:16.535486 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:16.535448 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:30:16.535486 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:16.535487 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:30:16.535657 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:16.535514 2574 projected.go:194] Error preparing data for projected volume kube-api-access-m4cst for pod openshift-network-diagnostics/network-check-target-lh47p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:16.535657 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:16.535605 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/bc853d31-7ceb-415d-a71b-f18ee833a50c-kube-api-access-m4cst podName:bc853d31-7ceb-415d-a71b-f18ee833a50c nodeName:}" failed. No retries permitted until 2026-04-24 22:30:17.035575651 +0000 UTC m=+3.085431521 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-m4cst" (UniqueName: "kubernetes.io/projected/bc853d31-7ceb-415d-a71b-f18ee833a50c-kube-api-access-m4cst") pod "network-check-target-lh47p" (UID: "bc853d31-7ceb-415d-a71b-f18ee833a50c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:16.537362 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.537321 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rlgj\" (UniqueName: \"kubernetes.io/projected/04b95921-75f8-4261-8992-f655aedb0790-kube-api-access-8rlgj\") pod \"tuned-tlfts\" (UID: \"04b95921-75f8-4261-8992-f655aedb0790\") " pod="openshift-cluster-node-tuning-operator/tuned-tlfts" Apr 24 22:30:16.537458 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.537408 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9kw8\" (UniqueName: \"kubernetes.io/projected/bc479082-3849-4902-830a-4a785973b983-kube-api-access-k9kw8\") pod \"iptables-alerter-nzrnj\" (UID: \"bc479082-3849-4902-830a-4a785973b983\") " pod="openshift-network-operator/iptables-alerter-nzrnj" Apr 24 22:30:16.537746 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.537726 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn9hf\" (UniqueName: \"kubernetes.io/projected/b188c294-c06f-4cc9-ab29-5edd0333288d-kube-api-access-mn9hf\") pod \"ovnkube-node-xsvks\" (UID: \"b188c294-c06f-4cc9-ab29-5edd0333288d\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:16.538288 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.538265 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd68n\" (UniqueName: \"kubernetes.io/projected/d4161e94-f0bf-458d-9264-fe2fc28cab32-kube-api-access-fd68n\") pod \"aws-ebs-csi-driver-node-2h7b8\" (UID: \"d4161e94-f0bf-458d-9264-fe2fc28cab32\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h7b8"
Apr 24 22:30:16.550508 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.550489 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w2fh\" (UniqueName: \"kubernetes.io/projected/ed31fcf6-85e1-43d1-86ff-16bb6763e3e6-kube-api-access-4w2fh\") pod \"node-ca-tb9tr\" (UID: \"ed31fcf6-85e1-43d1-86ff-16bb6763e3e6\") " pod="openshift-image-registry/node-ca-tb9tr"
Apr 24 22:30:16.613783 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.613757 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-system-cni-dir\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l"
Apr 24 22:30:16.613913 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.613793 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4fb2f48b-e735-4e72-896c-724bf453529c-cnibin\") pod \"multus-additional-cni-plugins-2mz4f\" (UID: \"4fb2f48b-e735-4e72-896c-724bf453529c\") " pod="openshift-multus/multus-additional-cni-plugins-2mz4f"
Apr 24 22:30:16.613913 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.613811 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-host-var-lib-kubelet\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l"
Apr 24 22:30:16.613913 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.613828 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-host-run-multus-certs\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l"
Apr 24 22:30:16.613913 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.613849 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4fb2f48b-e735-4e72-896c-724bf453529c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2mz4f\" (UID: \"4fb2f48b-e735-4e72-896c-724bf453529c\") " pod="openshift-multus/multus-additional-cni-plugins-2mz4f"
Apr 24 22:30:16.613913 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.613867 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-system-cni-dir\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l"
Apr 24 22:30:16.613913 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.613893 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4fb2f48b-e735-4e72-896c-724bf453529c-system-cni-dir\") pod \"multus-additional-cni-plugins-2mz4f\" (UID: \"4fb2f48b-e735-4e72-896c-724bf453529c\") " pod="openshift-multus/multus-additional-cni-plugins-2mz4f"
Apr 24 22:30:16.613913 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.613904 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4fb2f48b-e735-4e72-896c-724bf453529c-cnibin\") pod \"multus-additional-cni-plugins-2mz4f\" (UID: \"4fb2f48b-e735-4e72-896c-724bf453529c\") " pod="openshift-multus/multus-additional-cni-plugins-2mz4f"
Apr 24 22:30:16.614256 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.613918 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-host-var-lib-kubelet\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l"
Apr 24 22:30:16.614256 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.613934 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4fb2f48b-e735-4e72-896c-724bf453529c-system-cni-dir\") pod \"multus-additional-cni-plugins-2mz4f\" (UID: \"4fb2f48b-e735-4e72-896c-724bf453529c\") " pod="openshift-multus/multus-additional-cni-plugins-2mz4f"
Apr 24 22:30:16.614256 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.613952 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-host-run-multus-certs\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l"
Apr 24 22:30:16.614256 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.613955 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-multus-daemon-config\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l"
Apr 24 22:30:16.614256 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614029 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jxdhg\" (UniqueName: \"kubernetes.io/projected/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-kube-api-access-jxdhg\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l"
Apr 24 22:30:16.614256 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614058 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4fb2f48b-e735-4e72-896c-724bf453529c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2mz4f\" (UID: \"4fb2f48b-e735-4e72-896c-724bf453529c\") " pod="openshift-multus/multus-additional-cni-plugins-2mz4f"
Apr 24 22:30:16.614256 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614178 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/31be179c-c441-48a4-8779-593458646c77-hosts-file\") pod \"node-resolver-t25xz\" (UID: \"31be179c-c441-48a4-8779-593458646c77\") " pod="openshift-dns/node-resolver-t25xz"
Apr 24 22:30:16.614256 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614204 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4fb2f48b-e735-4e72-896c-724bf453529c-cni-binary-copy\") pod \"multus-additional-cni-plugins-2mz4f\" (UID: \"4fb2f48b-e735-4e72-896c-724bf453529c\") " pod="openshift-multus/multus-additional-cni-plugins-2mz4f"
Apr 24 22:30:16.614256 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614231 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-cnibin\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l"
Apr 24 22:30:16.614256 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614259 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-host-var-lib-cni-bin\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l"
Apr 24 22:30:16.614707 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614285 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-host-run-k8s-cni-cncf-io\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l"
Apr 24 22:30:16.614707 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614294 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/31be179c-c441-48a4-8779-593458646c77-hosts-file\") pod \"node-resolver-t25xz\" (UID: \"31be179c-c441-48a4-8779-593458646c77\") " pod="openshift-dns/node-resolver-t25xz"
Apr 24 22:30:16.614707 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614306 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-cnibin\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l"
Apr 24 22:30:16.614707 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614309 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-etc-kubernetes\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l"
Apr 24 22:30:16.614707 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614348 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-etc-kubernetes\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l"
Apr 24 22:30:16.614707 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614352 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-host-var-lib-cni-bin\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l"
Apr 24 22:30:16.614707 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614375 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4b7qm\" (UniqueName: \"kubernetes.io/projected/4fb2f48b-e735-4e72-896c-724bf453529c-kube-api-access-4b7qm\") pod \"multus-additional-cni-plugins-2mz4f\" (UID: \"4fb2f48b-e735-4e72-896c-724bf453529c\") " pod="openshift-multus/multus-additional-cni-plugins-2mz4f"
Apr 24 22:30:16.614707 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614392 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-host-run-k8s-cni-cncf-io\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l"
Apr 24 22:30:16.614707 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614407 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-multus-socket-dir-parent\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l"
Apr 24 22:30:16.614707 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614435 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-multus-cni-dir\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l"
Apr 24 22:30:16.614707 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614460 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-hostroot\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l"
Apr 24 22:30:16.614707 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614478 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-multus-socket-dir-parent\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l"
Apr 24 22:30:16.614707 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614485 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fd77a426-e63b-4027-97b7-e9893fd72601-metrics-certs\") pod \"network-metrics-daemon-n8ntw\" (UID: \"fd77a426-e63b-4027-97b7-e9893fd72601\") " pod="openshift-multus/network-metrics-daemon-n8ntw"
Apr 24 22:30:16.614707 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614511 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m598x\" (UniqueName: \"kubernetes.io/projected/fd77a426-e63b-4027-97b7-e9893fd72601-kube-api-access-m598x\") pod \"network-metrics-daemon-n8ntw\" (UID: \"fd77a426-e63b-4027-97b7-e9893fd72601\") " pod="openshift-multus/network-metrics-daemon-n8ntw"
Apr 24 22:30:16.614707 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614540 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-multus-cni-dir\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l"
Apr 24 22:30:16.614707 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614552 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-multus-conf-dir\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l"
Apr 24 22:30:16.614707 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614582 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-cni-binary-copy\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l"
Apr 24 22:30:16.614707 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614588 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-hostroot\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l"
Apr 24 22:30:16.615423 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614597 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4fb2f48b-e735-4e72-896c-724bf453529c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2mz4f\" (UID: \"4fb2f48b-e735-4e72-896c-724bf453529c\") " pod="openshift-multus/multus-additional-cni-plugins-2mz4f"
Apr 24 22:30:16.615423 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614608 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-host-run-netns\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l"
Apr 24 22:30:16.615423 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614538 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4fb2f48b-e735-4e72-896c-724bf453529c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2mz4f\" (UID: \"4fb2f48b-e735-4e72-896c-724bf453529c\") " pod="openshift-multus/multus-additional-cni-plugins-2mz4f"
Apr 24 22:30:16.615423 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614641 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-multus-conf-dir\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l"
Apr 24 22:30:16.615423 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614647 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4fb2f48b-e735-4e72-896c-724bf453529c-os-release\") pod \"multus-additional-cni-plugins-2mz4f\" (UID: \"4fb2f48b-e735-4e72-896c-724bf453529c\") " pod="openshift-multus/multus-additional-cni-plugins-2mz4f"
Apr 24 22:30:16.615423 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:16.614653 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:30:16.615423 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614677 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4fb2f48b-e735-4e72-896c-724bf453529c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2mz4f\" (UID: \"4fb2f48b-e735-4e72-896c-724bf453529c\") " pod="openshift-multus/multus-additional-cni-plugins-2mz4f"
Apr 24 22:30:16.615423 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614691 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-host-run-netns\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l"
Apr 24 22:30:16.615423 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:16.614726 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd77a426-e63b-4027-97b7-e9893fd72601-metrics-certs podName:fd77a426-e63b-4027-97b7-e9893fd72601 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:17.114708035 +0000 UTC m=+3.164563893 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fd77a426-e63b-4027-97b7-e9893fd72601-metrics-certs") pod "network-metrics-daemon-n8ntw" (UID: "fd77a426-e63b-4027-97b7-e9893fd72601") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:30:16.615423 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614706 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-os-release\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l"
Apr 24 22:30:16.615423 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614748 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4fb2f48b-e735-4e72-896c-724bf453529c-os-release\") pod \"multus-additional-cni-plugins-2mz4f\" (UID: \"4fb2f48b-e735-4e72-896c-724bf453529c\") " pod="openshift-multus/multus-additional-cni-plugins-2mz4f"
Apr 24 22:30:16.615423 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614757 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/31be179c-c441-48a4-8779-593458646c77-tmp-dir\") pod \"node-resolver-t25xz\" (UID: \"31be179c-c441-48a4-8779-593458646c77\") " pod="openshift-dns/node-resolver-t25xz"
Apr 24 22:30:16.615423 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614782 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-host-var-lib-cni-multus\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l"
Apr 24 22:30:16.615423 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614810 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-os-release\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l"
Apr 24 22:30:16.615423 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614814 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-64r6t\" (UniqueName: \"kubernetes.io/projected/31be179c-c441-48a4-8779-593458646c77-kube-api-access-64r6t\") pod \"node-resolver-t25xz\" (UID: \"31be179c-c441-48a4-8779-593458646c77\") " pod="openshift-dns/node-resolver-t25xz"
Apr 24 22:30:16.615423 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614822 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4fb2f48b-e735-4e72-896c-724bf453529c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2mz4f\" (UID: \"4fb2f48b-e735-4e72-896c-724bf453529c\") " pod="openshift-multus/multus-additional-cni-plugins-2mz4f"
Apr 24 22:30:16.615423 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.614857 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-host-var-lib-cni-multus\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l"
Apr 24 22:30:16.616078 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.615147 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/31be179c-c441-48a4-8779-593458646c77-tmp-dir\") pod \"node-resolver-t25xz\" (UID: \"31be179c-c441-48a4-8779-593458646c77\") " pod="openshift-dns/node-resolver-t25xz"
Apr 24 22:30:16.616078 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.615155 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-multus-daemon-config\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l"
Apr 24 22:30:16.616078 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.615205 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4fb2f48b-e735-4e72-896c-724bf453529c-cni-binary-copy\") pod \"multus-additional-cni-plugins-2mz4f\" (UID: \"4fb2f48b-e735-4e72-896c-724bf453529c\") " pod="openshift-multus/multus-additional-cni-plugins-2mz4f"
Apr 24 22:30:16.616078 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.615269 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-cni-binary-copy\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l"
Apr 24 22:30:16.633596 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.633540 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxdhg\" (UniqueName: \"kubernetes.io/projected/ef462afb-90e9-45bc-9e65-2b9c01d7f73a-kube-api-access-jxdhg\") pod \"multus-jdw5l\" (UID: \"ef462afb-90e9-45bc-9e65-2b9c01d7f73a\") " pod="openshift-multus/multus-jdw5l"
Apr 24 22:30:16.638147 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.638122 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-64r6t\" (UniqueName: \"kubernetes.io/projected/31be179c-c441-48a4-8779-593458646c77-kube-api-access-64r6t\") pod \"node-resolver-t25xz\" (UID: \"31be179c-c441-48a4-8779-593458646c77\") " pod="openshift-dns/node-resolver-t25xz"
Apr 24 22:30:16.640358 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.640339 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b7qm\" (UniqueName: \"kubernetes.io/projected/4fb2f48b-e735-4e72-896c-724bf453529c-kube-api-access-4b7qm\") pod \"multus-additional-cni-plugins-2mz4f\" (UID: \"4fb2f48b-e735-4e72-896c-724bf453529c\") " pod="openshift-multus/multus-additional-cni-plugins-2mz4f"
Apr 24 22:30:16.669947 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.669926 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m598x\" (UniqueName: \"kubernetes.io/projected/fd77a426-e63b-4027-97b7-e9893fd72601-kube-api-access-m598x\") pod \"network-metrics-daemon-n8ntw\" (UID: \"fd77a426-e63b-4027-97b7-e9893fd72601\") " pod="openshift-multus/network-metrics-daemon-n8ntw"
Apr 24 22:30:16.684330 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.684308 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 22:30:16.698180 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.698157 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vqh8g"
Apr 24 22:30:16.705921 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.705882 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h7b8"
Apr 24 22:30:16.713758 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.713739 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-tb9tr"
Apr 24 22:30:16.721579 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.721561 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-nzrnj"
Apr 24 22:30:16.727825 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.727808 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xsvks"
Apr 24 22:30:16.735154 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.735132 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tlfts"
Apr 24 22:30:16.741644 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.741622 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-t25xz"
Apr 24 22:30:16.746438 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.746419 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2mz4f"
Apr 24 22:30:16.754154 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:16.754136 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jdw5l"
Apr 24 22:30:17.094390 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:17.094359 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4161e94_f0bf_458d_9264_fe2fc28cab32.slice/crio-7c030194b6ea1713eaf5a4a6338b3abde4c2e3134f32a51d48fec5037d2752b9 WatchSource:0}: Error finding container 7c030194b6ea1713eaf5a4a6338b3abde4c2e3134f32a51d48fec5037d2752b9: Status 404 returned error can't find the container with id 7c030194b6ea1713eaf5a4a6338b3abde4c2e3134f32a51d48fec5037d2752b9
Apr 24 22:30:17.098807 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:17.098768 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded31fcf6_85e1_43d1_86ff_16bb6763e3e6.slice/crio-7c143993f49dffb0f1477ac6d484c2e5db3ff95909576ace961e6764a0165f82 WatchSource:0}: Error finding container 7c143993f49dffb0f1477ac6d484c2e5db3ff95909576ace961e6764a0165f82: Status 404 returned error can't find the container with id 7c143993f49dffb0f1477ac6d484c2e5db3ff95909576ace961e6764a0165f82
Apr 24 22:30:17.099596 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:17.099574 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc479082_3849_4902_830a_4a785973b983.slice/crio-0a53877c321126c9d1f2bc07a7b4315dbccf4548ddb8ed6dcae141a70d8f1589 WatchSource:0}: Error finding container 0a53877c321126c9d1f2bc07a7b4315dbccf4548ddb8ed6dcae141a70d8f1589: Status 404 returned error can't find the container with id 0a53877c321126c9d1f2bc07a7b4315dbccf4548ddb8ed6dcae141a70d8f1589
Apr 24 22:30:17.100723 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:17.100653 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10e80548_a374_4b56_8428_91554c6f203b.slice/crio-b5ff07108d504e419153d77047ad08728dc732b0a9efdded57df76dbd797280d WatchSource:0}: Error finding container b5ff07108d504e419153d77047ad08728dc732b0a9efdded57df76dbd797280d: Status 404 returned error can't find the container with id b5ff07108d504e419153d77047ad08728dc732b0a9efdded57df76dbd797280d
Apr 24 22:30:17.101806 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:17.101746 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fb2f48b_e735_4e72_896c_724bf453529c.slice/crio-780bfa8caa6e8b69371a6720b8c0f7f6cb2d68e61d53c45841917f5d7ffb656b WatchSource:0}: Error finding container 780bfa8caa6e8b69371a6720b8c0f7f6cb2d68e61d53c45841917f5d7ffb656b: Status 404 returned error can't find the container with id 780bfa8caa6e8b69371a6720b8c0f7f6cb2d68e61d53c45841917f5d7ffb656b
Apr 24 22:30:17.102470 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:17.102436 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04b95921_75f8_4261_8992_f655aedb0790.slice/crio-716ad14fb04c41e8f904f5aae7b851a4811b184935a7c01238953ffe8a53ac4c WatchSource:0}: Error finding container 716ad14fb04c41e8f904f5aae7b851a4811b184935a7c01238953ffe8a53ac4c: Status 404 returned error can't find the container with id 716ad14fb04c41e8f904f5aae7b851a4811b184935a7c01238953ffe8a53ac4c
Apr 24 22:30:17.103005 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:17.102980 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb188c294_c06f_4cc9_ab29_5edd0333288d.slice/crio-f371b6a4714fd6898130fe5adf6c1db0a82b4884c5ff17fc173c4688b2289511 WatchSource:0}: Error finding container f371b6a4714fd6898130fe5adf6c1db0a82b4884c5ff17fc173c4688b2289511: Status 404 returned error can't find the container with id f371b6a4714fd6898130fe5adf6c1db0a82b4884c5ff17fc173c4688b2289511
Apr 24 22:30:17.104504 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:17.104245 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31be179c_c441_48a4_8779_593458646c77.slice/crio-bc320ac97a58e17a0648f34843ba515dcf96f6fd95501f037e617c6d74e9a3ce WatchSource:0}: Error finding container bc320ac97a58e17a0648f34843ba515dcf96f6fd95501f037e617c6d74e9a3ce: Status 404 returned error can't find the container with id bc320ac97a58e17a0648f34843ba515dcf96f6fd95501f037e617c6d74e9a3ce
Apr 24 22:30:17.105061 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:30:17.104945 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef462afb_90e9_45bc_9e65_2b9c01d7f73a.slice/crio-57445fedd6a45d00828b193084e81e1407b1c959f9c2d627c3926ae8f6982a81 WatchSource:0}: Error finding container 57445fedd6a45d00828b193084e81e1407b1c959f9c2d627c3926ae8f6982a81: Status 404 returned error can't find the container with id 57445fedd6a45d00828b193084e81e1407b1c959f9c2d627c3926ae8f6982a81
Apr 24 22:30:17.119376 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:17.119346 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fd77a426-e63b-4027-97b7-e9893fd72601-metrics-certs\") pod \"network-metrics-daemon-n8ntw\" (UID: \"fd77a426-e63b-4027-97b7-e9893fd72601\") " pod="openshift-multus/network-metrics-daemon-n8ntw"
Apr 24 22:30:17.119449 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:17.119381 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m4cst\" (UniqueName: \"kubernetes.io/projected/bc853d31-7ceb-415d-a71b-f18ee833a50c-kube-api-access-m4cst\") pod \"network-check-target-lh47p\" (UID: \"bc853d31-7ceb-415d-a71b-f18ee833a50c\") " pod="openshift-network-diagnostics/network-check-target-lh47p"
Apr 24 22:30:17.119525 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:17.119503 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:30:17.119556 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:17.119521 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 22:30:17.119556 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:17.119539 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 22:30:17.119556 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:17.119552 2574 projected.go:194] Error preparing data for projected volume kube-api-access-m4cst for pod openshift-network-diagnostics/network-check-target-lh47p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:30:17.119642 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:17.119556 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd77a426-e63b-4027-97b7-e9893fd72601-metrics-certs podName:fd77a426-e63b-4027-97b7-e9893fd72601 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:18.119538775 +0000 UTC m=+4.169394633 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fd77a426-e63b-4027-97b7-e9893fd72601-metrics-certs") pod "network-metrics-daemon-n8ntw" (UID: "fd77a426-e63b-4027-97b7-e9893fd72601") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:30:17.119642 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:17.119599 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bc853d31-7ceb-415d-a71b-f18ee833a50c-kube-api-access-m4cst podName:bc853d31-7ceb-415d-a71b-f18ee833a50c nodeName:}" failed. No retries permitted until 2026-04-24 22:30:18.119582195 +0000 UTC m=+4.169438067 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-m4cst" (UniqueName: "kubernetes.io/projected/bc853d31-7ceb-415d-a71b-f18ee833a50c-kube-api-access-m4cst") pod "network-check-target-lh47p" (UID: "bc853d31-7ceb-415d-a71b-f18ee833a50c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:30:17.514777 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:17.514367 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 22:25:15 +0000 UTC" deadline="2027-11-03 10:53:44.44408499 +0000 UTC"
Apr 24 22:30:17.514777 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:17.514633 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13380h23m26.929467115s"
Apr 24 22:30:17.543209 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:17.543159 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tb9tr" event={"ID":"ed31fcf6-85e1-43d1-86ff-16bb6763e3e6","Type":"ContainerStarted","Data":"7c143993f49dffb0f1477ac6d484c2e5db3ff95909576ace961e6764a0165f82"}
Apr 24 22:30:17.548785 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:17.548725 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h7b8" event={"ID":"d4161e94-f0bf-458d-9264-fe2fc28cab32","Type":"ContainerStarted","Data":"7c030194b6ea1713eaf5a4a6338b3abde4c2e3134f32a51d48fec5037d2752b9"}
Apr 24 22:30:17.551647 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:17.551623 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-176.ec2.internal" event={"ID":"b3281f0d425c69d03b02f901cc1387c8","Type":"ContainerStarted","Data":"8d2ac7073a6a94bf6df5fecf8356ed0affb987e6ffc35d1dc567f4e11927f580"}
Apr 24 22:30:17.558687 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:17.558661 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jdw5l" event={"ID":"ef462afb-90e9-45bc-9e65-2b9c01d7f73a","Type":"ContainerStarted","Data":"57445fedd6a45d00828b193084e81e1407b1c959f9c2d627c3926ae8f6982a81"}
Apr 24 22:30:17.560584 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:17.560558 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" event={"ID":"b188c294-c06f-4cc9-ab29-5edd0333288d","Type":"ContainerStarted","Data":"f371b6a4714fd6898130fe5adf6c1db0a82b4884c5ff17fc173c4688b2289511"}
Apr 24 22:30:17.570761 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:17.570716 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-176.ec2.internal" podStartSLOduration=2.570702564 podStartE2EDuration="2.570702564s" podCreationTimestamp="2026-04-24 22:30:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:30:17.570562203 +0000 UTC m=+3.620418081" watchObservedRunningTime="2026-04-24 22:30:17.570702564 +0000 UTC m=+3.620558424"
Apr 24 22:30:17.577927 ip-10-0-129-176 kubenswrapper[2574]:
I0424 22:30:17.574488 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2mz4f" event={"ID":"4fb2f48b-e735-4e72-896c-724bf453529c","Type":"ContainerStarted","Data":"780bfa8caa6e8b69371a6720b8c0f7f6cb2d68e61d53c45841917f5d7ffb656b"} Apr 24 22:30:17.578535 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:17.578514 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vqh8g" event={"ID":"10e80548-a374-4b56-8428-91554c6f203b","Type":"ContainerStarted","Data":"b5ff07108d504e419153d77047ad08728dc732b0a9efdded57df76dbd797280d"} Apr 24 22:30:17.585991 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:17.585942 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-nzrnj" event={"ID":"bc479082-3849-4902-830a-4a785973b983","Type":"ContainerStarted","Data":"0a53877c321126c9d1f2bc07a7b4315dbccf4548ddb8ed6dcae141a70d8f1589"} Apr 24 22:30:17.603477 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:17.603435 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-t25xz" event={"ID":"31be179c-c441-48a4-8779-593458646c77","Type":"ContainerStarted","Data":"bc320ac97a58e17a0648f34843ba515dcf96f6fd95501f037e617c6d74e9a3ce"} Apr 24 22:30:17.607577 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:17.607536 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tlfts" event={"ID":"04b95921-75f8-4261-8992-f655aedb0790","Type":"ContainerStarted","Data":"716ad14fb04c41e8f904f5aae7b851a4811b184935a7c01238953ffe8a53ac4c"} Apr 24 22:30:18.129666 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:18.128888 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fd77a426-e63b-4027-97b7-e9893fd72601-metrics-certs\") pod \"network-metrics-daemon-n8ntw\" (UID: \"fd77a426-e63b-4027-97b7-e9893fd72601\") " 
pod="openshift-multus/network-metrics-daemon-n8ntw" Apr 24 22:30:18.129666 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:18.128963 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m4cst\" (UniqueName: \"kubernetes.io/projected/bc853d31-7ceb-415d-a71b-f18ee833a50c-kube-api-access-m4cst\") pod \"network-check-target-lh47p\" (UID: \"bc853d31-7ceb-415d-a71b-f18ee833a50c\") " pod="openshift-network-diagnostics/network-check-target-lh47p" Apr 24 22:30:18.129666 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:18.129128 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:30:18.129666 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:18.129147 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:30:18.129666 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:18.129160 2574 projected.go:194] Error preparing data for projected volume kube-api-access-m4cst for pod openshift-network-diagnostics/network-check-target-lh47p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:18.129666 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:18.129211 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bc853d31-7ceb-415d-a71b-f18ee833a50c-kube-api-access-m4cst podName:bc853d31-7ceb-415d-a71b-f18ee833a50c nodeName:}" failed. No retries permitted until 2026-04-24 22:30:20.129192286 +0000 UTC m=+6.179048146 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-m4cst" (UniqueName: "kubernetes.io/projected/bc853d31-7ceb-415d-a71b-f18ee833a50c-kube-api-access-m4cst") pod "network-check-target-lh47p" (UID: "bc853d31-7ceb-415d-a71b-f18ee833a50c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:18.129666 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:18.129582 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:18.129666 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:18.129631 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd77a426-e63b-4027-97b7-e9893fd72601-metrics-certs podName:fd77a426-e63b-4027-97b7-e9893fd72601 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:20.129616543 +0000 UTC m=+6.179472399 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fd77a426-e63b-4027-97b7-e9893fd72601-metrics-certs") pod "network-metrics-daemon-n8ntw" (UID: "fd77a426-e63b-4027-97b7-e9893fd72601") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:18.533836 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:18.532941 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8ntw" Apr 24 22:30:18.533836 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:18.533072 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n8ntw" podUID="fd77a426-e63b-4027-97b7-e9893fd72601" Apr 24 22:30:18.533836 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:18.533462 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lh47p" Apr 24 22:30:18.533836 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:18.533546 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lh47p" podUID="bc853d31-7ceb-415d-a71b-f18ee833a50c" Apr 24 22:30:18.618537 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:18.618491 2574 generic.go:358] "Generic (PLEG): container finished" podID="868c258382b869546819753aefbb6e79" containerID="9323d61329339709f26037a929b73af595007bfc7e08cc1e41e247777e566f0e" exitCode=0 Apr 24 22:30:18.619411 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:18.619382 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-176.ec2.internal" event={"ID":"868c258382b869546819753aefbb6e79","Type":"ContainerDied","Data":"9323d61329339709f26037a929b73af595007bfc7e08cc1e41e247777e566f0e"} Apr 24 22:30:19.624227 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:19.624186 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-176.ec2.internal" event={"ID":"868c258382b869546819753aefbb6e79","Type":"ContainerStarted","Data":"02d6f94fdbcc33071a50b76e50d61b8927ea9cef32bcf627f569cef1668f6ac4"} Apr 24 22:30:20.146066 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:20.146033 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fd77a426-e63b-4027-97b7-e9893fd72601-metrics-certs\") pod \"network-metrics-daemon-n8ntw\" (UID: \"fd77a426-e63b-4027-97b7-e9893fd72601\") " pod="openshift-multus/network-metrics-daemon-n8ntw" Apr 24 22:30:20.146292 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:20.146085 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m4cst\" (UniqueName: \"kubernetes.io/projected/bc853d31-7ceb-415d-a71b-f18ee833a50c-kube-api-access-m4cst\") pod \"network-check-target-lh47p\" (UID: \"bc853d31-7ceb-415d-a71b-f18ee833a50c\") " pod="openshift-network-diagnostics/network-check-target-lh47p" Apr 24 22:30:20.146292 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:20.146248 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:30:20.146292 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:20.146268 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:30:20.146292 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:20.146281 2574 projected.go:194] Error preparing data for projected volume kube-api-access-m4cst for pod openshift-network-diagnostics/network-check-target-lh47p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:20.146511 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:20.146338 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bc853d31-7ceb-415d-a71b-f18ee833a50c-kube-api-access-m4cst podName:bc853d31-7ceb-415d-a71b-f18ee833a50c nodeName:}" failed. No retries permitted until 2026-04-24 22:30:24.146320619 +0000 UTC m=+10.196176479 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-m4cst" (UniqueName: "kubernetes.io/projected/bc853d31-7ceb-415d-a71b-f18ee833a50c-kube-api-access-m4cst") pod "network-check-target-lh47p" (UID: "bc853d31-7ceb-415d-a71b-f18ee833a50c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:20.146577 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:20.146514 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:20.146577 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:20.146559 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd77a426-e63b-4027-97b7-e9893fd72601-metrics-certs podName:fd77a426-e63b-4027-97b7-e9893fd72601 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:24.146546087 +0000 UTC m=+10.196401947 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fd77a426-e63b-4027-97b7-e9893fd72601-metrics-certs") pod "network-metrics-daemon-n8ntw" (UID: "fd77a426-e63b-4027-97b7-e9893fd72601") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:20.532083 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:20.532005 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lh47p" Apr 24 22:30:20.532237 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:20.532130 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lh47p" podUID="bc853d31-7ceb-415d-a71b-f18ee833a50c" Apr 24 22:30:20.532535 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:20.532516 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8ntw" Apr 24 22:30:20.532676 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:20.532625 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n8ntw" podUID="fd77a426-e63b-4027-97b7-e9893fd72601" Apr 24 22:30:22.531060 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:22.531026 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lh47p" Apr 24 22:30:22.531576 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:22.531157 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lh47p" podUID="bc853d31-7ceb-415d-a71b-f18ee833a50c" Apr 24 22:30:22.531680 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:22.531643 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8ntw" Apr 24 22:30:22.531784 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:22.531759 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n8ntw" podUID="fd77a426-e63b-4027-97b7-e9893fd72601" Apr 24 22:30:24.180745 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:24.180144 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fd77a426-e63b-4027-97b7-e9893fd72601-metrics-certs\") pod \"network-metrics-daemon-n8ntw\" (UID: \"fd77a426-e63b-4027-97b7-e9893fd72601\") " pod="openshift-multus/network-metrics-daemon-n8ntw" Apr 24 22:30:24.180745 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:24.180192 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m4cst\" (UniqueName: \"kubernetes.io/projected/bc853d31-7ceb-415d-a71b-f18ee833a50c-kube-api-access-m4cst\") pod \"network-check-target-lh47p\" (UID: \"bc853d31-7ceb-415d-a71b-f18ee833a50c\") " pod="openshift-network-diagnostics/network-check-target-lh47p" Apr 24 22:30:24.180745 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:24.180328 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:30:24.180745 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:24.180345 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:30:24.180745 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:24.180357 
2574 projected.go:194] Error preparing data for projected volume kube-api-access-m4cst for pod openshift-network-diagnostics/network-check-target-lh47p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:24.180745 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:24.180408 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bc853d31-7ceb-415d-a71b-f18ee833a50c-kube-api-access-m4cst podName:bc853d31-7ceb-415d-a71b-f18ee833a50c nodeName:}" failed. No retries permitted until 2026-04-24 22:30:32.180390996 +0000 UTC m=+18.230246856 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-m4cst" (UniqueName: "kubernetes.io/projected/bc853d31-7ceb-415d-a71b-f18ee833a50c-kube-api-access-m4cst") pod "network-check-target-lh47p" (UID: "bc853d31-7ceb-415d-a71b-f18ee833a50c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:24.180745 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:24.180601 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:24.180745 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:24.180667 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd77a426-e63b-4027-97b7-e9893fd72601-metrics-certs podName:fd77a426-e63b-4027-97b7-e9893fd72601 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:32.180649089 +0000 UTC m=+18.230504948 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fd77a426-e63b-4027-97b7-e9893fd72601-metrics-certs") pod "network-metrics-daemon-n8ntw" (UID: "fd77a426-e63b-4027-97b7-e9893fd72601") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:24.532517 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:24.532117 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lh47p" Apr 24 22:30:24.532517 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:24.532228 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lh47p" podUID="bc853d31-7ceb-415d-a71b-f18ee833a50c" Apr 24 22:30:24.532731 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:24.532596 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8ntw" Apr 24 22:30:24.532731 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:24.532701 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n8ntw" podUID="fd77a426-e63b-4027-97b7-e9893fd72601" Apr 24 22:30:26.531190 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:26.531151 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lh47p" Apr 24 22:30:26.531677 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:26.531349 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8ntw" Apr 24 22:30:26.531677 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:26.531290 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lh47p" podUID="bc853d31-7ceb-415d-a71b-f18ee833a50c" Apr 24 22:30:26.531827 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:26.531801 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n8ntw" podUID="fd77a426-e63b-4027-97b7-e9893fd72601" Apr 24 22:30:28.531160 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:28.531068 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lh47p" Apr 24 22:30:28.531160 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:28.531106 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8ntw" Apr 24 22:30:28.531652 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:28.531188 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lh47p" podUID="bc853d31-7ceb-415d-a71b-f18ee833a50c" Apr 24 22:30:28.531652 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:28.531592 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n8ntw" podUID="fd77a426-e63b-4027-97b7-e9893fd72601" Apr 24 22:30:30.531370 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:30.531337 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lh47p" Apr 24 22:30:30.531811 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:30.531345 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8ntw" Apr 24 22:30:30.531811 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:30.531457 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lh47p" podUID="bc853d31-7ceb-415d-a71b-f18ee833a50c" Apr 24 22:30:30.531811 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:30.531561 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n8ntw" podUID="fd77a426-e63b-4027-97b7-e9893fd72601" Apr 24 22:30:32.238998 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:32.238962 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fd77a426-e63b-4027-97b7-e9893fd72601-metrics-certs\") pod \"network-metrics-daemon-n8ntw\" (UID: \"fd77a426-e63b-4027-97b7-e9893fd72601\") " pod="openshift-multus/network-metrics-daemon-n8ntw" Apr 24 22:30:32.239481 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:32.239011 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m4cst\" (UniqueName: \"kubernetes.io/projected/bc853d31-7ceb-415d-a71b-f18ee833a50c-kube-api-access-m4cst\") pod \"network-check-target-lh47p\" (UID: \"bc853d31-7ceb-415d-a71b-f18ee833a50c\") " pod="openshift-network-diagnostics/network-check-target-lh47p" Apr 24 22:30:32.239481 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:32.239121 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:32.239481 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:32.239195 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd77a426-e63b-4027-97b7-e9893fd72601-metrics-certs podName:fd77a426-e63b-4027-97b7-e9893fd72601 nodeName:}" failed. 
No retries permitted until 2026-04-24 22:30:48.23917736 +0000 UTC m=+34.289033225 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fd77a426-e63b-4027-97b7-e9893fd72601-metrics-certs") pod "network-metrics-daemon-n8ntw" (UID: "fd77a426-e63b-4027-97b7-e9893fd72601") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:32.239481 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:32.239127 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:30:32.239481 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:32.239237 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:30:32.239481 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:32.239249 2574 projected.go:194] Error preparing data for projected volume kube-api-access-m4cst for pod openshift-network-diagnostics/network-check-target-lh47p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:32.239481 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:32.239303 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bc853d31-7ceb-415d-a71b-f18ee833a50c-kube-api-access-m4cst podName:bc853d31-7ceb-415d-a71b-f18ee833a50c nodeName:}" failed. No retries permitted until 2026-04-24 22:30:48.239289412 +0000 UTC m=+34.289145268 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-m4cst" (UniqueName: "kubernetes.io/projected/bc853d31-7ceb-415d-a71b-f18ee833a50c-kube-api-access-m4cst") pod "network-check-target-lh47p" (UID: "bc853d31-7ceb-415d-a71b-f18ee833a50c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:32.531155 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:32.531123 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lh47p" Apr 24 22:30:32.531294 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:32.531180 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8ntw" Apr 24 22:30:32.531294 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:32.531266 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n8ntw" podUID="fd77a426-e63b-4027-97b7-e9893fd72601" Apr 24 22:30:32.531456 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:32.531421 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lh47p" podUID="bc853d31-7ceb-415d-a71b-f18ee833a50c" Apr 24 22:30:34.531692 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:34.531565 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8ntw" Apr 24 22:30:34.532154 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:34.531777 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n8ntw" podUID="fd77a426-e63b-4027-97b7-e9893fd72601" Apr 24 22:30:34.532154 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:34.531844 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lh47p" Apr 24 22:30:34.532154 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:34.531938 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lh47p" podUID="bc853d31-7ceb-415d-a71b-f18ee833a50c" Apr 24 22:30:34.649324 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:34.649293 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h7b8" event={"ID":"d4161e94-f0bf-458d-9264-fe2fc28cab32","Type":"ContainerStarted","Data":"f58da985317fd6050a2113a90ee24a291288403b076d2b917e55be076cc0e9d6"} Apr 24 22:30:34.650772 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:34.650584 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jdw5l" event={"ID":"ef462afb-90e9-45bc-9e65-2b9c01d7f73a","Type":"ContainerStarted","Data":"85af419e91c7932ad0a20147431f9d5abf42437a67972f74e13f25c3073ab6b8"} Apr 24 22:30:34.652673 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:34.652643 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" event={"ID":"b188c294-c06f-4cc9-ab29-5edd0333288d","Type":"ContainerStarted","Data":"3f757b836fa19ad6a00ebdb6d19fac7b5693dc00a7cd088eda43d6936d3c7c64"} Apr 24 22:30:34.652673 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:34.652669 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" event={"ID":"b188c294-c06f-4cc9-ab29-5edd0333288d","Type":"ContainerStarted","Data":"190c8f1a7c35e306f1fbfc7348695abe2a17b9c6219a38858ab7831ee160c58d"} Apr 24 22:30:34.652804 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:34.652681 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" event={"ID":"b188c294-c06f-4cc9-ab29-5edd0333288d","Type":"ContainerStarted","Data":"3d5c2f13cbb0b3eaab364a838d387d977676cff1da6c48fe25565c2297b8e591"} Apr 24 22:30:34.652804 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:34.652694 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" 
event={"ID":"b188c294-c06f-4cc9-ab29-5edd0333288d","Type":"ContainerStarted","Data":"cbad0e5db0405a5cf8ee2a91da830adcb07ce73bb80b9c765ca34aabd6585003"} Apr 24 22:30:34.655682 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:34.655650 2574 generic.go:358] "Generic (PLEG): container finished" podID="4fb2f48b-e735-4e72-896c-724bf453529c" containerID="ba1e681ecc2202dffc2dc845eec73cb0e2919ff379a7daaeb1d8c07d9c66393e" exitCode=0 Apr 24 22:30:34.655783 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:34.655688 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2mz4f" event={"ID":"4fb2f48b-e735-4e72-896c-724bf453529c","Type":"ContainerDied","Data":"ba1e681ecc2202dffc2dc845eec73cb0e2919ff379a7daaeb1d8c07d9c66393e"} Apr 24 22:30:34.657927 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:34.657903 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vqh8g" event={"ID":"10e80548-a374-4b56-8428-91554c6f203b","Type":"ContainerStarted","Data":"6aceacea682e41296ac1b85d77ec0813c77d4ff006555d3f82fc21bbdb7de8f8"} Apr 24 22:30:34.659756 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:34.659732 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-t25xz" event={"ID":"31be179c-c441-48a4-8779-593458646c77","Type":"ContainerStarted","Data":"318dca39876e564cb469c4227625c403d7af6f1a7a507f5d368542f8c152f3aa"} Apr 24 22:30:34.661225 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:34.661188 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tlfts" event={"ID":"04b95921-75f8-4261-8992-f655aedb0790","Type":"ContainerStarted","Data":"17d7ca78851b17dec62775b8c76e9628b984ed4f5c9b27d9158e8860b7269191"} Apr 24 22:30:34.662437 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:34.662409 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tb9tr" 
event={"ID":"ed31fcf6-85e1-43d1-86ff-16bb6763e3e6","Type":"ContainerStarted","Data":"a3c6ad51fa97bfedd4f85475dd3927592bb0529a6e2acb9d49f0710f68054288"} Apr 24 22:30:34.680783 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:34.680728 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-176.ec2.internal" podStartSLOduration=19.680714118 podStartE2EDuration="19.680714118s" podCreationTimestamp="2026-04-24 22:30:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:30:19.643653865 +0000 UTC m=+5.693509745" watchObservedRunningTime="2026-04-24 22:30:34.680714118 +0000 UTC m=+20.730569997" Apr 24 22:30:34.681030 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:34.680998 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-jdw5l" podStartSLOduration=3.649937141 podStartE2EDuration="20.680987575s" podCreationTimestamp="2026-04-24 22:30:14 +0000 UTC" firstStartedPulling="2026-04-24 22:30:17.107046105 +0000 UTC m=+3.156901960" lastFinishedPulling="2026-04-24 22:30:34.138096535 +0000 UTC m=+20.187952394" observedRunningTime="2026-04-24 22:30:34.68066385 +0000 UTC m=+20.730519727" watchObservedRunningTime="2026-04-24 22:30:34.680987575 +0000 UTC m=+20.730843455" Apr 24 22:30:34.731991 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:34.731942 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-tlfts" podStartSLOduration=3.707713882 podStartE2EDuration="20.731923918s" podCreationTimestamp="2026-04-24 22:30:14 +0000 UTC" firstStartedPulling="2026-04-24 22:30:17.104176175 +0000 UTC m=+3.154032033" lastFinishedPulling="2026-04-24 22:30:34.128386207 +0000 UTC m=+20.178242069" observedRunningTime="2026-04-24 22:30:34.731250045 +0000 UTC m=+20.781105923" 
watchObservedRunningTime="2026-04-24 22:30:34.731923918 +0000 UTC m=+20.781779796" Apr 24 22:30:34.749533 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:34.749476 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-t25xz" podStartSLOduration=3.775559889 podStartE2EDuration="20.749458397s" podCreationTimestamp="2026-04-24 22:30:14 +0000 UTC" firstStartedPulling="2026-04-24 22:30:17.106513035 +0000 UTC m=+3.156368897" lastFinishedPulling="2026-04-24 22:30:34.080411534 +0000 UTC m=+20.130267405" observedRunningTime="2026-04-24 22:30:34.74899576 +0000 UTC m=+20.798851636" watchObservedRunningTime="2026-04-24 22:30:34.749458397 +0000 UTC m=+20.799314274" Apr 24 22:30:34.764652 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:34.764615 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-vqh8g" podStartSLOduration=11.963031765 podStartE2EDuration="20.76460322s" podCreationTimestamp="2026-04-24 22:30:14 +0000 UTC" firstStartedPulling="2026-04-24 22:30:17.102634833 +0000 UTC m=+3.152490689" lastFinishedPulling="2026-04-24 22:30:25.904206274 +0000 UTC m=+11.954062144" observedRunningTime="2026-04-24 22:30:34.764476204 +0000 UTC m=+20.814332081" watchObservedRunningTime="2026-04-24 22:30:34.76460322 +0000 UTC m=+20.814459080" Apr 24 22:30:35.377637 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:35.377609 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 22:30:35.473031 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:35.472861 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T22:30:35.377634931Z","UUID":"a1554241-814d-4ffc-b361-e4b9af81b169","Handler":null,"Name":"","Endpoint":""} Apr 24 22:30:35.474973 ip-10-0-129-176 
kubenswrapper[2574]: I0424 22:30:35.474945 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 22:30:35.474973 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:35.474973 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 22:30:35.668060 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:35.668024 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" event={"ID":"b188c294-c06f-4cc9-ab29-5edd0333288d","Type":"ContainerStarted","Data":"8bac72d36920761393768c07821cb29003ac0b474e9ae4907c2af80b2fbdd184"} Apr 24 22:30:35.668060 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:35.668066 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" event={"ID":"b188c294-c06f-4cc9-ab29-5edd0333288d","Type":"ContainerStarted","Data":"f464312ffadbcf686823fef12a5a6bfd6e09ec5b079d33b778a66714e1e62a93"} Apr 24 22:30:35.669458 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:35.669431 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-nzrnj" event={"ID":"bc479082-3849-4902-830a-4a785973b983","Type":"ContainerStarted","Data":"e6a9b59caf132960dec3054974bc7a0918dd4776cb584602364886f945d1f77c"} Apr 24 22:30:35.671658 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:35.671633 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h7b8" event={"ID":"d4161e94-f0bf-458d-9264-fe2fc28cab32","Type":"ContainerStarted","Data":"c2c3e75c702423167d04c39763c820a21164056fad20dc410b034844d5538287"} Apr 24 22:30:35.686014 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:35.685969 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/node-ca-tb9tr" podStartSLOduration=4.705842494 podStartE2EDuration="21.685957977s" podCreationTimestamp="2026-04-24 22:30:14 +0000 UTC" firstStartedPulling="2026-04-24 22:30:17.100376788 +0000 UTC m=+3.150232643" lastFinishedPulling="2026-04-24 22:30:34.080492271 +0000 UTC m=+20.130348126" observedRunningTime="2026-04-24 22:30:34.788364235 +0000 UTC m=+20.838220111" watchObservedRunningTime="2026-04-24 22:30:35.685957977 +0000 UTC m=+21.735813856" Apr 24 22:30:35.686119 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:35.686067 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-nzrnj" podStartSLOduration=4.661060679 podStartE2EDuration="21.686061526s" podCreationTimestamp="2026-04-24 22:30:14 +0000 UTC" firstStartedPulling="2026-04-24 22:30:17.101446647 +0000 UTC m=+3.151302505" lastFinishedPulling="2026-04-24 22:30:34.126447482 +0000 UTC m=+20.176303352" observedRunningTime="2026-04-24 22:30:35.685443814 +0000 UTC m=+21.735299704" watchObservedRunningTime="2026-04-24 22:30:35.686061526 +0000 UTC m=+21.735917403" Apr 24 22:30:36.531578 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:36.531541 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lh47p" Apr 24 22:30:36.531780 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:36.531560 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8ntw" Apr 24 22:30:36.531780 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:36.531660 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lh47p" podUID="bc853d31-7ceb-415d-a71b-f18ee833a50c" Apr 24 22:30:36.531780 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:36.531750 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n8ntw" podUID="fd77a426-e63b-4027-97b7-e9893fd72601" Apr 24 22:30:36.675122 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:36.675062 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h7b8" event={"ID":"d4161e94-f0bf-458d-9264-fe2fc28cab32","Type":"ContainerStarted","Data":"cdfc07354f56c4d7e5a19aa01d07ae8dfe4826a193da39267976ce2d3f202b6d"} Apr 24 22:30:37.680313 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:37.680110 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" event={"ID":"b188c294-c06f-4cc9-ab29-5edd0333288d","Type":"ContainerStarted","Data":"3a3f59a2e03b0c551336de946c254cc988389bfd51f23ac40a6c580f525b5bd0"} Apr 24 22:30:38.133943 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:38.133914 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-vqh8g" Apr 24 22:30:38.134574 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:38.134542 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-vqh8g" Apr 24 22:30:38.155532 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:38.155479 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2h7b8" podStartSLOduration=4.995484469 podStartE2EDuration="24.155463145s" 
podCreationTimestamp="2026-04-24 22:30:14 +0000 UTC" firstStartedPulling="2026-04-24 22:30:17.096665345 +0000 UTC m=+3.146521214" lastFinishedPulling="2026-04-24 22:30:36.256644033 +0000 UTC m=+22.306499890" observedRunningTime="2026-04-24 22:30:36.699231132 +0000 UTC m=+22.749087009" watchObservedRunningTime="2026-04-24 22:30:38.155463145 +0000 UTC m=+24.205319028" Apr 24 22:30:38.530958 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:38.530926 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lh47p" Apr 24 22:30:38.530958 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:38.530954 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8ntw" Apr 24 22:30:38.531235 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:38.531052 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lh47p" podUID="bc853d31-7ceb-415d-a71b-f18ee833a50c" Apr 24 22:30:38.531235 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:38.531160 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n8ntw" podUID="fd77a426-e63b-4027-97b7-e9893fd72601" Apr 24 22:30:39.687177 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:39.686966 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" event={"ID":"b188c294-c06f-4cc9-ab29-5edd0333288d","Type":"ContainerStarted","Data":"13f352b5a7b5440986674097e4210a0eb859d556ecace3d0cecdccbcaf6bbb00"} Apr 24 22:30:39.688022 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:39.687277 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:39.688022 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:39.687327 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:39.688022 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:39.687343 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:39.688743 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:39.688720 2574 generic.go:358] "Generic (PLEG): container finished" podID="4fb2f48b-e735-4e72-896c-724bf453529c" containerID="825c16fbcf1e2b0aacbdc9d50da560da47071b02f8921c784235e36825d52078" exitCode=0 Apr 24 22:30:39.688858 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:39.688764 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2mz4f" event={"ID":"4fb2f48b-e735-4e72-896c-724bf453529c","Type":"ContainerDied","Data":"825c16fbcf1e2b0aacbdc9d50da560da47071b02f8921c784235e36825d52078"} Apr 24 22:30:39.702355 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:39.702335 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:39.702442 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:39.702392 2574 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:30:39.725710 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:39.725674 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" podStartSLOduration=8.534590197 podStartE2EDuration="25.72566281s" podCreationTimestamp="2026-04-24 22:30:14 +0000 UTC" firstStartedPulling="2026-04-24 22:30:17.105736648 +0000 UTC m=+3.155592503" lastFinishedPulling="2026-04-24 22:30:34.296809247 +0000 UTC m=+20.346665116" observedRunningTime="2026-04-24 22:30:39.724057161 +0000 UTC m=+25.773913037" watchObservedRunningTime="2026-04-24 22:30:39.72566281 +0000 UTC m=+25.775518687" Apr 24 22:30:39.932598 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:39.932506 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-vqh8g" Apr 24 22:30:39.932720 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:39.932614 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 22:30:39.933062 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:39.933048 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-vqh8g" Apr 24 22:30:40.530812 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:40.530789 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lh47p" Apr 24 22:30:40.530921 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:40.530819 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8ntw" Apr 24 22:30:40.530921 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:40.530912 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lh47p" podUID="bc853d31-7ceb-415d-a71b-f18ee833a50c" Apr 24 22:30:40.531050 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:40.531031 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n8ntw" podUID="fd77a426-e63b-4027-97b7-e9893fd72601" Apr 24 22:30:40.692559 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:40.692530 2574 generic.go:358] "Generic (PLEG): container finished" podID="4fb2f48b-e735-4e72-896c-724bf453529c" containerID="d359767133cc7354e2f8ed85751c9ebaa6368c2181e13f423328803b489bebe1" exitCode=0 Apr 24 22:30:40.692916 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:40.692604 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2mz4f" event={"ID":"4fb2f48b-e735-4e72-896c-724bf453529c","Type":"ContainerDied","Data":"d359767133cc7354e2f8ed85751c9ebaa6368c2181e13f423328803b489bebe1"} Apr 24 22:30:41.087614 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:41.087582 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-lh47p"] Apr 24 22:30:41.087796 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:41.087691 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lh47p" Apr 24 22:30:41.087796 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:41.087765 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lh47p" podUID="bc853d31-7ceb-415d-a71b-f18ee833a50c" Apr 24 22:30:41.089384 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:41.089361 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-n8ntw"] Apr 24 22:30:41.089475 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:41.089461 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8ntw" Apr 24 22:30:41.089558 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:41.089542 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n8ntw" podUID="fd77a426-e63b-4027-97b7-e9893fd72601" Apr 24 22:30:41.696066 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:41.696036 2574 generic.go:358] "Generic (PLEG): container finished" podID="4fb2f48b-e735-4e72-896c-724bf453529c" containerID="745193758a8382b7f00442c4a1062f68a3c911efd9fdec7f410cfbbb4c568b04" exitCode=0 Apr 24 22:30:41.696350 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:41.696118 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2mz4f" event={"ID":"4fb2f48b-e735-4e72-896c-724bf453529c","Type":"ContainerDied","Data":"745193758a8382b7f00442c4a1062f68a3c911efd9fdec7f410cfbbb4c568b04"} Apr 24 22:30:42.531070 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:42.531034 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8ntw" Apr 24 22:30:42.531241 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:42.531075 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lh47p" Apr 24 22:30:42.531241 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:42.531178 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n8ntw" podUID="fd77a426-e63b-4027-97b7-e9893fd72601" Apr 24 22:30:42.531371 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:42.531297 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lh47p" podUID="bc853d31-7ceb-415d-a71b-f18ee833a50c" Apr 24 22:30:44.532629 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:44.532587 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lh47p" Apr 24 22:30:44.533087 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:44.532691 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8ntw" Apr 24 22:30:44.533087 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:44.532721 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lh47p" podUID="bc853d31-7ceb-415d-a71b-f18ee833a50c" Apr 24 22:30:44.533087 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:44.532779 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n8ntw" podUID="fd77a426-e63b-4027-97b7-e9893fd72601" Apr 24 22:30:46.531045 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:46.531013 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lh47p" Apr 24 22:30:46.531667 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:46.531013 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8ntw" Apr 24 22:30:46.531667 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:46.531139 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lh47p" podUID="bc853d31-7ceb-415d-a71b-f18ee833a50c" Apr 24 22:30:46.531667 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:46.531247 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n8ntw" podUID="fd77a426-e63b-4027-97b7-e9893fd72601" Apr 24 22:30:47.233515 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:47.233450 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-176.ec2.internal" event="NodeReady" Apr 24 22:30:47.233638 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:47.233557 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 22:30:47.319269 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:47.319239 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-mfpb9"] Apr 24 22:30:47.345572 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:47.345550 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-q7rg6"] Apr 24 22:30:47.345719 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:47.345702 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-mfpb9" Apr 24 22:30:47.348074 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:47.348056 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 22:30:47.348306 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:47.348291 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 22:30:47.348371 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:47.348322 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-svq4d\"" Apr 24 22:30:47.363339 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:47.363320 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mfpb9"] Apr 24 22:30:47.363339 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:47.363340 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-q7rg6"] Apr 24 22:30:47.363485 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:47.363437 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-q7rg6" Apr 24 22:30:47.368627 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:47.368603 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 22:30:47.368733 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:47.368686 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 22:30:47.369859 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:47.369842 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 22:30:47.372852 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:47.372835 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tdthp\"" Apr 24 22:30:47.446049 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:47.446024 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b3ad74e7-fb86-475b-88a4-5b2f7848cd68-tmp-dir\") pod \"dns-default-mfpb9\" (UID: \"b3ad74e7-fb86-475b-88a4-5b2f7848cd68\") " pod="openshift-dns/dns-default-mfpb9" Apr 24 22:30:47.446049 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:47.446054 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f622f7a-7d90-4dd1-af3f-3f27ebad181a-cert\") pod \"ingress-canary-q7rg6\" (UID: \"6f622f7a-7d90-4dd1-af3f-3f27ebad181a\") " pod="openshift-ingress-canary/ingress-canary-q7rg6" Apr 24 22:30:47.446216 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:47.446077 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/b3ad74e7-fb86-475b-88a4-5b2f7848cd68-config-volume\") pod \"dns-default-mfpb9\" (UID: \"b3ad74e7-fb86-475b-88a4-5b2f7848cd68\") " pod="openshift-dns/dns-default-mfpb9" Apr 24 22:30:47.446216 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:47.446127 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3ad74e7-fb86-475b-88a4-5b2f7848cd68-metrics-tls\") pod \"dns-default-mfpb9\" (UID: \"b3ad74e7-fb86-475b-88a4-5b2f7848cd68\") " pod="openshift-dns/dns-default-mfpb9" Apr 24 22:30:47.446216 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:47.446196 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkxkw\" (UniqueName: \"kubernetes.io/projected/6f622f7a-7d90-4dd1-af3f-3f27ebad181a-kube-api-access-hkxkw\") pod \"ingress-canary-q7rg6\" (UID: \"6f622f7a-7d90-4dd1-af3f-3f27ebad181a\") " pod="openshift-ingress-canary/ingress-canary-q7rg6" Apr 24 22:30:47.446318 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:47.446220 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wctb\" (UniqueName: \"kubernetes.io/projected/b3ad74e7-fb86-475b-88a4-5b2f7848cd68-kube-api-access-6wctb\") pod \"dns-default-mfpb9\" (UID: \"b3ad74e7-fb86-475b-88a4-5b2f7848cd68\") " pod="openshift-dns/dns-default-mfpb9" Apr 24 22:30:47.547057 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:47.546973 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b3ad74e7-fb86-475b-88a4-5b2f7848cd68-tmp-dir\") pod \"dns-default-mfpb9\" (UID: \"b3ad74e7-fb86-475b-88a4-5b2f7848cd68\") " pod="openshift-dns/dns-default-mfpb9" Apr 24 22:30:47.547057 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:47.547018 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/6f622f7a-7d90-4dd1-af3f-3f27ebad181a-cert\") pod \"ingress-canary-q7rg6\" (UID: \"6f622f7a-7d90-4dd1-af3f-3f27ebad181a\") " pod="openshift-ingress-canary/ingress-canary-q7rg6" Apr 24 22:30:47.547057 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:47.547056 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b3ad74e7-fb86-475b-88a4-5b2f7848cd68-config-volume\") pod \"dns-default-mfpb9\" (UID: \"b3ad74e7-fb86-475b-88a4-5b2f7848cd68\") " pod="openshift-dns/dns-default-mfpb9" Apr 24 22:30:47.547620 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:47.547085 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3ad74e7-fb86-475b-88a4-5b2f7848cd68-metrics-tls\") pod \"dns-default-mfpb9\" (UID: \"b3ad74e7-fb86-475b-88a4-5b2f7848cd68\") " pod="openshift-dns/dns-default-mfpb9" Apr 24 22:30:47.547620 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:47.547148 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hkxkw\" (UniqueName: \"kubernetes.io/projected/6f622f7a-7d90-4dd1-af3f-3f27ebad181a-kube-api-access-hkxkw\") pod \"ingress-canary-q7rg6\" (UID: \"6f622f7a-7d90-4dd1-af3f-3f27ebad181a\") " pod="openshift-ingress-canary/ingress-canary-q7rg6" Apr 24 22:30:47.547620 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:47.547171 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:30:47.547620 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:47.547180 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6wctb\" (UniqueName: \"kubernetes.io/projected/b3ad74e7-fb86-475b-88a4-5b2f7848cd68-kube-api-access-6wctb\") pod \"dns-default-mfpb9\" (UID: \"b3ad74e7-fb86-475b-88a4-5b2f7848cd68\") " 
pod="openshift-dns/dns-default-mfpb9" Apr 24 22:30:47.547620 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:47.547237 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f622f7a-7d90-4dd1-af3f-3f27ebad181a-cert podName:6f622f7a-7d90-4dd1-af3f-3f27ebad181a nodeName:}" failed. No retries permitted until 2026-04-24 22:30:48.047217033 +0000 UTC m=+34.097072892 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6f622f7a-7d90-4dd1-af3f-3f27ebad181a-cert") pod "ingress-canary-q7rg6" (UID: "6f622f7a-7d90-4dd1-af3f-3f27ebad181a") : secret "canary-serving-cert" not found Apr 24 22:30:47.547620 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:47.547249 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:30:47.547620 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:47.547300 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3ad74e7-fb86-475b-88a4-5b2f7848cd68-metrics-tls podName:b3ad74e7-fb86-475b-88a4-5b2f7848cd68 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:48.047282236 +0000 UTC m=+34.097138091 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b3ad74e7-fb86-475b-88a4-5b2f7848cd68-metrics-tls") pod "dns-default-mfpb9" (UID: "b3ad74e7-fb86-475b-88a4-5b2f7848cd68") : secret "dns-default-metrics-tls" not found Apr 24 22:30:47.547620 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:47.547383 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b3ad74e7-fb86-475b-88a4-5b2f7848cd68-tmp-dir\") pod \"dns-default-mfpb9\" (UID: \"b3ad74e7-fb86-475b-88a4-5b2f7848cd68\") " pod="openshift-dns/dns-default-mfpb9" Apr 24 22:30:47.547620 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:47.547553 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b3ad74e7-fb86-475b-88a4-5b2f7848cd68-config-volume\") pod \"dns-default-mfpb9\" (UID: \"b3ad74e7-fb86-475b-88a4-5b2f7848cd68\") " pod="openshift-dns/dns-default-mfpb9" Apr 24 22:30:47.573839 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:47.573815 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wctb\" (UniqueName: \"kubernetes.io/projected/b3ad74e7-fb86-475b-88a4-5b2f7848cd68-kube-api-access-6wctb\") pod \"dns-default-mfpb9\" (UID: \"b3ad74e7-fb86-475b-88a4-5b2f7848cd68\") " pod="openshift-dns/dns-default-mfpb9" Apr 24 22:30:47.573941 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:47.573821 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkxkw\" (UniqueName: \"kubernetes.io/projected/6f622f7a-7d90-4dd1-af3f-3f27ebad181a-kube-api-access-hkxkw\") pod \"ingress-canary-q7rg6\" (UID: \"6f622f7a-7d90-4dd1-af3f-3f27ebad181a\") " pod="openshift-ingress-canary/ingress-canary-q7rg6" Apr 24 22:30:47.709014 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:47.708975 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-2mz4f" event={"ID":"4fb2f48b-e735-4e72-896c-724bf453529c","Type":"ContainerStarted","Data":"e21104472187075da5983b67f9e9828dbcc3e78782f53adbe67a216f42ec32f2"} Apr 24 22:30:48.051947 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:48.051920 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3ad74e7-fb86-475b-88a4-5b2f7848cd68-metrics-tls\") pod \"dns-default-mfpb9\" (UID: \"b3ad74e7-fb86-475b-88a4-5b2f7848cd68\") " pod="openshift-dns/dns-default-mfpb9" Apr 24 22:30:48.052193 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:48.051987 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f622f7a-7d90-4dd1-af3f-3f27ebad181a-cert\") pod \"ingress-canary-q7rg6\" (UID: \"6f622f7a-7d90-4dd1-af3f-3f27ebad181a\") " pod="openshift-ingress-canary/ingress-canary-q7rg6" Apr 24 22:30:48.052193 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:48.052061 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:30:48.052193 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:48.052069 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:30:48.052193 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:48.052111 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f622f7a-7d90-4dd1-af3f-3f27ebad181a-cert podName:6f622f7a-7d90-4dd1-af3f-3f27ebad181a nodeName:}" failed. No retries permitted until 2026-04-24 22:30:49.052095831 +0000 UTC m=+35.101951687 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6f622f7a-7d90-4dd1-af3f-3f27ebad181a-cert") pod "ingress-canary-q7rg6" (UID: "6f622f7a-7d90-4dd1-af3f-3f27ebad181a") : secret "canary-serving-cert" not found Apr 24 22:30:48.052193 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:48.052122 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3ad74e7-fb86-475b-88a4-5b2f7848cd68-metrics-tls podName:b3ad74e7-fb86-475b-88a4-5b2f7848cd68 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:49.052116948 +0000 UTC m=+35.101972802 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b3ad74e7-fb86-475b-88a4-5b2f7848cd68-metrics-tls") pod "dns-default-mfpb9" (UID: "b3ad74e7-fb86-475b-88a4-5b2f7848cd68") : secret "dns-default-metrics-tls" not found Apr 24 22:30:48.253247 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:48.253213 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fd77a426-e63b-4027-97b7-e9893fd72601-metrics-certs\") pod \"network-metrics-daemon-n8ntw\" (UID: \"fd77a426-e63b-4027-97b7-e9893fd72601\") " pod="openshift-multus/network-metrics-daemon-n8ntw" Apr 24 22:30:48.253247 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:48.253253 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m4cst\" (UniqueName: \"kubernetes.io/projected/bc853d31-7ceb-415d-a71b-f18ee833a50c-kube-api-access-m4cst\") pod \"network-check-target-lh47p\" (UID: \"bc853d31-7ceb-415d-a71b-f18ee833a50c\") " pod="openshift-network-diagnostics/network-check-target-lh47p" Apr 24 22:30:48.253437 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:48.253343 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:48.253437 
ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:48.253365 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:30:48.253437 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:48.253378 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:30:48.253437 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:48.253387 2574 projected.go:194] Error preparing data for projected volume kube-api-access-m4cst for pod openshift-network-diagnostics/network-check-target-lh47p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:48.253437 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:48.253402 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd77a426-e63b-4027-97b7-e9893fd72601-metrics-certs podName:fd77a426-e63b-4027-97b7-e9893fd72601 nodeName:}" failed. No retries permitted until 2026-04-24 22:31:20.253387595 +0000 UTC m=+66.303243455 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fd77a426-e63b-4027-97b7-e9893fd72601-metrics-certs") pod "network-metrics-daemon-n8ntw" (UID: "fd77a426-e63b-4027-97b7-e9893fd72601") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:48.253437 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:48.253420 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bc853d31-7ceb-415d-a71b-f18ee833a50c-kube-api-access-m4cst podName:bc853d31-7ceb-415d-a71b-f18ee833a50c nodeName:}" failed. No retries permitted until 2026-04-24 22:31:20.253410152 +0000 UTC m=+66.303266007 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-m4cst" (UniqueName: "kubernetes.io/projected/bc853d31-7ceb-415d-a71b-f18ee833a50c-kube-api-access-m4cst") pod "network-check-target-lh47p" (UID: "bc853d31-7ceb-415d-a71b-f18ee833a50c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:48.531060 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:48.531031 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lh47p" Apr 24 22:30:48.531246 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:48.531217 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8ntw" Apr 24 22:30:48.535380 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:48.535358 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 22:30:48.535380 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:48.535372 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-dndrr\"" Apr 24 22:30:48.535598 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:48.535407 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 22:30:48.535598 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:48.535438 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 22:30:48.536501 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:48.536483 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7dgz7\"" Apr 24 22:30:48.712931 ip-10-0-129-176 kubenswrapper[2574]: I0424 
22:30:48.712897 2574 generic.go:358] "Generic (PLEG): container finished" podID="4fb2f48b-e735-4e72-896c-724bf453529c" containerID="e21104472187075da5983b67f9e9828dbcc3e78782f53adbe67a216f42ec32f2" exitCode=0 Apr 24 22:30:48.713329 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:48.712953 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2mz4f" event={"ID":"4fb2f48b-e735-4e72-896c-724bf453529c","Type":"ContainerDied","Data":"e21104472187075da5983b67f9e9828dbcc3e78782f53adbe67a216f42ec32f2"} Apr 24 22:30:49.058655 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:49.058623 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3ad74e7-fb86-475b-88a4-5b2f7848cd68-metrics-tls\") pod \"dns-default-mfpb9\" (UID: \"b3ad74e7-fb86-475b-88a4-5b2f7848cd68\") " pod="openshift-dns/dns-default-mfpb9" Apr 24 22:30:49.058781 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:49.058692 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f622f7a-7d90-4dd1-af3f-3f27ebad181a-cert\") pod \"ingress-canary-q7rg6\" (UID: \"6f622f7a-7d90-4dd1-af3f-3f27ebad181a\") " pod="openshift-ingress-canary/ingress-canary-q7rg6" Apr 24 22:30:49.058781 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:49.058776 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:30:49.058850 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:49.058776 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:30:49.058850 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:49.058824 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f622f7a-7d90-4dd1-af3f-3f27ebad181a-cert podName:6f622f7a-7d90-4dd1-af3f-3f27ebad181a nodeName:}" 
failed. No retries permitted until 2026-04-24 22:30:51.058810842 +0000 UTC m=+37.108666697 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6f622f7a-7d90-4dd1-af3f-3f27ebad181a-cert") pod "ingress-canary-q7rg6" (UID: "6f622f7a-7d90-4dd1-af3f-3f27ebad181a") : secret "canary-serving-cert" not found Apr 24 22:30:49.058850 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:49.058836 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3ad74e7-fb86-475b-88a4-5b2f7848cd68-metrics-tls podName:b3ad74e7-fb86-475b-88a4-5b2f7848cd68 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:51.058830849 +0000 UTC m=+37.108686703 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b3ad74e7-fb86-475b-88a4-5b2f7848cd68-metrics-tls") pod "dns-default-mfpb9" (UID: "b3ad74e7-fb86-475b-88a4-5b2f7848cd68") : secret "dns-default-metrics-tls" not found Apr 24 22:30:49.717144 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:49.717114 2574 generic.go:358] "Generic (PLEG): container finished" podID="4fb2f48b-e735-4e72-896c-724bf453529c" containerID="f0b2e6d4fadc86560abd104c758a03a4edc7d5bce51dba4e96b2918848159136" exitCode=0 Apr 24 22:30:49.717475 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:49.717154 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2mz4f" event={"ID":"4fb2f48b-e735-4e72-896c-724bf453529c","Type":"ContainerDied","Data":"f0b2e6d4fadc86560abd104c758a03a4edc7d5bce51dba4e96b2918848159136"} Apr 24 22:30:50.722069 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:50.721869 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2mz4f" event={"ID":"4fb2f48b-e735-4e72-896c-724bf453529c","Type":"ContainerStarted","Data":"f142981beb754d45fb01c5eb4a7c96ad7398ab8e78e7f23d4bee611addb6ec8f"} Apr 24 22:30:50.790127 
ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:50.790079 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-2mz4f" podStartSLOduration=6.397248263 podStartE2EDuration="36.790065066s" podCreationTimestamp="2026-04-24 22:30:14 +0000 UTC" firstStartedPulling="2026-04-24 22:30:17.103482432 +0000 UTC m=+3.153338291" lastFinishedPulling="2026-04-24 22:30:47.496299236 +0000 UTC m=+33.546155094" observedRunningTime="2026-04-24 22:30:50.781557074 +0000 UTC m=+36.831412950" watchObservedRunningTime="2026-04-24 22:30:50.790065066 +0000 UTC m=+36.839920942" Apr 24 22:30:51.071563 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:51.071489 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3ad74e7-fb86-475b-88a4-5b2f7848cd68-metrics-tls\") pod \"dns-default-mfpb9\" (UID: \"b3ad74e7-fb86-475b-88a4-5b2f7848cd68\") " pod="openshift-dns/dns-default-mfpb9" Apr 24 22:30:51.071563 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:51.071554 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f622f7a-7d90-4dd1-af3f-3f27ebad181a-cert\") pod \"ingress-canary-q7rg6\" (UID: \"6f622f7a-7d90-4dd1-af3f-3f27ebad181a\") " pod="openshift-ingress-canary/ingress-canary-q7rg6" Apr 24 22:30:51.071743 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:51.071630 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:30:51.071743 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:51.071642 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:30:51.071743 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:51.071680 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/6f622f7a-7d90-4dd1-af3f-3f27ebad181a-cert podName:6f622f7a-7d90-4dd1-af3f-3f27ebad181a nodeName:}" failed. No retries permitted until 2026-04-24 22:30:55.071664929 +0000 UTC m=+41.121520784 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6f622f7a-7d90-4dd1-af3f-3f27ebad181a-cert") pod "ingress-canary-q7rg6" (UID: "6f622f7a-7d90-4dd1-af3f-3f27ebad181a") : secret "canary-serving-cert" not found Apr 24 22:30:51.071743 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:51.071702 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3ad74e7-fb86-475b-88a4-5b2f7848cd68-metrics-tls podName:b3ad74e7-fb86-475b-88a4-5b2f7848cd68 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:55.071685056 +0000 UTC m=+41.121540917 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b3ad74e7-fb86-475b-88a4-5b2f7848cd68-metrics-tls") pod "dns-default-mfpb9" (UID: "b3ad74e7-fb86-475b-88a4-5b2f7848cd68") : secret "dns-default-metrics-tls" not found Apr 24 22:30:55.099264 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:55.099228 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f622f7a-7d90-4dd1-af3f-3f27ebad181a-cert\") pod \"ingress-canary-q7rg6\" (UID: \"6f622f7a-7d90-4dd1-af3f-3f27ebad181a\") " pod="openshift-ingress-canary/ingress-canary-q7rg6" Apr 24 22:30:55.099669 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:30:55.099275 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3ad74e7-fb86-475b-88a4-5b2f7848cd68-metrics-tls\") pod \"dns-default-mfpb9\" (UID: \"b3ad74e7-fb86-475b-88a4-5b2f7848cd68\") " pod="openshift-dns/dns-default-mfpb9" Apr 24 22:30:55.099669 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:55.099376 2574 
secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:30:55.099669 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:55.099377 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:30:55.099669 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:55.099426 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3ad74e7-fb86-475b-88a4-5b2f7848cd68-metrics-tls podName:b3ad74e7-fb86-475b-88a4-5b2f7848cd68 nodeName:}" failed. No retries permitted until 2026-04-24 22:31:03.099413494 +0000 UTC m=+49.149269349 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b3ad74e7-fb86-475b-88a4-5b2f7848cd68-metrics-tls") pod "dns-default-mfpb9" (UID: "b3ad74e7-fb86-475b-88a4-5b2f7848cd68") : secret "dns-default-metrics-tls" not found Apr 24 22:30:55.099669 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:30:55.099439 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f622f7a-7d90-4dd1-af3f-3f27ebad181a-cert podName:6f622f7a-7d90-4dd1-af3f-3f27ebad181a nodeName:}" failed. No retries permitted until 2026-04-24 22:31:03.09943401 +0000 UTC m=+49.149289864 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6f622f7a-7d90-4dd1-af3f-3f27ebad181a-cert") pod "ingress-canary-q7rg6" (UID: "6f622f7a-7d90-4dd1-af3f-3f27ebad181a") : secret "canary-serving-cert" not found Apr 24 22:31:03.151957 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:03.151923 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f622f7a-7d90-4dd1-af3f-3f27ebad181a-cert\") pod \"ingress-canary-q7rg6\" (UID: \"6f622f7a-7d90-4dd1-af3f-3f27ebad181a\") " pod="openshift-ingress-canary/ingress-canary-q7rg6" Apr 24 22:31:03.152308 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:03.151968 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3ad74e7-fb86-475b-88a4-5b2f7848cd68-metrics-tls\") pod \"dns-default-mfpb9\" (UID: \"b3ad74e7-fb86-475b-88a4-5b2f7848cd68\") " pod="openshift-dns/dns-default-mfpb9" Apr 24 22:31:03.152308 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:03.152057 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:31:03.152308 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:03.152058 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:31:03.152308 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:03.152111 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f622f7a-7d90-4dd1-af3f-3f27ebad181a-cert podName:6f622f7a-7d90-4dd1-af3f-3f27ebad181a nodeName:}" failed. No retries permitted until 2026-04-24 22:31:19.152096202 +0000 UTC m=+65.201952057 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6f622f7a-7d90-4dd1-af3f-3f27ebad181a-cert") pod "ingress-canary-q7rg6" (UID: "6f622f7a-7d90-4dd1-af3f-3f27ebad181a") : secret "canary-serving-cert" not found Apr 24 22:31:03.152308 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:03.152124 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3ad74e7-fb86-475b-88a4-5b2f7848cd68-metrics-tls podName:b3ad74e7-fb86-475b-88a4-5b2f7848cd68 nodeName:}" failed. No retries permitted until 2026-04-24 22:31:19.152118333 +0000 UTC m=+65.201974188 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b3ad74e7-fb86-475b-88a4-5b2f7848cd68-metrics-tls") pod "dns-default-mfpb9" (UID: "b3ad74e7-fb86-475b-88a4-5b2f7848cd68") : secret "dns-default-metrics-tls" not found Apr 24 22:31:11.706204 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:11.706169 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xsvks" Apr 24 22:31:19.155570 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:19.155532 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3ad74e7-fb86-475b-88a4-5b2f7848cd68-metrics-tls\") pod \"dns-default-mfpb9\" (UID: \"b3ad74e7-fb86-475b-88a4-5b2f7848cd68\") " pod="openshift-dns/dns-default-mfpb9" Apr 24 22:31:19.155970 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:19.155597 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f622f7a-7d90-4dd1-af3f-3f27ebad181a-cert\") pod \"ingress-canary-q7rg6\" (UID: \"6f622f7a-7d90-4dd1-af3f-3f27ebad181a\") " pod="openshift-ingress-canary/ingress-canary-q7rg6" Apr 24 22:31:19.155970 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:19.155677 2574 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:31:19.155970 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:19.155679 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:31:19.155970 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:19.155731 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f622f7a-7d90-4dd1-af3f-3f27ebad181a-cert podName:6f622f7a-7d90-4dd1-af3f-3f27ebad181a nodeName:}" failed. No retries permitted until 2026-04-24 22:31:51.155716296 +0000 UTC m=+97.205572151 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6f622f7a-7d90-4dd1-af3f-3f27ebad181a-cert") pod "ingress-canary-q7rg6" (UID: "6f622f7a-7d90-4dd1-af3f-3f27ebad181a") : secret "canary-serving-cert" not found Apr 24 22:31:19.155970 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:19.155744 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3ad74e7-fb86-475b-88a4-5b2f7848cd68-metrics-tls podName:b3ad74e7-fb86-475b-88a4-5b2f7848cd68 nodeName:}" failed. No retries permitted until 2026-04-24 22:31:51.155737959 +0000 UTC m=+97.205593814 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b3ad74e7-fb86-475b-88a4-5b2f7848cd68-metrics-tls") pod "dns-default-mfpb9" (UID: "b3ad74e7-fb86-475b-88a4-5b2f7848cd68") : secret "dns-default-metrics-tls" not found Apr 24 22:31:20.263191 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:20.263136 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fd77a426-e63b-4027-97b7-e9893fd72601-metrics-certs\") pod \"network-metrics-daemon-n8ntw\" (UID: \"fd77a426-e63b-4027-97b7-e9893fd72601\") " pod="openshift-multus/network-metrics-daemon-n8ntw" Apr 24 22:31:20.263191 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:20.263193 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m4cst\" (UniqueName: \"kubernetes.io/projected/bc853d31-7ceb-415d-a71b-f18ee833a50c-kube-api-access-m4cst\") pod \"network-check-target-lh47p\" (UID: \"bc853d31-7ceb-415d-a71b-f18ee833a50c\") " pod="openshift-network-diagnostics/network-check-target-lh47p" Apr 24 22:31:20.265815 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:20.265795 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 22:31:20.265917 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:20.265848 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 22:31:20.274093 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:20.274077 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 22:31:20.274186 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:20.274131 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd77a426-e63b-4027-97b7-e9893fd72601-metrics-certs podName:fd77a426-e63b-4027-97b7-e9893fd72601 
nodeName:}" failed. No retries permitted until 2026-04-24 22:32:24.274115592 +0000 UTC m=+130.323971451 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fd77a426-e63b-4027-97b7-e9893fd72601-metrics-certs") pod "network-metrics-daemon-n8ntw" (UID: "fd77a426-e63b-4027-97b7-e9893fd72601") : secret "metrics-daemon-secret" not found Apr 24 22:31:20.276448 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:20.276433 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 22:31:20.287651 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:20.287631 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4cst\" (UniqueName: \"kubernetes.io/projected/bc853d31-7ceb-415d-a71b-f18ee833a50c-kube-api-access-m4cst\") pod \"network-check-target-lh47p\" (UID: \"bc853d31-7ceb-415d-a71b-f18ee833a50c\") " pod="openshift-network-diagnostics/network-check-target-lh47p" Apr 24 22:31:20.343336 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:20.343312 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-dndrr\"" Apr 24 22:31:20.350701 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:20.350677 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lh47p"
Apr 24 22:31:20.469441 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:20.469407 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-lh47p"]
Apr 24 22:31:20.473002 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:31:20.472976 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc853d31_7ceb_415d_a71b_f18ee833a50c.slice/crio-278fc14bff2baf19deeb9ce27bac9d14109a638fff3db3d764a4e2d92d45e1ee WatchSource:0}: Error finding container 278fc14bff2baf19deeb9ce27bac9d14109a638fff3db3d764a4e2d92d45e1ee: Status 404 returned error can't find the container with id 278fc14bff2baf19deeb9ce27bac9d14109a638fff3db3d764a4e2d92d45e1ee
Apr 24 22:31:20.775625 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:20.775592 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-lh47p" event={"ID":"bc853d31-7ceb-415d-a71b-f18ee833a50c","Type":"ContainerStarted","Data":"278fc14bff2baf19deeb9ce27bac9d14109a638fff3db3d764a4e2d92d45e1ee"}
Apr 24 22:31:23.783202 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:23.783169 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-lh47p" event={"ID":"bc853d31-7ceb-415d-a71b-f18ee833a50c","Type":"ContainerStarted","Data":"58740b205f67005e8833a9da7800ec3098d605723a3bf55c498e176e01517cb4"}
Apr 24 22:31:23.783554 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:23.783295 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-lh47p"
Apr 24 22:31:23.801637 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:23.801595 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-lh47p" podStartSLOduration=67.224867571 podStartE2EDuration="1m9.801582564s" podCreationTimestamp="2026-04-24 22:30:14 +0000 UTC" firstStartedPulling="2026-04-24 22:31:20.474760297 +0000 UTC m=+66.524616151" lastFinishedPulling="2026-04-24 22:31:23.051475285 +0000 UTC m=+69.101331144" observedRunningTime="2026-04-24 22:31:23.801506943 +0000 UTC m=+69.851362812" watchObservedRunningTime="2026-04-24 22:31:23.801582564 +0000 UTC m=+69.851438462"
Apr 24 22:31:50.643383 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.643323 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bzh27"]
Apr 24 22:31:50.648051 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.648027 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bzh27"
Apr 24 22:31:50.650757 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.650725 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 24 22:31:50.650888 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.650781 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-7gfj9\""
Apr 24 22:31:50.651830 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.651806 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 24 22:31:50.651943 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.651812 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 24 22:31:50.655542 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.655505 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bzh27"]
Apr 24 22:31:50.750103 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.750036 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-z5pwv"]
Apr 24 22:31:50.752838 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.752821 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-6898f6cd9b-c68zq"]
Apr 24 22:31:50.752980 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.752961 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-z5pwv"
Apr 24 22:31:50.755286 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.755268 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-6nnq9"]
Apr 24 22:31:50.755390 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.755346 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-6898f6cd9b-c68zq"
Apr 24 22:31:50.755448 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.755431 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 24 22:31:50.755650 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.755437 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 24 22:31:50.755734 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.755719 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-jt44s\""
Apr 24 22:31:50.757684 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.757667 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-t6zmz\""
Apr 24 22:31:50.757935 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.757918 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 24 22:31:50.757935 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.757918 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 24 22:31:50.758087 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.757924 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-22dqh"]
Apr 24 22:31:50.758087 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.758012 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-6nnq9"
Apr 24 22:31:50.758333 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.758312 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 24 22:31:50.758518 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.758503 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 24 22:31:50.758596 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.758517 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 24 22:31:50.758653 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.758609 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 24 22:31:50.759455 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.759389 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsk2s\" (UniqueName: \"kubernetes.io/projected/5469c5ef-21d3-4db2-8b90-fe6fca022351-kube-api-access-hsk2s\") pod \"cluster-samples-operator-6dc5bdb6b4-bzh27\" (UID: \"5469c5ef-21d3-4db2-8b90-fe6fca022351\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bzh27"
Apr 24 22:31:50.759516 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.759455 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5469c5ef-21d3-4db2-8b90-fe6fca022351-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bzh27\" (UID: \"5469c5ef-21d3-4db2-8b90-fe6fca022351\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bzh27"
Apr 24 22:31:50.760385 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.760365 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 24 22:31:50.760470 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.760395 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 24 22:31:50.760606 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.760590 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 24 22:31:50.760679 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.760657 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-22dqh"
Apr 24 22:31:50.760744 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.760597 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-tp7jm\""
Apr 24 22:31:50.761581 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.761509 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 24 22:31:50.762322 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.762304 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-z5pwv"]
Apr 24 22:31:50.763306 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.763287 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-6nnq9"]
Apr 24 22:31:50.764478 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.764454 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 24 22:31:50.764585 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.764545 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 24 22:31:50.764658 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.764586 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 24 22:31:50.764754 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.764729 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-8bzn5\""
Apr 24 22:31:50.764849 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.764828 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 24 22:31:50.766712 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.766693 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-6898f6cd9b-c68zq"]
Apr 24 22:31:50.769051 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.769034 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 24 22:31:50.775169 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.775150 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-22dqh"]
Apr 24 22:31:50.845908 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.845863 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v29tm"]
Apr 24 22:31:50.848798 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.848783 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v29tm"
Apr 24 22:31:50.851132 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.851117 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 24 22:31:50.851305 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.851289 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 24 22:31:50.851366 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.851296 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 24 22:31:50.851623 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.851606 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-75nrc\""
Apr 24 22:31:50.851677 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.851611 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 24 22:31:50.860413 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.860393 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6aaf0fb0-1f4a-46f7-a1db-82394fa8792a-config\") pod \"console-operator-9d4b6777b-6nnq9\" (UID: \"6aaf0fb0-1f4a-46f7-a1db-82394fa8792a\") " pod="openshift-console-operator/console-operator-9d4b6777b-6nnq9"
Apr 24 22:31:50.860512 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.860420 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-stats-auth\") pod \"router-default-6898f6cd9b-c68zq\" (UID: \"b3432e60-bf25-4ee8-876a-a1ee3c4b5846\") " pod="openshift-ingress/router-default-6898f6cd9b-c68zq"
Apr 24 22:31:50.860512 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.860444 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-metrics-certs\") pod \"router-default-6898f6cd9b-c68zq\" (UID: \"b3432e60-bf25-4ee8-876a-a1ee3c4b5846\") " pod="openshift-ingress/router-default-6898f6cd9b-c68zq"
Apr 24 22:31:50.860512 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.860481 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h4rt\" (UniqueName: \"kubernetes.io/projected/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-kube-api-access-4h4rt\") pod \"router-default-6898f6cd9b-c68zq\" (UID: \"b3432e60-bf25-4ee8-876a-a1ee3c4b5846\") " pod="openshift-ingress/router-default-6898f6cd9b-c68zq"
Apr 24 22:31:50.860639 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.860560 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hsk2s\" (UniqueName: \"kubernetes.io/projected/5469c5ef-21d3-4db2-8b90-fe6fca022351-kube-api-access-hsk2s\") pod \"cluster-samples-operator-6dc5bdb6b4-bzh27\" (UID: \"5469c5ef-21d3-4db2-8b90-fe6fca022351\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bzh27"
Apr 24 22:31:50.860639 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.860583 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm64f\" (UniqueName: \"kubernetes.io/projected/6aaf0fb0-1f4a-46f7-a1db-82394fa8792a-kube-api-access-xm64f\") pod \"console-operator-9d4b6777b-6nnq9\" (UID: \"6aaf0fb0-1f4a-46f7-a1db-82394fa8792a\") " pod="openshift-console-operator/console-operator-9d4b6777b-6nnq9"
Apr 24 22:31:50.860639 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.860614 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7szwr\" (UniqueName: \"kubernetes.io/projected/b2412ae3-7a69-4505-b419-5d96e93a567c-kube-api-access-7szwr\") pod \"cluster-monitoring-operator-75587bd455-22dqh\" (UID: \"b2412ae3-7a69-4505-b419-5d96e93a567c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-22dqh"
Apr 24 22:31:50.860780 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.860640 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5469c5ef-21d3-4db2-8b90-fe6fca022351-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bzh27\" (UID: \"5469c5ef-21d3-4db2-8b90-fe6fca022351\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bzh27"
Apr 24 22:31:50.860780 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.860663 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/b2412ae3-7a69-4505-b419-5d96e93a567c-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-22dqh\" (UID: \"b2412ae3-7a69-4505-b419-5d96e93a567c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-22dqh"
Apr 24 22:31:50.860780 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.860715 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2412ae3-7a69-4505-b419-5d96e93a567c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-22dqh\" (UID: \"b2412ae3-7a69-4505-b419-5d96e93a567c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-22dqh"
Apr 24 22:31:50.860780 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:50.860748 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 22:31:50.860780 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.860755 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6aaf0fb0-1f4a-46f7-a1db-82394fa8792a-serving-cert\") pod \"console-operator-9d4b6777b-6nnq9\" (UID: \"6aaf0fb0-1f4a-46f7-a1db-82394fa8792a\") " pod="openshift-console-operator/console-operator-9d4b6777b-6nnq9"
Apr 24 22:31:50.861055 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:50.860809 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5469c5ef-21d3-4db2-8b90-fe6fca022351-samples-operator-tls podName:5469c5ef-21d3-4db2-8b90-fe6fca022351 nodeName:}" failed. No retries permitted until 2026-04-24 22:31:51.36079622 +0000 UTC m=+97.410652080 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/5469c5ef-21d3-4db2-8b90-fe6fca022351-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-bzh27" (UID: "5469c5ef-21d3-4db2-8b90-fe6fca022351") : secret "samples-operator-tls" not found
Apr 24 22:31:50.861055 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.860859 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97shs\" (UniqueName: \"kubernetes.io/projected/489b688c-0857-4d11-95f2-90cdf9578a51-kube-api-access-97shs\") pod \"volume-data-source-validator-7c6cbb6c87-z5pwv\" (UID: \"489b688c-0857-4d11-95f2-90cdf9578a51\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-z5pwv"
Apr 24 22:31:50.861055 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.860900 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6aaf0fb0-1f4a-46f7-a1db-82394fa8792a-trusted-ca\") pod \"console-operator-9d4b6777b-6nnq9\" (UID: \"6aaf0fb0-1f4a-46f7-a1db-82394fa8792a\") " pod="openshift-console-operator/console-operator-9d4b6777b-6nnq9"
Apr 24 22:31:50.861055 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.860960 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-default-certificate\") pod \"router-default-6898f6cd9b-c68zq\" (UID: \"b3432e60-bf25-4ee8-876a-a1ee3c4b5846\") " pod="openshift-ingress/router-default-6898f6cd9b-c68zq"
Apr 24 22:31:50.861055 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.860963 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v29tm"]
Apr 24 22:31:50.861055 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.861003 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-service-ca-bundle\") pod \"router-default-6898f6cd9b-c68zq\" (UID: \"b3432e60-bf25-4ee8-876a-a1ee3c4b5846\") " pod="openshift-ingress/router-default-6898f6cd9b-c68zq"
Apr 24 22:31:50.871740 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.871719 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsk2s\" (UniqueName: \"kubernetes.io/projected/5469c5ef-21d3-4db2-8b90-fe6fca022351-kube-api-access-hsk2s\") pod \"cluster-samples-operator-6dc5bdb6b4-bzh27\" (UID: \"5469c5ef-21d3-4db2-8b90-fe6fca022351\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bzh27"
Apr 24 22:31:50.961624 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.961594 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f025f22-b0e0-48ff-8928-ed22d22ab622-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-v29tm\" (UID: \"9f025f22-b0e0-48ff-8928-ed22d22ab622\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v29tm"
Apr 24 22:31:50.961724 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.961632 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xm64f\" (UniqueName: \"kubernetes.io/projected/6aaf0fb0-1f4a-46f7-a1db-82394fa8792a-kube-api-access-xm64f\") pod \"console-operator-9d4b6777b-6nnq9\" (UID: \"6aaf0fb0-1f4a-46f7-a1db-82394fa8792a\") " pod="openshift-console-operator/console-operator-9d4b6777b-6nnq9"
Apr 24 22:31:50.961724 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.961663 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7szwr\" (UniqueName: \"kubernetes.io/projected/b2412ae3-7a69-4505-b419-5d96e93a567c-kube-api-access-7szwr\") pod \"cluster-monitoring-operator-75587bd455-22dqh\" (UID: \"b2412ae3-7a69-4505-b419-5d96e93a567c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-22dqh"
Apr 24 22:31:50.961724 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.961679 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f025f22-b0e0-48ff-8928-ed22d22ab622-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-v29tm\" (UID: \"9f025f22-b0e0-48ff-8928-ed22d22ab622\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v29tm"
Apr 24 22:31:50.961843 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.961818 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79q9m\" (UniqueName: \"kubernetes.io/projected/9f025f22-b0e0-48ff-8928-ed22d22ab622-kube-api-access-79q9m\") pod \"kube-storage-version-migrator-operator-6769c5d45-v29tm\" (UID: \"9f025f22-b0e0-48ff-8928-ed22d22ab622\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v29tm"
Apr 24 22:31:50.961911 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.961898 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/b2412ae3-7a69-4505-b419-5d96e93a567c-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-22dqh\" (UID: \"b2412ae3-7a69-4505-b419-5d96e93a567c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-22dqh"
Apr 24 22:31:50.961958 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.961932 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2412ae3-7a69-4505-b419-5d96e93a567c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-22dqh\" (UID: \"b2412ae3-7a69-4505-b419-5d96e93a567c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-22dqh"
Apr 24 22:31:50.962000 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.961959 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6aaf0fb0-1f4a-46f7-a1db-82394fa8792a-serving-cert\") pod \"console-operator-9d4b6777b-6nnq9\" (UID: \"6aaf0fb0-1f4a-46f7-a1db-82394fa8792a\") " pod="openshift-console-operator/console-operator-9d4b6777b-6nnq9"
Apr 24 22:31:50.962000 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.961989 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97shs\" (UniqueName: \"kubernetes.io/projected/489b688c-0857-4d11-95f2-90cdf9578a51-kube-api-access-97shs\") pod \"volume-data-source-validator-7c6cbb6c87-z5pwv\" (UID: \"489b688c-0857-4d11-95f2-90cdf9578a51\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-z5pwv"
Apr 24 22:31:50.962081 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.962012 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6aaf0fb0-1f4a-46f7-a1db-82394fa8792a-trusted-ca\") pod \"console-operator-9d4b6777b-6nnq9\" (UID: \"6aaf0fb0-1f4a-46f7-a1db-82394fa8792a\") " pod="openshift-console-operator/console-operator-9d4b6777b-6nnq9"
Apr 24 22:31:50.962081 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.962040 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-default-certificate\") pod \"router-default-6898f6cd9b-c68zq\" (UID: \"b3432e60-bf25-4ee8-876a-a1ee3c4b5846\") " pod="openshift-ingress/router-default-6898f6cd9b-c68zq"
Apr 24 22:31:50.962081 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:50.962049 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 22:31:50.962226 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.962064 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-service-ca-bundle\") pod \"router-default-6898f6cd9b-c68zq\" (UID: \"b3432e60-bf25-4ee8-876a-a1ee3c4b5846\") " pod="openshift-ingress/router-default-6898f6cd9b-c68zq"
Apr 24 22:31:50.962226 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:50.962107 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2412ae3-7a69-4505-b419-5d96e93a567c-cluster-monitoring-operator-tls podName:b2412ae3-7a69-4505-b419-5d96e93a567c nodeName:}" failed. No retries permitted until 2026-04-24 22:31:51.462088656 +0000 UTC m=+97.511944513 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b2412ae3-7a69-4505-b419-5d96e93a567c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-22dqh" (UID: "b2412ae3-7a69-4505-b419-5d96e93a567c") : secret "cluster-monitoring-operator-tls" not found
Apr 24 22:31:50.962226 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.962140 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6aaf0fb0-1f4a-46f7-a1db-82394fa8792a-config\") pod \"console-operator-9d4b6777b-6nnq9\" (UID: \"6aaf0fb0-1f4a-46f7-a1db-82394fa8792a\") " pod="openshift-console-operator/console-operator-9d4b6777b-6nnq9"
Apr 24 22:31:50.962226 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.962165 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-stats-auth\") pod \"router-default-6898f6cd9b-c68zq\" (UID: \"b3432e60-bf25-4ee8-876a-a1ee3c4b5846\") " pod="openshift-ingress/router-default-6898f6cd9b-c68zq"
Apr 24 22:31:50.962226 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.962187 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-metrics-certs\") pod \"router-default-6898f6cd9b-c68zq\" (UID: \"b3432e60-bf25-4ee8-876a-a1ee3c4b5846\") " pod="openshift-ingress/router-default-6898f6cd9b-c68zq"
Apr 24 22:31:50.962226 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.962213 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4h4rt\" (UniqueName: \"kubernetes.io/projected/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-kube-api-access-4h4rt\") pod \"router-default-6898f6cd9b-c68zq\" (UID: \"b3432e60-bf25-4ee8-876a-a1ee3c4b5846\") " pod="openshift-ingress/router-default-6898f6cd9b-c68zq"
Apr 24 22:31:50.962588 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:50.962437 2574 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 22:31:50.962588 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:50.962481 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-metrics-certs podName:b3432e60-bf25-4ee8-876a-a1ee3c4b5846 nodeName:}" failed. No retries permitted until 2026-04-24 22:31:51.462466792 +0000 UTC m=+97.512322661 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-metrics-certs") pod "router-default-6898f6cd9b-c68zq" (UID: "b3432e60-bf25-4ee8-876a-a1ee3c4b5846") : secret "router-metrics-certs-default" not found
Apr 24 22:31:50.962696 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:50.962643 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-service-ca-bundle podName:b3432e60-bf25-4ee8-876a-a1ee3c4b5846 nodeName:}" failed. No retries permitted until 2026-04-24 22:31:51.462629677 +0000 UTC m=+97.512485533 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-service-ca-bundle") pod "router-default-6898f6cd9b-c68zq" (UID: "b3432e60-bf25-4ee8-876a-a1ee3c4b5846") : configmap references non-existent config key: service-ca.crt
Apr 24 22:31:50.962763 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.962714 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/b2412ae3-7a69-4505-b419-5d96e93a567c-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-22dqh\" (UID: \"b2412ae3-7a69-4505-b419-5d96e93a567c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-22dqh"
Apr 24 22:31:50.962926 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.962904 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6aaf0fb0-1f4a-46f7-a1db-82394fa8792a-config\") pod \"console-operator-9d4b6777b-6nnq9\" (UID: \"6aaf0fb0-1f4a-46f7-a1db-82394fa8792a\") " pod="openshift-console-operator/console-operator-9d4b6777b-6nnq9"
Apr 24 22:31:50.963074 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.963055 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6aaf0fb0-1f4a-46f7-a1db-82394fa8792a-trusted-ca\") pod \"console-operator-9d4b6777b-6nnq9\" (UID: \"6aaf0fb0-1f4a-46f7-a1db-82394fa8792a\") " pod="openshift-console-operator/console-operator-9d4b6777b-6nnq9"
Apr 24 22:31:50.964579 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.964552 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-stats-auth\") pod \"router-default-6898f6cd9b-c68zq\" (UID: \"b3432e60-bf25-4ee8-876a-a1ee3c4b5846\") " pod="openshift-ingress/router-default-6898f6cd9b-c68zq"
Apr 24 22:31:50.964793 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.964773 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6aaf0fb0-1f4a-46f7-a1db-82394fa8792a-serving-cert\") pod \"console-operator-9d4b6777b-6nnq9\" (UID: \"6aaf0fb0-1f4a-46f7-a1db-82394fa8792a\") " pod="openshift-console-operator/console-operator-9d4b6777b-6nnq9"
Apr 24 22:31:50.964831 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.964783 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-default-certificate\") pod \"router-default-6898f6cd9b-c68zq\" (UID: \"b3432e60-bf25-4ee8-876a-a1ee3c4b5846\") " pod="openshift-ingress/router-default-6898f6cd9b-c68zq"
Apr 24 22:31:50.979047 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.979025 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7szwr\" (UniqueName: \"kubernetes.io/projected/b2412ae3-7a69-4505-b419-5d96e93a567c-kube-api-access-7szwr\") pod \"cluster-monitoring-operator-75587bd455-22dqh\" (UID: \"b2412ae3-7a69-4505-b419-5d96e93a567c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-22dqh"
Apr 24 22:31:50.981183 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.981161 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h4rt\" (UniqueName: \"kubernetes.io/projected/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-kube-api-access-4h4rt\") pod \"router-default-6898f6cd9b-c68zq\" (UID: \"b3432e60-bf25-4ee8-876a-a1ee3c4b5846\") " pod="openshift-ingress/router-default-6898f6cd9b-c68zq"
Apr 24 22:31:50.983827 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.983806 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-97shs\" (UniqueName: \"kubernetes.io/projected/489b688c-0857-4d11-95f2-90cdf9578a51-kube-api-access-97shs\") pod \"volume-data-source-validator-7c6cbb6c87-z5pwv\" (UID: \"489b688c-0857-4d11-95f2-90cdf9578a51\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-z5pwv"
Apr 24 22:31:50.983923 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:50.983823 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm64f\" (UniqueName: \"kubernetes.io/projected/6aaf0fb0-1f4a-46f7-a1db-82394fa8792a-kube-api-access-xm64f\") pod \"console-operator-9d4b6777b-6nnq9\" (UID: \"6aaf0fb0-1f4a-46f7-a1db-82394fa8792a\") " pod="openshift-console-operator/console-operator-9d4b6777b-6nnq9"
Apr 24 22:31:51.062936 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:51.062916 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-79q9m\" (UniqueName: \"kubernetes.io/projected/9f025f22-b0e0-48ff-8928-ed22d22ab622-kube-api-access-79q9m\") pod \"kube-storage-version-migrator-operator-6769c5d45-v29tm\" (UID: \"9f025f22-b0e0-48ff-8928-ed22d22ab622\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v29tm"
Apr 24 22:31:51.063025 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:51.062985 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f025f22-b0e0-48ff-8928-ed22d22ab622-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-v29tm\" (UID: \"9f025f22-b0e0-48ff-8928-ed22d22ab622\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v29tm"
Apr 24 22:31:51.063025 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:51.063017 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f025f22-b0e0-48ff-8928-ed22d22ab622-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-v29tm\" (UID: \"9f025f22-b0e0-48ff-8928-ed22d22ab622\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v29tm"
Apr 24 22:31:51.063514 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:51.063498 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-z5pwv"
Apr 24 22:31:51.064800 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:51.064784 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f025f22-b0e0-48ff-8928-ed22d22ab622-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-v29tm\" (UID: \"9f025f22-b0e0-48ff-8928-ed22d22ab622\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v29tm"
Apr 24 22:31:51.067291 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:51.067272 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f025f22-b0e0-48ff-8928-ed22d22ab622-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-v29tm\" (UID: \"9f025f22-b0e0-48ff-8928-ed22d22ab622\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v29tm"
Apr 24 22:31:51.070357 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:51.070339 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-79q9m\" (UniqueName: \"kubernetes.io/projected/9f025f22-b0e0-48ff-8928-ed22d22ab622-kube-api-access-79q9m\") pod \"kube-storage-version-migrator-operator-6769c5d45-v29tm\" (UID: \"9f025f22-b0e0-48ff-8928-ed22d22ab622\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v29tm"
Apr 24 22:31:51.078351 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:51.078334 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-6nnq9" Apr 24 22:31:51.157238 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:51.157209 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v29tm" Apr 24 22:31:51.164561 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:51.164270 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f622f7a-7d90-4dd1-af3f-3f27ebad181a-cert\") pod \"ingress-canary-q7rg6\" (UID: \"6f622f7a-7d90-4dd1-af3f-3f27ebad181a\") " pod="openshift-ingress-canary/ingress-canary-q7rg6" Apr 24 22:31:51.164561 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:51.164345 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3ad74e7-fb86-475b-88a4-5b2f7848cd68-metrics-tls\") pod \"dns-default-mfpb9\" (UID: \"b3ad74e7-fb86-475b-88a4-5b2f7848cd68\") " pod="openshift-dns/dns-default-mfpb9" Apr 24 22:31:51.164561 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:51.164409 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:31:51.164561 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:51.164480 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f622f7a-7d90-4dd1-af3f-3f27ebad181a-cert podName:6f622f7a-7d90-4dd1-af3f-3f27ebad181a nodeName:}" failed. No retries permitted until 2026-04-24 22:32:55.16445838 +0000 UTC m=+161.214314243 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6f622f7a-7d90-4dd1-af3f-3f27ebad181a-cert") pod "ingress-canary-q7rg6" (UID: "6f622f7a-7d90-4dd1-af3f-3f27ebad181a") : secret "canary-serving-cert" not found Apr 24 22:31:51.164561 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:51.164537 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:31:51.164818 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:51.164601 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3ad74e7-fb86-475b-88a4-5b2f7848cd68-metrics-tls podName:b3ad74e7-fb86-475b-88a4-5b2f7848cd68 nodeName:}" failed. No retries permitted until 2026-04-24 22:32:55.164585289 +0000 UTC m=+161.214441145 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b3ad74e7-fb86-475b-88a4-5b2f7848cd68-metrics-tls") pod "dns-default-mfpb9" (UID: "b3ad74e7-fb86-475b-88a4-5b2f7848cd68") : secret "dns-default-metrics-tls" not found Apr 24 22:31:51.179505 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:51.179473 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-z5pwv"] Apr 24 22:31:51.182733 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:31:51.182699 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod489b688c_0857_4d11_95f2_90cdf9578a51.slice/crio-0727c5547738f3a35b81e9b30728a71924f8b79d7a1f693b9b2b83249c4db5a5 WatchSource:0}: Error finding container 0727c5547738f3a35b81e9b30728a71924f8b79d7a1f693b9b2b83249c4db5a5: Status 404 returned error can't find the container with id 0727c5547738f3a35b81e9b30728a71924f8b79d7a1f693b9b2b83249c4db5a5 Apr 24 22:31:51.195210 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:51.195187 2574 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-6nnq9"] Apr 24 22:31:51.198548 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:31:51.198521 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6aaf0fb0_1f4a_46f7_a1db_82394fa8792a.slice/crio-28869cda2c864526a705cc0bb091ad6a2966d477fb6f5a436fef3eece80cb2cd WatchSource:0}: Error finding container 28869cda2c864526a705cc0bb091ad6a2966d477fb6f5a436fef3eece80cb2cd: Status 404 returned error can't find the container with id 28869cda2c864526a705cc0bb091ad6a2966d477fb6f5a436fef3eece80cb2cd Apr 24 22:31:51.268275 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:51.268249 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v29tm"] Apr 24 22:31:51.271621 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:31:51.271599 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f025f22_b0e0_48ff_8928_ed22d22ab622.slice/crio-ee3a6d002ce6902e58dbc38fe96e883c058da8025626a48017d8788d3bf9f06b WatchSource:0}: Error finding container ee3a6d002ce6902e58dbc38fe96e883c058da8025626a48017d8788d3bf9f06b: Status 404 returned error can't find the container with id ee3a6d002ce6902e58dbc38fe96e883c058da8025626a48017d8788d3bf9f06b Apr 24 22:31:51.366660 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:51.366596 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5469c5ef-21d3-4db2-8b90-fe6fca022351-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bzh27\" (UID: \"5469c5ef-21d3-4db2-8b90-fe6fca022351\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bzh27" Apr 24 22:31:51.366759 ip-10-0-129-176 kubenswrapper[2574]: E0424 
22:31:51.366708 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 22:31:51.366805 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:51.366763 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5469c5ef-21d3-4db2-8b90-fe6fca022351-samples-operator-tls podName:5469c5ef-21d3-4db2-8b90-fe6fca022351 nodeName:}" failed. No retries permitted until 2026-04-24 22:31:52.366749011 +0000 UTC m=+98.416604865 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/5469c5ef-21d3-4db2-8b90-fe6fca022351-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-bzh27" (UID: "5469c5ef-21d3-4db2-8b90-fe6fca022351") : secret "samples-operator-tls" not found Apr 24 22:31:51.467300 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:51.467276 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2412ae3-7a69-4505-b419-5d96e93a567c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-22dqh\" (UID: \"b2412ae3-7a69-4505-b419-5d96e93a567c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-22dqh" Apr 24 22:31:51.467385 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:51.467311 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-service-ca-bundle\") pod \"router-default-6898f6cd9b-c68zq\" (UID: \"b3432e60-bf25-4ee8-876a-a1ee3c4b5846\") " pod="openshift-ingress/router-default-6898f6cd9b-c68zq" Apr 24 22:31:51.467385 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:51.467335 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-metrics-certs\") pod \"router-default-6898f6cd9b-c68zq\" (UID: \"b3432e60-bf25-4ee8-876a-a1ee3c4b5846\") " pod="openshift-ingress/router-default-6898f6cd9b-c68zq" Apr 24 22:31:51.467457 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:51.467444 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-service-ca-bundle podName:b3432e60-bf25-4ee8-876a-a1ee3c4b5846 nodeName:}" failed. No retries permitted until 2026-04-24 22:31:52.46742629 +0000 UTC m=+98.517282144 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-service-ca-bundle") pod "router-default-6898f6cd9b-c68zq" (UID: "b3432e60-bf25-4ee8-876a-a1ee3c4b5846") : configmap references non-existent config key: service-ca.crt Apr 24 22:31:51.467510 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:51.467445 2574 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 22:31:51.467510 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:51.467471 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 22:31:51.467510 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:51.467478 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-metrics-certs podName:b3432e60-bf25-4ee8-876a-a1ee3c4b5846 nodeName:}" failed. No retries permitted until 2026-04-24 22:31:52.467472026 +0000 UTC m=+98.517327881 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-metrics-certs") pod "router-default-6898f6cd9b-c68zq" (UID: "b3432e60-bf25-4ee8-876a-a1ee3c4b5846") : secret "router-metrics-certs-default" not found Apr 24 22:31:51.467613 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:51.467531 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2412ae3-7a69-4505-b419-5d96e93a567c-cluster-monitoring-operator-tls podName:b2412ae3-7a69-4505-b419-5d96e93a567c nodeName:}" failed. No retries permitted until 2026-04-24 22:31:52.467516269 +0000 UTC m=+98.517372136 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b2412ae3-7a69-4505-b419-5d96e93a567c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-22dqh" (UID: "b2412ae3-7a69-4505-b419-5d96e93a567c") : secret "cluster-monitoring-operator-tls" not found Apr 24 22:31:51.837459 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:51.837421 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v29tm" event={"ID":"9f025f22-b0e0-48ff-8928-ed22d22ab622","Type":"ContainerStarted","Data":"ee3a6d002ce6902e58dbc38fe96e883c058da8025626a48017d8788d3bf9f06b"} Apr 24 22:31:51.838610 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:51.838554 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-6nnq9" event={"ID":"6aaf0fb0-1f4a-46f7-a1db-82394fa8792a","Type":"ContainerStarted","Data":"28869cda2c864526a705cc0bb091ad6a2966d477fb6f5a436fef3eece80cb2cd"} Apr 24 22:31:51.840205 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:51.840158 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-z5pwv" event={"ID":"489b688c-0857-4d11-95f2-90cdf9578a51","Type":"ContainerStarted","Data":"0727c5547738f3a35b81e9b30728a71924f8b79d7a1f693b9b2b83249c4db5a5"} Apr 24 22:31:52.374836 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:52.374707 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5469c5ef-21d3-4db2-8b90-fe6fca022351-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bzh27\" (UID: \"5469c5ef-21d3-4db2-8b90-fe6fca022351\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bzh27" Apr 24 22:31:52.375050 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:52.374887 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 22:31:52.375050 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:52.374967 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5469c5ef-21d3-4db2-8b90-fe6fca022351-samples-operator-tls podName:5469c5ef-21d3-4db2-8b90-fe6fca022351 nodeName:}" failed. No retries permitted until 2026-04-24 22:31:54.374944743 +0000 UTC m=+100.424800600 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/5469c5ef-21d3-4db2-8b90-fe6fca022351-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-bzh27" (UID: "5469c5ef-21d3-4db2-8b90-fe6fca022351") : secret "samples-operator-tls" not found Apr 24 22:31:52.475748 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:52.475678 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2412ae3-7a69-4505-b419-5d96e93a567c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-22dqh\" (UID: \"b2412ae3-7a69-4505-b419-5d96e93a567c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-22dqh" Apr 24 22:31:52.475748 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:52.475775 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-service-ca-bundle\") pod \"router-default-6898f6cd9b-c68zq\" (UID: \"b3432e60-bf25-4ee8-876a-a1ee3c4b5846\") " pod="openshift-ingress/router-default-6898f6cd9b-c68zq" Apr 24 22:31:52.476081 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:52.475853 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 22:31:52.476081 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:52.475914 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-service-ca-bundle podName:b3432e60-bf25-4ee8-876a-a1ee3c4b5846 nodeName:}" failed. No retries permitted until 2026-04-24 22:31:54.475894905 +0000 UTC m=+100.525750777 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-service-ca-bundle") pod "router-default-6898f6cd9b-c68zq" (UID: "b3432e60-bf25-4ee8-876a-a1ee3c4b5846") : configmap references non-existent config key: service-ca.crt Apr 24 22:31:52.476081 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:52.475941 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-metrics-certs\") pod \"router-default-6898f6cd9b-c68zq\" (UID: \"b3432e60-bf25-4ee8-876a-a1ee3c4b5846\") " pod="openshift-ingress/router-default-6898f6cd9b-c68zq" Apr 24 22:31:52.476081 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:52.476017 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2412ae3-7a69-4505-b419-5d96e93a567c-cluster-monitoring-operator-tls podName:b2412ae3-7a69-4505-b419-5d96e93a567c nodeName:}" failed. No retries permitted until 2026-04-24 22:31:54.476000651 +0000 UTC m=+100.525856507 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b2412ae3-7a69-4505-b419-5d96e93a567c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-22dqh" (UID: "b2412ae3-7a69-4505-b419-5d96e93a567c") : secret "cluster-monitoring-operator-tls" not found Apr 24 22:31:52.476081 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:52.476083 2574 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 22:31:52.476431 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:52.476119 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-metrics-certs podName:b3432e60-bf25-4ee8-876a-a1ee3c4b5846 nodeName:}" failed. 
No retries permitted until 2026-04-24 22:31:54.476107518 +0000 UTC m=+100.525963378 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-metrics-certs") pod "router-default-6898f6cd9b-c68zq" (UID: "b3432e60-bf25-4ee8-876a-a1ee3c4b5846") : secret "router-metrics-certs-default" not found Apr 24 22:31:53.845547 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:53.845518 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v29tm" event={"ID":"9f025f22-b0e0-48ff-8928-ed22d22ab622","Type":"ContainerStarted","Data":"aea56cdb8d55ea5e2c24edb99ac1e576201133af7053e1703730266352d065ba"} Apr 24 22:31:53.847014 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:53.846994 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6nnq9_6aaf0fb0-1f4a-46f7-a1db-82394fa8792a/console-operator/0.log" Apr 24 22:31:53.847123 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:53.847033 2574 generic.go:358] "Generic (PLEG): container finished" podID="6aaf0fb0-1f4a-46f7-a1db-82394fa8792a" containerID="fc2d6500767d8d484369a15cc5fbb029908214f0f0865b04f60d052e8d1d4f61" exitCode=255 Apr 24 22:31:53.847123 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:53.847102 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-6nnq9" event={"ID":"6aaf0fb0-1f4a-46f7-a1db-82394fa8792a","Type":"ContainerDied","Data":"fc2d6500767d8d484369a15cc5fbb029908214f0f0865b04f60d052e8d1d4f61"} Apr 24 22:31:53.847290 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:53.847272 2574 scope.go:117] "RemoveContainer" containerID="fc2d6500767d8d484369a15cc5fbb029908214f0f0865b04f60d052e8d1d4f61" Apr 24 22:31:53.848603 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:53.848414 2574 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-z5pwv" event={"ID":"489b688c-0857-4d11-95f2-90cdf9578a51","Type":"ContainerStarted","Data":"3c2476b3260bd24d88f76ad64a909c4be5002d01ee0f871561a10cfb3f280e89"} Apr 24 22:31:53.863968 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:53.863926 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v29tm" podStartSLOduration=1.5273854 podStartE2EDuration="3.863911588s" podCreationTimestamp="2026-04-24 22:31:50 +0000 UTC" firstStartedPulling="2026-04-24 22:31:51.273383608 +0000 UTC m=+97.323239463" lastFinishedPulling="2026-04-24 22:31:53.609909795 +0000 UTC m=+99.659765651" observedRunningTime="2026-04-24 22:31:53.862420087 +0000 UTC m=+99.912275965" watchObservedRunningTime="2026-04-24 22:31:53.863911588 +0000 UTC m=+99.913767458" Apr 24 22:31:53.897645 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:53.897597 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-z5pwv" podStartSLOduration=2.466353233 podStartE2EDuration="3.897587461s" podCreationTimestamp="2026-04-24 22:31:50 +0000 UTC" firstStartedPulling="2026-04-24 22:31:51.184623048 +0000 UTC m=+97.234478906" lastFinishedPulling="2026-04-24 22:31:52.615857269 +0000 UTC m=+98.665713134" observedRunningTime="2026-04-24 22:31:53.89667066 +0000 UTC m=+99.946526538" watchObservedRunningTime="2026-04-24 22:31:53.897587461 +0000 UTC m=+99.947443332" Apr 24 22:31:54.391113 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:54.391081 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5469c5ef-21d3-4db2-8b90-fe6fca022351-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bzh27\" (UID: 
\"5469c5ef-21d3-4db2-8b90-fe6fca022351\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bzh27" Apr 24 22:31:54.391269 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:54.391212 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 22:31:54.391269 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:54.391267 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5469c5ef-21d3-4db2-8b90-fe6fca022351-samples-operator-tls podName:5469c5ef-21d3-4db2-8b90-fe6fca022351 nodeName:}" failed. No retries permitted until 2026-04-24 22:31:58.391252012 +0000 UTC m=+104.441107872 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/5469c5ef-21d3-4db2-8b90-fe6fca022351-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-bzh27" (UID: "5469c5ef-21d3-4db2-8b90-fe6fca022351") : secret "samples-operator-tls" not found Apr 24 22:31:54.491791 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:54.491759 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2412ae3-7a69-4505-b419-5d96e93a567c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-22dqh\" (UID: \"b2412ae3-7a69-4505-b419-5d96e93a567c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-22dqh" Apr 24 22:31:54.491970 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:54.491803 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-service-ca-bundle\") pod \"router-default-6898f6cd9b-c68zq\" (UID: \"b3432e60-bf25-4ee8-876a-a1ee3c4b5846\") " pod="openshift-ingress/router-default-6898f6cd9b-c68zq" Apr 24 22:31:54.491970 ip-10-0-129-176 
kubenswrapper[2574]: I0424 22:31:54.491835 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-metrics-certs\") pod \"router-default-6898f6cd9b-c68zq\" (UID: \"b3432e60-bf25-4ee8-876a-a1ee3c4b5846\") " pod="openshift-ingress/router-default-6898f6cd9b-c68zq" Apr 24 22:31:54.491970 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:54.491929 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 22:31:54.491970 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:54.491952 2574 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 22:31:54.492116 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:54.491971 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-service-ca-bundle podName:b3432e60-bf25-4ee8-876a-a1ee3c4b5846 nodeName:}" failed. No retries permitted until 2026-04-24 22:31:58.491956646 +0000 UTC m=+104.541812501 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-service-ca-bundle") pod "router-default-6898f6cd9b-c68zq" (UID: "b3432e60-bf25-4ee8-876a-a1ee3c4b5846") : configmap references non-existent config key: service-ca.crt Apr 24 22:31:54.492116 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:54.491988 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2412ae3-7a69-4505-b419-5d96e93a567c-cluster-monitoring-operator-tls podName:b2412ae3-7a69-4505-b419-5d96e93a567c nodeName:}" failed. No retries permitted until 2026-04-24 22:31:58.491981883 +0000 UTC m=+104.541837739 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b2412ae3-7a69-4505-b419-5d96e93a567c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-22dqh" (UID: "b2412ae3-7a69-4505-b419-5d96e93a567c") : secret "cluster-monitoring-operator-tls" not found Apr 24 22:31:54.492116 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:54.491999 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-metrics-certs podName:b3432e60-bf25-4ee8-876a-a1ee3c4b5846 nodeName:}" failed. No retries permitted until 2026-04-24 22:31:58.491993326 +0000 UTC m=+104.541849181 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-metrics-certs") pod "router-default-6898f6cd9b-c68zq" (UID: "b3432e60-bf25-4ee8-876a-a1ee3c4b5846") : secret "router-metrics-certs-default" not found Apr 24 22:31:54.787394 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:54.787370 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-lh47p" Apr 24 22:31:54.851608 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:54.851588 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6nnq9_6aaf0fb0-1f4a-46f7-a1db-82394fa8792a/console-operator/1.log" Apr 24 22:31:54.852044 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:54.852028 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6nnq9_6aaf0fb0-1f4a-46f7-a1db-82394fa8792a/console-operator/0.log" Apr 24 22:31:54.852100 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:54.852069 2574 generic.go:358] "Generic (PLEG): container finished" podID="6aaf0fb0-1f4a-46f7-a1db-82394fa8792a" 
containerID="067d0f65a44efa35df6ff006f90fa48492d82b490472e7b5d81f577e4c2fa004" exitCode=255 Apr 24 22:31:54.852185 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:54.852162 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-6nnq9" event={"ID":"6aaf0fb0-1f4a-46f7-a1db-82394fa8792a","Type":"ContainerDied","Data":"067d0f65a44efa35df6ff006f90fa48492d82b490472e7b5d81f577e4c2fa004"} Apr 24 22:31:54.852242 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:54.852206 2574 scope.go:117] "RemoveContainer" containerID="fc2d6500767d8d484369a15cc5fbb029908214f0f0865b04f60d052e8d1d4f61" Apr 24 22:31:54.852431 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:54.852412 2574 scope.go:117] "RemoveContainer" containerID="067d0f65a44efa35df6ff006f90fa48492d82b490472e7b5d81f577e4c2fa004" Apr 24 22:31:54.852641 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:54.852619 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-6nnq9_openshift-console-operator(6aaf0fb0-1f4a-46f7-a1db-82394fa8792a)\"" pod="openshift-console-operator/console-operator-9d4b6777b-6nnq9" podUID="6aaf0fb0-1f4a-46f7-a1db-82394fa8792a" Apr 24 22:31:55.001361 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:55.001335 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-pwmng"] Apr 24 22:31:55.004599 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:55.004584 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-pwmng" Apr 24 22:31:55.007659 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:55.007643 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-2jqc6\"" Apr 24 22:31:55.013685 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:55.013665 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-pwmng"] Apr 24 22:31:55.096953 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:55.096905 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f899j\" (UniqueName: \"kubernetes.io/projected/7679822c-7212-4ac5-9de4-a4486c413686-kube-api-access-f899j\") pod \"network-check-source-8894fc9bd-pwmng\" (UID: \"7679822c-7212-4ac5-9de4-a4486c413686\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-pwmng" Apr 24 22:31:55.197826 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:55.197798 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f899j\" (UniqueName: \"kubernetes.io/projected/7679822c-7212-4ac5-9de4-a4486c413686-kube-api-access-f899j\") pod \"network-check-source-8894fc9bd-pwmng\" (UID: \"7679822c-7212-4ac5-9de4-a4486c413686\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-pwmng" Apr 24 22:31:55.205730 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:55.205706 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f899j\" (UniqueName: \"kubernetes.io/projected/7679822c-7212-4ac5-9de4-a4486c413686-kube-api-access-f899j\") pod \"network-check-source-8894fc9bd-pwmng\" (UID: \"7679822c-7212-4ac5-9de4-a4486c413686\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-pwmng" Apr 24 22:31:55.312312 ip-10-0-129-176 kubenswrapper[2574]: I0424 
22:31:55.312286 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-pwmng" Apr 24 22:31:55.426713 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:55.426682 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-pwmng"] Apr 24 22:31:55.429254 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:31:55.429223 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7679822c_7212_4ac5_9de4_a4486c413686.slice/crio-86aae059a4c808228aa66f840b02e08cb5595977f9b8ebb8cdca66e012240366 WatchSource:0}: Error finding container 86aae059a4c808228aa66f840b02e08cb5595977f9b8ebb8cdca66e012240366: Status 404 returned error can't find the container with id 86aae059a4c808228aa66f840b02e08cb5595977f9b8ebb8cdca66e012240366 Apr 24 22:31:55.855102 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:55.855075 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6nnq9_6aaf0fb0-1f4a-46f7-a1db-82394fa8792a/console-operator/1.log" Apr 24 22:31:55.855474 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:55.855419 2574 scope.go:117] "RemoveContainer" containerID="067d0f65a44efa35df6ff006f90fa48492d82b490472e7b5d81f577e4c2fa004" Apr 24 22:31:55.855628 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:55.855607 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-6nnq9_openshift-console-operator(6aaf0fb0-1f4a-46f7-a1db-82394fa8792a)\"" pod="openshift-console-operator/console-operator-9d4b6777b-6nnq9" podUID="6aaf0fb0-1f4a-46f7-a1db-82394fa8792a" Apr 24 22:31:55.856379 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:55.856356 2574 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-pwmng" event={"ID":"7679822c-7212-4ac5-9de4-a4486c413686","Type":"ContainerStarted","Data":"83d5cadb8f91e6801947c38c663bcf7d83fb20a6e1591c4909b3857ff9926615"} Apr 24 22:31:55.856481 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:55.856385 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-pwmng" event={"ID":"7679822c-7212-4ac5-9de4-a4486c413686","Type":"ContainerStarted","Data":"86aae059a4c808228aa66f840b02e08cb5595977f9b8ebb8cdca66e012240366"} Apr 24 22:31:55.888681 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:55.888643 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-pwmng" podStartSLOduration=1.888631396 podStartE2EDuration="1.888631396s" podCreationTimestamp="2026-04-24 22:31:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:31:55.88819385 +0000 UTC m=+101.938049741" watchObservedRunningTime="2026-04-24 22:31:55.888631396 +0000 UTC m=+101.938487272" Apr 24 22:31:57.270765 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:57.270736 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-t25xz_31be179c-c441-48a4-8779-593458646c77/dns-node-resolver/0.log" Apr 24 22:31:58.070368 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:58.070343 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-tb9tr_ed31fcf6-85e1-43d1-86ff-16bb6763e3e6/node-ca/0.log" Apr 24 22:31:58.422599 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:58.422492 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5469c5ef-21d3-4db2-8b90-fe6fca022351-samples-operator-tls\") 
pod \"cluster-samples-operator-6dc5bdb6b4-bzh27\" (UID: \"5469c5ef-21d3-4db2-8b90-fe6fca022351\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bzh27" Apr 24 22:31:58.422999 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:58.422637 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 22:31:58.422999 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:58.422702 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5469c5ef-21d3-4db2-8b90-fe6fca022351-samples-operator-tls podName:5469c5ef-21d3-4db2-8b90-fe6fca022351 nodeName:}" failed. No retries permitted until 2026-04-24 22:32:06.422686351 +0000 UTC m=+112.472542206 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/5469c5ef-21d3-4db2-8b90-fe6fca022351-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-bzh27" (UID: "5469c5ef-21d3-4db2-8b90-fe6fca022351") : secret "samples-operator-tls" not found Apr 24 22:31:58.523421 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:58.523387 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-service-ca-bundle\") pod \"router-default-6898f6cd9b-c68zq\" (UID: \"b3432e60-bf25-4ee8-876a-a1ee3c4b5846\") " pod="openshift-ingress/router-default-6898f6cd9b-c68zq" Apr 24 22:31:58.523421 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:31:58.523431 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-metrics-certs\") pod \"router-default-6898f6cd9b-c68zq\" (UID: \"b3432e60-bf25-4ee8-876a-a1ee3c4b5846\") " pod="openshift-ingress/router-default-6898f6cd9b-c68zq" Apr 24 22:31:58.523623 ip-10-0-129-176 
kubenswrapper[2574]: I0424 22:31:58.523511 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2412ae3-7a69-4505-b419-5d96e93a567c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-22dqh\" (UID: \"b2412ae3-7a69-4505-b419-5d96e93a567c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-22dqh" Apr 24 22:31:58.523623 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:58.523588 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-service-ca-bundle podName:b3432e60-bf25-4ee8-876a-a1ee3c4b5846 nodeName:}" failed. No retries permitted until 2026-04-24 22:32:06.523564689 +0000 UTC m=+112.573420568 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-service-ca-bundle") pod "router-default-6898f6cd9b-c68zq" (UID: "b3432e60-bf25-4ee8-876a-a1ee3c4b5846") : configmap references non-existent config key: service-ca.crt Apr 24 22:31:58.523623 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:58.523609 2574 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 22:31:58.523773 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:58.523667 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-metrics-certs podName:b3432e60-bf25-4ee8-876a-a1ee3c4b5846 nodeName:}" failed. No retries permitted until 2026-04-24 22:32:06.523653911 +0000 UTC m=+112.573509770 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-metrics-certs") pod "router-default-6898f6cd9b-c68zq" (UID: "b3432e60-bf25-4ee8-876a-a1ee3c4b5846") : secret "router-metrics-certs-default" not found Apr 24 22:31:58.523773 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:58.523609 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 22:31:58.523773 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:31:58.523713 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2412ae3-7a69-4505-b419-5d96e93a567c-cluster-monitoring-operator-tls podName:b2412ae3-7a69-4505-b419-5d96e93a567c nodeName:}" failed. No retries permitted until 2026-04-24 22:32:06.523702103 +0000 UTC m=+112.573557968 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b2412ae3-7a69-4505-b419-5d96e93a567c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-22dqh" (UID: "b2412ae3-7a69-4505-b419-5d96e93a567c") : secret "cluster-monitoring-operator-tls" not found Apr 24 22:32:01.079349 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:01.079299 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-6nnq9" Apr 24 22:32:01.079349 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:01.079358 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-6nnq9" Apr 24 22:32:01.079856 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:01.079790 2574 scope.go:117] "RemoveContainer" containerID="067d0f65a44efa35df6ff006f90fa48492d82b490472e7b5d81f577e4c2fa004" Apr 24 22:32:01.080033 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:32:01.080013 2574 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-6nnq9_openshift-console-operator(6aaf0fb0-1f4a-46f7-a1db-82394fa8792a)\"" pod="openshift-console-operator/console-operator-9d4b6777b-6nnq9" podUID="6aaf0fb0-1f4a-46f7-a1db-82394fa8792a" Apr 24 22:32:06.488718 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:06.488672 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5469c5ef-21d3-4db2-8b90-fe6fca022351-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bzh27\" (UID: \"5469c5ef-21d3-4db2-8b90-fe6fca022351\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bzh27" Apr 24 22:32:06.491099 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:06.491068 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5469c5ef-21d3-4db2-8b90-fe6fca022351-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bzh27\" (UID: \"5469c5ef-21d3-4db2-8b90-fe6fca022351\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bzh27" Apr 24 22:32:06.556960 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:06.556935 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bzh27" Apr 24 22:32:06.589961 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:06.589934 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2412ae3-7a69-4505-b419-5d96e93a567c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-22dqh\" (UID: \"b2412ae3-7a69-4505-b419-5d96e93a567c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-22dqh" Apr 24 22:32:06.590071 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:06.589972 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-service-ca-bundle\") pod \"router-default-6898f6cd9b-c68zq\" (UID: \"b3432e60-bf25-4ee8-876a-a1ee3c4b5846\") " pod="openshift-ingress/router-default-6898f6cd9b-c68zq" Apr 24 22:32:06.590071 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:06.589996 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-metrics-certs\") pod \"router-default-6898f6cd9b-c68zq\" (UID: \"b3432e60-bf25-4ee8-876a-a1ee3c4b5846\") " pod="openshift-ingress/router-default-6898f6cd9b-c68zq" Apr 24 22:32:06.590181 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:32:06.590089 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-service-ca-bundle podName:b3432e60-bf25-4ee8-876a-a1ee3c4b5846 nodeName:}" failed. No retries permitted until 2026-04-24 22:32:22.590075852 +0000 UTC m=+128.639931707 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-service-ca-bundle") pod "router-default-6898f6cd9b-c68zq" (UID: "b3432e60-bf25-4ee8-876a-a1ee3c4b5846") : configmap references non-existent config key: service-ca.crt Apr 24 22:32:06.590181 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:32:06.590106 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 22:32:06.590299 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:32:06.590207 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2412ae3-7a69-4505-b419-5d96e93a567c-cluster-monitoring-operator-tls podName:b2412ae3-7a69-4505-b419-5d96e93a567c nodeName:}" failed. No retries permitted until 2026-04-24 22:32:22.590188586 +0000 UTC m=+128.640044441 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b2412ae3-7a69-4505-b419-5d96e93a567c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-22dqh" (UID: "b2412ae3-7a69-4505-b419-5d96e93a567c") : secret "cluster-monitoring-operator-tls" not found Apr 24 22:32:06.592384 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:06.592349 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-metrics-certs\") pod \"router-default-6898f6cd9b-c68zq\" (UID: \"b3432e60-bf25-4ee8-876a-a1ee3c4b5846\") " pod="openshift-ingress/router-default-6898f6cd9b-c68zq" Apr 24 22:32:06.672503 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:06.672472 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bzh27"] Apr 24 22:32:06.878808 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:06.878778 2574 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bzh27" event={"ID":"5469c5ef-21d3-4db2-8b90-fe6fca022351","Type":"ContainerStarted","Data":"bb525041efed313337a4554e85b504a1944751e0d07027c576ee95a4424941d7"} Apr 24 22:32:08.885553 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:08.885516 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bzh27" event={"ID":"5469c5ef-21d3-4db2-8b90-fe6fca022351","Type":"ContainerStarted","Data":"da1a6316d0fc6712947443a60fc2baea426b4b1c832f9e617017a87013162dc4"} Apr 24 22:32:08.885553 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:08.885553 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bzh27" event={"ID":"5469c5ef-21d3-4db2-8b90-fe6fca022351","Type":"ContainerStarted","Data":"e98a6810ff130db3f2b6b0ec660201057ae1b436493486823f0c8eee5a52b316"} Apr 24 22:32:08.905485 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:08.905432 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bzh27" podStartSLOduration=17.252813684 podStartE2EDuration="18.905418461s" podCreationTimestamp="2026-04-24 22:31:50 +0000 UTC" firstStartedPulling="2026-04-24 22:32:06.711062472 +0000 UTC m=+112.760918327" lastFinishedPulling="2026-04-24 22:32:08.363667245 +0000 UTC m=+114.413523104" observedRunningTime="2026-04-24 22:32:08.905159221 +0000 UTC m=+114.955015099" watchObservedRunningTime="2026-04-24 22:32:08.905418461 +0000 UTC m=+114.955274339" Apr 24 22:32:11.531665 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:11.531631 2574 scope.go:117] "RemoveContainer" containerID="067d0f65a44efa35df6ff006f90fa48492d82b490472e7b5d81f577e4c2fa004" Apr 24 22:32:11.895248 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:11.895189 2574 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6nnq9_6aaf0fb0-1f4a-46f7-a1db-82394fa8792a/console-operator/2.log" Apr 24 22:32:11.895568 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:11.895554 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6nnq9_6aaf0fb0-1f4a-46f7-a1db-82394fa8792a/console-operator/1.log" Apr 24 22:32:11.895643 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:11.895586 2574 generic.go:358] "Generic (PLEG): container finished" podID="6aaf0fb0-1f4a-46f7-a1db-82394fa8792a" containerID="10c85ade7a91bac987591bb4320e49ddec708909fbd021b53f90649fdd43487d" exitCode=255 Apr 24 22:32:11.895643 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:11.895622 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-6nnq9" event={"ID":"6aaf0fb0-1f4a-46f7-a1db-82394fa8792a","Type":"ContainerDied","Data":"10c85ade7a91bac987591bb4320e49ddec708909fbd021b53f90649fdd43487d"} Apr 24 22:32:11.895714 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:11.895651 2574 scope.go:117] "RemoveContainer" containerID="067d0f65a44efa35df6ff006f90fa48492d82b490472e7b5d81f577e4c2fa004" Apr 24 22:32:11.895953 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:11.895931 2574 scope.go:117] "RemoveContainer" containerID="10c85ade7a91bac987591bb4320e49ddec708909fbd021b53f90649fdd43487d" Apr 24 22:32:11.896143 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:32:11.896120 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-6nnq9_openshift-console-operator(6aaf0fb0-1f4a-46f7-a1db-82394fa8792a)\"" pod="openshift-console-operator/console-operator-9d4b6777b-6nnq9" podUID="6aaf0fb0-1f4a-46f7-a1db-82394fa8792a" Apr 24 22:32:12.899519 
ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:12.899489 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6nnq9_6aaf0fb0-1f4a-46f7-a1db-82394fa8792a/console-operator/2.log" Apr 24 22:32:21.079060 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:21.079009 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-6nnq9" Apr 24 22:32:21.079060 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:21.079062 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-6nnq9" Apr 24 22:32:21.079569 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:21.079414 2574 scope.go:117] "RemoveContainer" containerID="10c85ade7a91bac987591bb4320e49ddec708909fbd021b53f90649fdd43487d" Apr 24 22:32:21.079614 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:32:21.079595 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-6nnq9_openshift-console-operator(6aaf0fb0-1f4a-46f7-a1db-82394fa8792a)\"" pod="openshift-console-operator/console-operator-9d4b6777b-6nnq9" podUID="6aaf0fb0-1f4a-46f7-a1db-82394fa8792a" Apr 24 22:32:22.184040 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.184004 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-846bc59fcf-t65jg"] Apr 24 22:32:22.188577 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.188555 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-846bc59fcf-t65jg" Apr 24 22:32:22.191479 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.191460 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 22:32:22.191621 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.191603 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 22:32:22.191675 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.191661 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-xzwft\"" Apr 24 22:32:22.192498 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.192473 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 22:32:22.200980 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.200951 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-846bc59fcf-t65jg"] Apr 24 22:32:22.203247 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.203224 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 22:32:22.233594 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.233567 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-xs49w"] Apr 24 22:32:22.236449 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.236431 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xs49w" Apr 24 22:32:22.239374 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.239351 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 22:32:22.239567 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.239553 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 22:32:22.239743 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.239728 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-dwrxh\"" Apr 24 22:32:22.240017 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.239995 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 22:32:22.240956 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.240941 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 22:32:22.262863 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.262837 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xs49w"] Apr 24 22:32:22.302059 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.302034 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9bb13040-efc6-4698-8a0a-b1270f5d0998-trusted-ca\") pod \"image-registry-846bc59fcf-t65jg\" (UID: \"9bb13040-efc6-4698-8a0a-b1270f5d0998\") " pod="openshift-image-registry/image-registry-846bc59fcf-t65jg" Apr 24 22:32:22.302154 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.302073 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-ptbms\" (UniqueName: \"kubernetes.io/projected/9bb13040-efc6-4698-8a0a-b1270f5d0998-kube-api-access-ptbms\") pod \"image-registry-846bc59fcf-t65jg\" (UID: \"9bb13040-efc6-4698-8a0a-b1270f5d0998\") " pod="openshift-image-registry/image-registry-846bc59fcf-t65jg" Apr 24 22:32:22.302154 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.302111 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9bb13040-efc6-4698-8a0a-b1270f5d0998-installation-pull-secrets\") pod \"image-registry-846bc59fcf-t65jg\" (UID: \"9bb13040-efc6-4698-8a0a-b1270f5d0998\") " pod="openshift-image-registry/image-registry-846bc59fcf-t65jg" Apr 24 22:32:22.302154 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.302135 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9bb13040-efc6-4698-8a0a-b1270f5d0998-bound-sa-token\") pod \"image-registry-846bc59fcf-t65jg\" (UID: \"9bb13040-efc6-4698-8a0a-b1270f5d0998\") " pod="openshift-image-registry/image-registry-846bc59fcf-t65jg" Apr 24 22:32:22.302301 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.302166 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9bb13040-efc6-4698-8a0a-b1270f5d0998-registry-certificates\") pod \"image-registry-846bc59fcf-t65jg\" (UID: \"9bb13040-efc6-4698-8a0a-b1270f5d0998\") " pod="openshift-image-registry/image-registry-846bc59fcf-t65jg" Apr 24 22:32:22.302301 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.302199 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9bb13040-efc6-4698-8a0a-b1270f5d0998-image-registry-private-configuration\") pod 
\"image-registry-846bc59fcf-t65jg\" (UID: \"9bb13040-efc6-4698-8a0a-b1270f5d0998\") " pod="openshift-image-registry/image-registry-846bc59fcf-t65jg" Apr 24 22:32:22.302301 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.302247 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9bb13040-efc6-4698-8a0a-b1270f5d0998-registry-tls\") pod \"image-registry-846bc59fcf-t65jg\" (UID: \"9bb13040-efc6-4698-8a0a-b1270f5d0998\") " pod="openshift-image-registry/image-registry-846bc59fcf-t65jg" Apr 24 22:32:22.302301 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.302285 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9bb13040-efc6-4698-8a0a-b1270f5d0998-ca-trust-extracted\") pod \"image-registry-846bc59fcf-t65jg\" (UID: \"9bb13040-efc6-4698-8a0a-b1270f5d0998\") " pod="openshift-image-registry/image-registry-846bc59fcf-t65jg" Apr 24 22:32:22.402732 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.402709 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9bb13040-efc6-4698-8a0a-b1270f5d0998-installation-pull-secrets\") pod \"image-registry-846bc59fcf-t65jg\" (UID: \"9bb13040-efc6-4698-8a0a-b1270f5d0998\") " pod="openshift-image-registry/image-registry-846bc59fcf-t65jg" Apr 24 22:32:22.402817 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.402734 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9bb13040-efc6-4698-8a0a-b1270f5d0998-bound-sa-token\") pod \"image-registry-846bc59fcf-t65jg\" (UID: \"9bb13040-efc6-4698-8a0a-b1270f5d0998\") " pod="openshift-image-registry/image-registry-846bc59fcf-t65jg" Apr 24 22:32:22.402817 ip-10-0-129-176 kubenswrapper[2574]: I0424 
22:32:22.402759 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c7c3582c-44dd-492c-b4ba-7bd36140280c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xs49w\" (UID: \"c7c3582c-44dd-492c-b4ba-7bd36140280c\") " pod="openshift-insights/insights-runtime-extractor-xs49w"
Apr 24 22:32:22.402817 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.402797 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8njl\" (UniqueName: \"kubernetes.io/projected/c7c3582c-44dd-492c-b4ba-7bd36140280c-kube-api-access-r8njl\") pod \"insights-runtime-extractor-xs49w\" (UID: \"c7c3582c-44dd-492c-b4ba-7bd36140280c\") " pod="openshift-insights/insights-runtime-extractor-xs49w"
Apr 24 22:32:22.402981 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.402831 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9bb13040-efc6-4698-8a0a-b1270f5d0998-registry-certificates\") pod \"image-registry-846bc59fcf-t65jg\" (UID: \"9bb13040-efc6-4698-8a0a-b1270f5d0998\") " pod="openshift-image-registry/image-registry-846bc59fcf-t65jg"
Apr 24 22:32:22.402981 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.402858 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9bb13040-efc6-4698-8a0a-b1270f5d0998-image-registry-private-configuration\") pod \"image-registry-846bc59fcf-t65jg\" (UID: \"9bb13040-efc6-4698-8a0a-b1270f5d0998\") " pod="openshift-image-registry/image-registry-846bc59fcf-t65jg"
Apr 24 22:32:22.402981 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.402915 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9bb13040-efc6-4698-8a0a-b1270f5d0998-registry-tls\") pod \"image-registry-846bc59fcf-t65jg\" (UID: \"9bb13040-efc6-4698-8a0a-b1270f5d0998\") " pod="openshift-image-registry/image-registry-846bc59fcf-t65jg"
Apr 24 22:32:22.403130 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.403019 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9bb13040-efc6-4698-8a0a-b1270f5d0998-ca-trust-extracted\") pod \"image-registry-846bc59fcf-t65jg\" (UID: \"9bb13040-efc6-4698-8a0a-b1270f5d0998\") " pod="openshift-image-registry/image-registry-846bc59fcf-t65jg"
Apr 24 22:32:22.403130 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.403069 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c7c3582c-44dd-492c-b4ba-7bd36140280c-data-volume\") pod \"insights-runtime-extractor-xs49w\" (UID: \"c7c3582c-44dd-492c-b4ba-7bd36140280c\") " pod="openshift-insights/insights-runtime-extractor-xs49w"
Apr 24 22:32:22.403230 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.403133 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9bb13040-efc6-4698-8a0a-b1270f5d0998-trusted-ca\") pod \"image-registry-846bc59fcf-t65jg\" (UID: \"9bb13040-efc6-4698-8a0a-b1270f5d0998\") " pod="openshift-image-registry/image-registry-846bc59fcf-t65jg"
Apr 24 22:32:22.403230 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.403188 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c7c3582c-44dd-492c-b4ba-7bd36140280c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xs49w\" (UID: \"c7c3582c-44dd-492c-b4ba-7bd36140280c\") " pod="openshift-insights/insights-runtime-extractor-xs49w"
Apr 24 22:32:22.403230 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.403221 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ptbms\" (UniqueName: \"kubernetes.io/projected/9bb13040-efc6-4698-8a0a-b1270f5d0998-kube-api-access-ptbms\") pod \"image-registry-846bc59fcf-t65jg\" (UID: \"9bb13040-efc6-4698-8a0a-b1270f5d0998\") " pod="openshift-image-registry/image-registry-846bc59fcf-t65jg"
Apr 24 22:32:22.403380 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.403256 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c7c3582c-44dd-492c-b4ba-7bd36140280c-crio-socket\") pod \"insights-runtime-extractor-xs49w\" (UID: \"c7c3582c-44dd-492c-b4ba-7bd36140280c\") " pod="openshift-insights/insights-runtime-extractor-xs49w"
Apr 24 22:32:22.403434 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.403413 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9bb13040-efc6-4698-8a0a-b1270f5d0998-ca-trust-extracted\") pod \"image-registry-846bc59fcf-t65jg\" (UID: \"9bb13040-efc6-4698-8a0a-b1270f5d0998\") " pod="openshift-image-registry/image-registry-846bc59fcf-t65jg"
Apr 24 22:32:22.403757 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.403736 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9bb13040-efc6-4698-8a0a-b1270f5d0998-registry-certificates\") pod \"image-registry-846bc59fcf-t65jg\" (UID: \"9bb13040-efc6-4698-8a0a-b1270f5d0998\") " pod="openshift-image-registry/image-registry-846bc59fcf-t65jg"
Apr 24 22:32:22.404032 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.404012 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9bb13040-efc6-4698-8a0a-b1270f5d0998-trusted-ca\") pod \"image-registry-846bc59fcf-t65jg\" (UID: \"9bb13040-efc6-4698-8a0a-b1270f5d0998\") " pod="openshift-image-registry/image-registry-846bc59fcf-t65jg"
Apr 24 22:32:22.405370 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.405351 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9bb13040-efc6-4698-8a0a-b1270f5d0998-registry-tls\") pod \"image-registry-846bc59fcf-t65jg\" (UID: \"9bb13040-efc6-4698-8a0a-b1270f5d0998\") " pod="openshift-image-registry/image-registry-846bc59fcf-t65jg"
Apr 24 22:32:22.405814 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.405792 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9bb13040-efc6-4698-8a0a-b1270f5d0998-image-registry-private-configuration\") pod \"image-registry-846bc59fcf-t65jg\" (UID: \"9bb13040-efc6-4698-8a0a-b1270f5d0998\") " pod="openshift-image-registry/image-registry-846bc59fcf-t65jg"
Apr 24 22:32:22.405912 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.405793 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9bb13040-efc6-4698-8a0a-b1270f5d0998-installation-pull-secrets\") pod \"image-registry-846bc59fcf-t65jg\" (UID: \"9bb13040-efc6-4698-8a0a-b1270f5d0998\") " pod="openshift-image-registry/image-registry-846bc59fcf-t65jg"
Apr 24 22:32:22.417081 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.417057 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9bb13040-efc6-4698-8a0a-b1270f5d0998-bound-sa-token\") pod \"image-registry-846bc59fcf-t65jg\" (UID: \"9bb13040-efc6-4698-8a0a-b1270f5d0998\") " pod="openshift-image-registry/image-registry-846bc59fcf-t65jg"
Apr 24 22:32:22.417305 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.417284 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptbms\" (UniqueName: \"kubernetes.io/projected/9bb13040-efc6-4698-8a0a-b1270f5d0998-kube-api-access-ptbms\") pod \"image-registry-846bc59fcf-t65jg\" (UID: \"9bb13040-efc6-4698-8a0a-b1270f5d0998\") " pod="openshift-image-registry/image-registry-846bc59fcf-t65jg"
Apr 24 22:32:22.503028 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.502973 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-846bc59fcf-t65jg"
Apr 24 22:32:22.503868 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.503548 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c7c3582c-44dd-492c-b4ba-7bd36140280c-data-volume\") pod \"insights-runtime-extractor-xs49w\" (UID: \"c7c3582c-44dd-492c-b4ba-7bd36140280c\") " pod="openshift-insights/insights-runtime-extractor-xs49w"
Apr 24 22:32:22.503868 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.503598 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c7c3582c-44dd-492c-b4ba-7bd36140280c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xs49w\" (UID: \"c7c3582c-44dd-492c-b4ba-7bd36140280c\") " pod="openshift-insights/insights-runtime-extractor-xs49w"
Apr 24 22:32:22.503868 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.503622 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c7c3582c-44dd-492c-b4ba-7bd36140280c-crio-socket\") pod \"insights-runtime-extractor-xs49w\" (UID: \"c7c3582c-44dd-492c-b4ba-7bd36140280c\") " pod="openshift-insights/insights-runtime-extractor-xs49w"
Apr 24 22:32:22.503868 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.503652 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c7c3582c-44dd-492c-b4ba-7bd36140280c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xs49w\" (UID: \"c7c3582c-44dd-492c-b4ba-7bd36140280c\") " pod="openshift-insights/insights-runtime-extractor-xs49w"
Apr 24 22:32:22.503868 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.503682 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r8njl\" (UniqueName: \"kubernetes.io/projected/c7c3582c-44dd-492c-b4ba-7bd36140280c-kube-api-access-r8njl\") pod \"insights-runtime-extractor-xs49w\" (UID: \"c7c3582c-44dd-492c-b4ba-7bd36140280c\") " pod="openshift-insights/insights-runtime-extractor-xs49w"
Apr 24 22:32:22.503868 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.503759 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c7c3582c-44dd-492c-b4ba-7bd36140280c-crio-socket\") pod \"insights-runtime-extractor-xs49w\" (UID: \"c7c3582c-44dd-492c-b4ba-7bd36140280c\") " pod="openshift-insights/insights-runtime-extractor-xs49w"
Apr 24 22:32:22.504608 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.504585 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c7c3582c-44dd-492c-b4ba-7bd36140280c-data-volume\") pod \"insights-runtime-extractor-xs49w\" (UID: \"c7c3582c-44dd-492c-b4ba-7bd36140280c\") " pod="openshift-insights/insights-runtime-extractor-xs49w"
Apr 24 22:32:22.504777 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.504759 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c7c3582c-44dd-492c-b4ba-7bd36140280c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xs49w\" (UID: \"c7c3582c-44dd-492c-b4ba-7bd36140280c\") " pod="openshift-insights/insights-runtime-extractor-xs49w"
Apr 24 22:32:22.506285 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.506267 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c7c3582c-44dd-492c-b4ba-7bd36140280c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xs49w\" (UID: \"c7c3582c-44dd-492c-b4ba-7bd36140280c\") " pod="openshift-insights/insights-runtime-extractor-xs49w"
Apr 24 22:32:22.512098 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.512077 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8njl\" (UniqueName: \"kubernetes.io/projected/c7c3582c-44dd-492c-b4ba-7bd36140280c-kube-api-access-r8njl\") pod \"insights-runtime-extractor-xs49w\" (UID: \"c7c3582c-44dd-492c-b4ba-7bd36140280c\") " pod="openshift-insights/insights-runtime-extractor-xs49w"
Apr 24 22:32:22.544966 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.544937 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xs49w"
Apr 24 22:32:22.604582 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.604482 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-service-ca-bundle\") pod \"router-default-6898f6cd9b-c68zq\" (UID: \"b3432e60-bf25-4ee8-876a-a1ee3c4b5846\") " pod="openshift-ingress/router-default-6898f6cd9b-c68zq"
Apr 24 22:32:22.604582 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.604574 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2412ae3-7a69-4505-b419-5d96e93a567c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-22dqh\" (UID: \"b2412ae3-7a69-4505-b419-5d96e93a567c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-22dqh"
Apr 24 22:32:22.605131 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.605099 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3432e60-bf25-4ee8-876a-a1ee3c4b5846-service-ca-bundle\") pod \"router-default-6898f6cd9b-c68zq\" (UID: \"b3432e60-bf25-4ee8-876a-a1ee3c4b5846\") " pod="openshift-ingress/router-default-6898f6cd9b-c68zq"
Apr 24 22:32:22.607148 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.607125 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2412ae3-7a69-4505-b419-5d96e93a567c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-22dqh\" (UID: \"b2412ae3-7a69-4505-b419-5d96e93a567c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-22dqh"
Apr 24 22:32:22.623773 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.623752 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-846bc59fcf-t65jg"]
Apr 24 22:32:22.627094 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:32:22.627071 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bb13040_efc6_4698_8a0a_b1270f5d0998.slice/crio-114aa37de43c8de742f0ca8eab1dca8ed266c3d1bd820c0729389ad1a8def721 WatchSource:0}: Error finding container 114aa37de43c8de742f0ca8eab1dca8ed266c3d1bd820c0729389ad1a8def721: Status 404 returned error can't find the container with id 114aa37de43c8de742f0ca8eab1dca8ed266c3d1bd820c0729389ad1a8def721
Apr 24 22:32:22.683130 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.683041 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xs49w"]
Apr 24 22:32:22.685339 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:32:22.685316 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7c3582c_44dd_492c_b4ba_7bd36140280c.slice/crio-4d6ea1a8a7867301b02cfbc90816df53612a446a8cd52a4730b8316f02f03343 WatchSource:0}: Error finding container 4d6ea1a8a7867301b02cfbc90816df53612a446a8cd52a4730b8316f02f03343: Status 404 returned error can't find the container with id 4d6ea1a8a7867301b02cfbc90816df53612a446a8cd52a4730b8316f02f03343
Apr 24 22:32:22.873392 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.873364 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-t6zmz\""
Apr 24 22:32:22.881644 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.881617 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-6898f6cd9b-c68zq"
Apr 24 22:32:22.885860 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.885840 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-8bzn5\""
Apr 24 22:32:22.894160 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.894143 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-22dqh"
Apr 24 22:32:22.926649 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.926608 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xs49w" event={"ID":"c7c3582c-44dd-492c-b4ba-7bd36140280c","Type":"ContainerStarted","Data":"5fc70b3919d01e72c510e04ec4225b19c57366d1a9c30f6612738217da5561a4"}
Apr 24 22:32:22.926840 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.926659 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xs49w" event={"ID":"c7c3582c-44dd-492c-b4ba-7bd36140280c","Type":"ContainerStarted","Data":"4d6ea1a8a7867301b02cfbc90816df53612a446a8cd52a4730b8316f02f03343"}
Apr 24 22:32:22.928898 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.928849 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-846bc59fcf-t65jg" event={"ID":"9bb13040-efc6-4698-8a0a-b1270f5d0998","Type":"ContainerStarted","Data":"d66a3e8e7e7afeb1cf0140440e06a575a19224f258a32237f6fdddbaa00f6290"}
Apr 24 22:32:22.929051 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.928912 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-846bc59fcf-t65jg" event={"ID":"9bb13040-efc6-4698-8a0a-b1270f5d0998","Type":"ContainerStarted","Data":"114aa37de43c8de742f0ca8eab1dca8ed266c3d1bd820c0729389ad1a8def721"}
Apr 24 22:32:22.929745 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.929720 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-846bc59fcf-t65jg"
Apr 24 22:32:22.952814 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:22.952728 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-846bc59fcf-t65jg" podStartSLOduration=0.952707928 podStartE2EDuration="952.707928ms" podCreationTimestamp="2026-04-24 22:32:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:32:22.951945286 +0000 UTC m=+129.001801165" watchObservedRunningTime="2026-04-24 22:32:22.952707928 +0000 UTC m=+129.002563808"
Apr 24 22:32:23.014327 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:23.014303 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-6898f6cd9b-c68zq"]
Apr 24 22:32:23.016970 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:32:23.016940 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3432e60_bf25_4ee8_876a_a1ee3c4b5846.slice/crio-d2d4b6410cf5917f6a06290e56d97206d7b3f331617bb29a46220b5b5250a703 WatchSource:0}: Error finding container d2d4b6410cf5917f6a06290e56d97206d7b3f331617bb29a46220b5b5250a703: Status 404 returned error can't find the container with id d2d4b6410cf5917f6a06290e56d97206d7b3f331617bb29a46220b5b5250a703
Apr 24 22:32:23.030973 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:23.030952 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-22dqh"]
Apr 24 22:32:23.033422 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:32:23.033401 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2412ae3_7a69_4505_b419_5d96e93a567c.slice/crio-0c541e586dbcf4b411e5f6c56031b58327024990b35fd1776a1f6e162d5d26e6 WatchSource:0}: Error finding container 0c541e586dbcf4b411e5f6c56031b58327024990b35fd1776a1f6e162d5d26e6: Status 404 returned error can't find the container with id 0c541e586dbcf4b411e5f6c56031b58327024990b35fd1776a1f6e162d5d26e6
Apr 24 22:32:23.933303 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:23.933267 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xs49w" event={"ID":"c7c3582c-44dd-492c-b4ba-7bd36140280c","Type":"ContainerStarted","Data":"d6415a718a42c72ed3987153784d5efadd041d00e82380e6a6035312f4849406"}
Apr 24 22:32:23.934282 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:23.934251 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-22dqh" event={"ID":"b2412ae3-7a69-4505-b419-5d96e93a567c","Type":"ContainerStarted","Data":"0c541e586dbcf4b411e5f6c56031b58327024990b35fd1776a1f6e162d5d26e6"}
Apr 24 22:32:23.935449 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:23.935421 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-6898f6cd9b-c68zq" event={"ID":"b3432e60-bf25-4ee8-876a-a1ee3c4b5846","Type":"ContainerStarted","Data":"dba89df3681fa8d45b366c9ef62e18bafa1630fe9748a4d93ad8d6b592f1b074"}
Apr 24 22:32:23.935539 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:23.935452 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-6898f6cd9b-c68zq" event={"ID":"b3432e60-bf25-4ee8-876a-a1ee3c4b5846","Type":"ContainerStarted","Data":"d2d4b6410cf5917f6a06290e56d97206d7b3f331617bb29a46220b5b5250a703"}
Apr 24 22:32:23.956493 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:23.956456 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-6898f6cd9b-c68zq" podStartSLOduration=33.95644658 podStartE2EDuration="33.95644658s" podCreationTimestamp="2026-04-24 22:31:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:32:23.955036179 +0000 UTC m=+130.004892053" watchObservedRunningTime="2026-04-24 22:32:23.95644658 +0000 UTC m=+130.006302501"
Apr 24 22:32:24.319421 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:24.319106 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fd77a426-e63b-4027-97b7-e9893fd72601-metrics-certs\") pod \"network-metrics-daemon-n8ntw\" (UID: \"fd77a426-e63b-4027-97b7-e9893fd72601\") " pod="openshift-multus/network-metrics-daemon-n8ntw"
Apr 24 22:32:24.321775 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:24.321719 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fd77a426-e63b-4027-97b7-e9893fd72601-metrics-certs\") pod \"network-metrics-daemon-n8ntw\" (UID: \"fd77a426-e63b-4027-97b7-e9893fd72601\") " pod="openshift-multus/network-metrics-daemon-n8ntw"
Apr 24 22:32:24.549283 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:24.549255 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7dgz7\""
Apr 24 22:32:24.557636 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:24.557611 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n8ntw"
Apr 24 22:32:24.882401 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:24.882365 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-6898f6cd9b-c68zq"
Apr 24 22:32:24.885164 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:24.885135 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-6898f6cd9b-c68zq"
Apr 24 22:32:24.938406 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:24.938380 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-6898f6cd9b-c68zq"
Apr 24 22:32:24.939562 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:24.939543 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-6898f6cd9b-c68zq"
Apr 24 22:32:25.545775 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:25.545741 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-n8ntw"]
Apr 24 22:32:25.548955 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:32:25.548929 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd77a426_e63b_4027_97b7_e9893fd72601.slice/crio-42fcb84fa7e5984bb277b4459d4afe6b7a7c956721868d6913fbbd62c4accdfb WatchSource:0}: Error finding container 42fcb84fa7e5984bb277b4459d4afe6b7a7c956721868d6913fbbd62c4accdfb: Status 404 returned error can't find the container with id 42fcb84fa7e5984bb277b4459d4afe6b7a7c956721868d6913fbbd62c4accdfb
Apr 24 22:32:25.945717 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:25.944781 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t4xmx"]
Apr 24 22:32:25.947971 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:25.947944 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xs49w" event={"ID":"c7c3582c-44dd-492c-b4ba-7bd36140280c","Type":"ContainerStarted","Data":"150836317fee9c9f080247d7fb685eb478f97a9d15ed99b1c9cd9b54aaf54bda"}
Apr 24 22:32:25.948106 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:25.947974 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-22dqh" event={"ID":"b2412ae3-7a69-4505-b419-5d96e93a567c","Type":"ContainerStarted","Data":"24c39c7f1aef38e17acd23c99da6ef616159897eade2d9dcdc85c0b0d48f7b90"}
Apr 24 22:32:25.948106 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:25.948079 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t4xmx"
Apr 24 22:32:25.950712 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:25.950678 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-dfgnd\""
Apr 24 22:32:25.951019 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:25.950999 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 24 22:32:25.954782 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:25.954757 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-n8ntw" event={"ID":"fd77a426-e63b-4027-97b7-e9893fd72601","Type":"ContainerStarted","Data":"42fcb84fa7e5984bb277b4459d4afe6b7a7c956721868d6913fbbd62c4accdfb"}
Apr 24 22:32:25.964271 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:25.964249 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t4xmx"]
Apr 24 22:32:25.972746 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:25.972708 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-xs49w" podStartSLOduration=1.2864733369999999 podStartE2EDuration="3.972694125s" podCreationTimestamp="2026-04-24 22:32:22 +0000 UTC" firstStartedPulling="2026-04-24 22:32:22.745280093 +0000 UTC m=+128.795135947" lastFinishedPulling="2026-04-24 22:32:25.431500865 +0000 UTC m=+131.481356735" observedRunningTime="2026-04-24 22:32:25.971187603 +0000 UTC m=+132.021043481" watchObservedRunningTime="2026-04-24 22:32:25.972694125 +0000 UTC m=+132.022550030"
Apr 24 22:32:26.008321 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:26.008278 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-22dqh" podStartSLOduration=33.61616078 podStartE2EDuration="36.008264403s" podCreationTimestamp="2026-04-24 22:31:50 +0000 UTC" firstStartedPulling="2026-04-24 22:32:23.035719114 +0000 UTC m=+129.085574970" lastFinishedPulling="2026-04-24 22:32:25.42782272 +0000 UTC m=+131.477678593" observedRunningTime="2026-04-24 22:32:26.007787762 +0000 UTC m=+132.057643666" watchObservedRunningTime="2026-04-24 22:32:26.008264403 +0000 UTC m=+132.058120281"
Apr 24 22:32:26.032759 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:26.032734 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/7c297fbf-2d2d-4390-9b2e-c7ef5d4a38bd-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-t4xmx\" (UID: \"7c297fbf-2d2d-4390-9b2e-c7ef5d4a38bd\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t4xmx"
Apr 24 22:32:26.133353 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:26.133325 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/7c297fbf-2d2d-4390-9b2e-c7ef5d4a38bd-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-t4xmx\" (UID: \"7c297fbf-2d2d-4390-9b2e-c7ef5d4a38bd\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t4xmx"
Apr 24 22:32:26.135929 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:26.135895 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/7c297fbf-2d2d-4390-9b2e-c7ef5d4a38bd-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-t4xmx\" (UID: \"7c297fbf-2d2d-4390-9b2e-c7ef5d4a38bd\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t4xmx"
Apr 24 22:32:26.263208 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:26.263125 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t4xmx"
Apr 24 22:32:26.601072 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:26.601047 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t4xmx"]
Apr 24 22:32:26.603807 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:32:26.603775 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c297fbf_2d2d_4390_9b2e_c7ef5d4a38bd.slice/crio-6d13db54a38f6f88d692b01dba5777cd4bcc7f8a16582ce3148ac0e112fbdf36 WatchSource:0}: Error finding container 6d13db54a38f6f88d692b01dba5777cd4bcc7f8a16582ce3148ac0e112fbdf36: Status 404 returned error can't find the container with id 6d13db54a38f6f88d692b01dba5777cd4bcc7f8a16582ce3148ac0e112fbdf36
Apr 24 22:32:26.958511 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:26.958431 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-n8ntw" event={"ID":"fd77a426-e63b-4027-97b7-e9893fd72601","Type":"ContainerStarted","Data":"bfc488b84f1544d0b95238b48c66735a232c24753704cdeb749c140d4aafe2b0"}
Apr 24 22:32:26.958511 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:26.958473 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-n8ntw" event={"ID":"fd77a426-e63b-4027-97b7-e9893fd72601","Type":"ContainerStarted","Data":"42bb3c2337af744e236fcf46c462a917864d6cd0e3b38bd8fe32c454c2c475e8"}
Apr 24 22:32:26.959472 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:26.959448 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t4xmx" event={"ID":"7c297fbf-2d2d-4390-9b2e-c7ef5d4a38bd","Type":"ContainerStarted","Data":"6d13db54a38f6f88d692b01dba5777cd4bcc7f8a16582ce3148ac0e112fbdf36"}
Apr 24 22:32:26.975322 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:26.975286 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-n8ntw" podStartSLOduration=131.993734687 podStartE2EDuration="2m12.975275596s" podCreationTimestamp="2026-04-24 22:30:14 +0000 UTC" firstStartedPulling="2026-04-24 22:32:25.550710599 +0000 UTC m=+131.600566453" lastFinishedPulling="2026-04-24 22:32:26.532251492 +0000 UTC m=+132.582107362" observedRunningTime="2026-04-24 22:32:26.974276149 +0000 UTC m=+133.024132028" watchObservedRunningTime="2026-04-24 22:32:26.975275596 +0000 UTC m=+133.025131472"
Apr 24 22:32:27.963458 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:27.963374 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t4xmx" event={"ID":"7c297fbf-2d2d-4390-9b2e-c7ef5d4a38bd","Type":"ContainerStarted","Data":"3369358b37fafac7633b82f14123bccb7af65473b146c34e12852e1b3b57efc4"}
Apr 24 22:32:27.979215 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:27.979141 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t4xmx" podStartSLOduration=2.041826524 podStartE2EDuration="2.97912632s" podCreationTimestamp="2026-04-24 22:32:25 +0000 UTC" firstStartedPulling="2026-04-24 22:32:26.606181171 +0000 UTC m=+132.656037030" lastFinishedPulling="2026-04-24 22:32:27.543480972 +0000 UTC m=+133.593336826" observedRunningTime="2026-04-24 22:32:27.978193569 +0000 UTC m=+134.028049448" watchObservedRunningTime="2026-04-24 22:32:27.97912632 +0000 UTC m=+134.028982197"
Apr 24 22:32:28.966815 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:28.966777 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t4xmx"
Apr 24 22:32:28.971854 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:28.971829 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t4xmx"
Apr 24 22:32:30.007138 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:30.007105 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-c49zn"]
Apr 24 22:32:30.011388 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:30.011369 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-c49zn"
Apr 24 22:32:30.014142 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:30.014119 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 24 22:32:30.014236 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:30.014165 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 24 22:32:30.015273 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:30.015260 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 24 22:32:30.015358 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:30.015323 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-jmh8k\""
Apr 24 22:32:30.019520 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:30.019497 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-c49zn"]
Apr 24 22:32:30.164796 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:30.164760 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd8gp\" (UniqueName: \"kubernetes.io/projected/dc88cfaa-dbaf-4194-a0fe-e1ca69e89e4e-kube-api-access-fd8gp\") pod \"prometheus-operator-5676c8c784-c49zn\" (UID: \"dc88cfaa-dbaf-4194-a0fe-e1ca69e89e4e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-c49zn"
Apr 24 22:32:30.164796 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:30.164795 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dc88cfaa-dbaf-4194-a0fe-e1ca69e89e4e-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-c49zn\" (UID: \"dc88cfaa-dbaf-4194-a0fe-e1ca69e89e4e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-c49zn"
Apr 24 22:32:30.165043 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:30.164822 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc88cfaa-dbaf-4194-a0fe-e1ca69e89e4e-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-c49zn\" (UID: \"dc88cfaa-dbaf-4194-a0fe-e1ca69e89e4e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-c49zn"
Apr 24 22:32:30.165043 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:30.164948 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dc88cfaa-dbaf-4194-a0fe-e1ca69e89e4e-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-c49zn\" (UID: \"dc88cfaa-dbaf-4194-a0fe-e1ca69e89e4e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-c49zn"
Apr 24 22:32:30.266208 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:30.266139 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fd8gp\" (UniqueName: \"kubernetes.io/projected/dc88cfaa-dbaf-4194-a0fe-e1ca69e89e4e-kube-api-access-fd8gp\") pod \"prometheus-operator-5676c8c784-c49zn\" (UID: \"dc88cfaa-dbaf-4194-a0fe-e1ca69e89e4e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-c49zn"
Apr 24 22:32:30.266208 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:30.266173 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dc88cfaa-dbaf-4194-a0fe-e1ca69e89e4e-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-c49zn\" (UID: \"dc88cfaa-dbaf-4194-a0fe-e1ca69e89e4e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-c49zn"
Apr 24 22:32:30.266208 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:30.266200 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc88cfaa-dbaf-4194-a0fe-e1ca69e89e4e-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-c49zn\" (UID: \"dc88cfaa-dbaf-4194-a0fe-e1ca69e89e4e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-c49zn"
Apr 24 22:32:30.266375 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:30.266262 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dc88cfaa-dbaf-4194-a0fe-e1ca69e89e4e-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-c49zn\" (UID: \"dc88cfaa-dbaf-4194-a0fe-e1ca69e89e4e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-c49zn"
Apr 24 22:32:30.266988 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:30.266969 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dc88cfaa-dbaf-4194-a0fe-e1ca69e89e4e-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-c49zn\" (UID: \"dc88cfaa-dbaf-4194-a0fe-e1ca69e89e4e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-c49zn"
Apr 24 22:32:30.268624 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:30.268593 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc88cfaa-dbaf-4194-a0fe-e1ca69e89e4e-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-c49zn\" (UID: \"dc88cfaa-dbaf-4194-a0fe-e1ca69e89e4e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-c49zn"
Apr 24 22:32:30.268736 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:30.268658 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\"
(UniqueName: \"kubernetes.io/secret/dc88cfaa-dbaf-4194-a0fe-e1ca69e89e4e-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-c49zn\" (UID: \"dc88cfaa-dbaf-4194-a0fe-e1ca69e89e4e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-c49zn" Apr 24 22:32:30.273906 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:30.273886 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd8gp\" (UniqueName: \"kubernetes.io/projected/dc88cfaa-dbaf-4194-a0fe-e1ca69e89e4e-kube-api-access-fd8gp\") pod \"prometheus-operator-5676c8c784-c49zn\" (UID: \"dc88cfaa-dbaf-4194-a0fe-e1ca69e89e4e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-c49zn" Apr 24 22:32:30.321302 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:30.321277 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-c49zn" Apr 24 22:32:30.433910 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:30.433864 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-c49zn"] Apr 24 22:32:30.436210 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:32:30.436177 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc88cfaa_dbaf_4194_a0fe_e1ca69e89e4e.slice/crio-255bdc7ecef611b1c52b929ef6c15157367d297fedf3bae8e6b516338de07689 WatchSource:0}: Error finding container 255bdc7ecef611b1c52b929ef6c15157367d297fedf3bae8e6b516338de07689: Status 404 returned error can't find the container with id 255bdc7ecef611b1c52b929ef6c15157367d297fedf3bae8e6b516338de07689 Apr 24 22:32:30.973046 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:30.973008 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-c49zn" 
event={"ID":"dc88cfaa-dbaf-4194-a0fe-e1ca69e89e4e","Type":"ContainerStarted","Data":"255bdc7ecef611b1c52b929ef6c15157367d297fedf3bae8e6b516338de07689"} Apr 24 22:32:31.977355 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:31.977315 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-c49zn" event={"ID":"dc88cfaa-dbaf-4194-a0fe-e1ca69e89e4e","Type":"ContainerStarted","Data":"6ec262be87914394b3a7b6b04ff8880ee7d6c3dd0b330d0bc40c678b9c43cfee"} Apr 24 22:32:31.977355 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:31.977363 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-c49zn" event={"ID":"dc88cfaa-dbaf-4194-a0fe-e1ca69e89e4e","Type":"ContainerStarted","Data":"91501cab1bda9df4befbd0e83d55723880b3db7dda2cb91e69861ce69022867a"} Apr 24 22:32:32.000149 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:32.000102 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-c49zn" podStartSLOduration=1.685142753 podStartE2EDuration="3.000087316s" podCreationTimestamp="2026-04-24 22:32:29 +0000 UTC" firstStartedPulling="2026-04-24 22:32:30.438058468 +0000 UTC m=+136.487914323" lastFinishedPulling="2026-04-24 22:32:31.753003022 +0000 UTC m=+137.802858886" observedRunningTime="2026-04-24 22:32:31.999160818 +0000 UTC m=+138.049016695" watchObservedRunningTime="2026-04-24 22:32:32.000087316 +0000 UTC m=+138.049943228" Apr 24 22:32:34.356487 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.356448 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-7jvfr"] Apr 24 22:32:34.359891 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.359850 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-7jvfr" Apr 24 22:32:34.360028 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.360013 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-tpkcp"] Apr 24 22:32:34.363110 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.363081 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-tpkcp" Apr 24 22:32:34.363229 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.363140 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-48jw6\"" Apr 24 22:32:34.363229 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.363202 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 24 22:32:34.363353 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.363237 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 24 22:32:34.363640 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.363624 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 24 22:32:34.365540 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.365513 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 22:32:34.365851 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.365831 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 22:32:34.366005 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.365984 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-8rvqf\"" Apr 24 22:32:34.366101 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.365988 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 22:32:34.379761 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.379742 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-7jvfr"] Apr 24 22:32:34.495724 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.495694 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e1635d2a-643f-4246-8704-77815f21915e-node-exporter-accelerators-collector-config\") pod \"node-exporter-tpkcp\" (UID: \"e1635d2a-643f-4246-8704-77815f21915e\") " pod="openshift-monitoring/node-exporter-tpkcp" Apr 24 22:32:34.495840 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.495728 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgh7h\" (UniqueName: \"kubernetes.io/projected/e1635d2a-643f-4246-8704-77815f21915e-kube-api-access-fgh7h\") pod \"node-exporter-tpkcp\" (UID: \"e1635d2a-643f-4246-8704-77815f21915e\") " pod="openshift-monitoring/node-exporter-tpkcp" Apr 24 22:32:34.495840 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.495759 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/867be6f7-bbb4-46de-b65e-2de56e6995cb-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-7jvfr\" (UID: \"867be6f7-bbb4-46de-b65e-2de56e6995cb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7jvfr" Apr 24 22:32:34.495840 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.495820 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfmlc\" (UniqueName: \"kubernetes.io/projected/867be6f7-bbb4-46de-b65e-2de56e6995cb-kube-api-access-kfmlc\") pod \"kube-state-metrics-69db897b98-7jvfr\" (UID: \"867be6f7-bbb4-46de-b65e-2de56e6995cb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7jvfr" Apr 24 22:32:34.496002 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.495853 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e1635d2a-643f-4246-8704-77815f21915e-node-exporter-wtmp\") pod \"node-exporter-tpkcp\" (UID: \"e1635d2a-643f-4246-8704-77815f21915e\") " pod="openshift-monitoring/node-exporter-tpkcp" Apr 24 22:32:34.496002 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.495925 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/867be6f7-bbb4-46de-b65e-2de56e6995cb-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-7jvfr\" (UID: \"867be6f7-bbb4-46de-b65e-2de56e6995cb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7jvfr" Apr 24 22:32:34.496002 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.495944 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e1635d2a-643f-4246-8704-77815f21915e-root\") pod \"node-exporter-tpkcp\" (UID: \"e1635d2a-643f-4246-8704-77815f21915e\") " pod="openshift-monitoring/node-exporter-tpkcp" Apr 24 22:32:34.496002 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.495967 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/867be6f7-bbb4-46de-b65e-2de56e6995cb-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-7jvfr\" 
(UID: \"867be6f7-bbb4-46de-b65e-2de56e6995cb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7jvfr" Apr 24 22:32:34.496002 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.495999 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/867be6f7-bbb4-46de-b65e-2de56e6995cb-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-7jvfr\" (UID: \"867be6f7-bbb4-46de-b65e-2de56e6995cb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7jvfr" Apr 24 22:32:34.496213 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.496016 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/867be6f7-bbb4-46de-b65e-2de56e6995cb-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-7jvfr\" (UID: \"867be6f7-bbb4-46de-b65e-2de56e6995cb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7jvfr" Apr 24 22:32:34.496213 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.496076 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e1635d2a-643f-4246-8704-77815f21915e-sys\") pod \"node-exporter-tpkcp\" (UID: \"e1635d2a-643f-4246-8704-77815f21915e\") " pod="openshift-monitoring/node-exporter-tpkcp" Apr 24 22:32:34.496213 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.496107 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e1635d2a-643f-4246-8704-77815f21915e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tpkcp\" (UID: \"e1635d2a-643f-4246-8704-77815f21915e\") " pod="openshift-monitoring/node-exporter-tpkcp" 
Apr 24 22:32:34.496213 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.496153 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e1635d2a-643f-4246-8704-77815f21915e-node-exporter-textfile\") pod \"node-exporter-tpkcp\" (UID: \"e1635d2a-643f-4246-8704-77815f21915e\") " pod="openshift-monitoring/node-exporter-tpkcp" Apr 24 22:32:34.496213 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.496185 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e1635d2a-643f-4246-8704-77815f21915e-node-exporter-tls\") pod \"node-exporter-tpkcp\" (UID: \"e1635d2a-643f-4246-8704-77815f21915e\") " pod="openshift-monitoring/node-exporter-tpkcp" Apr 24 22:32:34.496213 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.496205 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e1635d2a-643f-4246-8704-77815f21915e-metrics-client-ca\") pod \"node-exporter-tpkcp\" (UID: \"e1635d2a-643f-4246-8704-77815f21915e\") " pod="openshift-monitoring/node-exporter-tpkcp" Apr 24 22:32:34.597117 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.597084 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/867be6f7-bbb4-46de-b65e-2de56e6995cb-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-7jvfr\" (UID: \"867be6f7-bbb4-46de-b65e-2de56e6995cb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7jvfr" Apr 24 22:32:34.597117 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.597112 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e1635d2a-643f-4246-8704-77815f21915e-root\") pod 
\"node-exporter-tpkcp\" (UID: \"e1635d2a-643f-4246-8704-77815f21915e\") " pod="openshift-monitoring/node-exporter-tpkcp" Apr 24 22:32:34.597361 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.597131 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/867be6f7-bbb4-46de-b65e-2de56e6995cb-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-7jvfr\" (UID: \"867be6f7-bbb4-46de-b65e-2de56e6995cb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7jvfr" Apr 24 22:32:34.597361 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.597184 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e1635d2a-643f-4246-8704-77815f21915e-root\") pod \"node-exporter-tpkcp\" (UID: \"e1635d2a-643f-4246-8704-77815f21915e\") " pod="openshift-monitoring/node-exporter-tpkcp" Apr 24 22:32:34.597361 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.597207 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/867be6f7-bbb4-46de-b65e-2de56e6995cb-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-7jvfr\" (UID: \"867be6f7-bbb4-46de-b65e-2de56e6995cb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7jvfr" Apr 24 22:32:34.597361 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.597225 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/867be6f7-bbb4-46de-b65e-2de56e6995cb-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-7jvfr\" (UID: \"867be6f7-bbb4-46de-b65e-2de56e6995cb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7jvfr" Apr 24 22:32:34.597361 ip-10-0-129-176 kubenswrapper[2574]: I0424 
22:32:34.597245 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e1635d2a-643f-4246-8704-77815f21915e-sys\") pod \"node-exporter-tpkcp\" (UID: \"e1635d2a-643f-4246-8704-77815f21915e\") " pod="openshift-monitoring/node-exporter-tpkcp" Apr 24 22:32:34.597361 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.597263 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e1635d2a-643f-4246-8704-77815f21915e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tpkcp\" (UID: \"e1635d2a-643f-4246-8704-77815f21915e\") " pod="openshift-monitoring/node-exporter-tpkcp" Apr 24 22:32:34.597361 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.597297 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e1635d2a-643f-4246-8704-77815f21915e-node-exporter-textfile\") pod \"node-exporter-tpkcp\" (UID: \"e1635d2a-643f-4246-8704-77815f21915e\") " pod="openshift-monitoring/node-exporter-tpkcp" Apr 24 22:32:34.597361 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.597321 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e1635d2a-643f-4246-8704-77815f21915e-sys\") pod \"node-exporter-tpkcp\" (UID: \"e1635d2a-643f-4246-8704-77815f21915e\") " pod="openshift-monitoring/node-exporter-tpkcp" Apr 24 22:32:34.597361 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.597325 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e1635d2a-643f-4246-8704-77815f21915e-node-exporter-tls\") pod \"node-exporter-tpkcp\" (UID: \"e1635d2a-643f-4246-8704-77815f21915e\") " pod="openshift-monitoring/node-exporter-tpkcp" Apr 24 22:32:34.597767 ip-10-0-129-176 
kubenswrapper[2574]: I0424 22:32:34.597370 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e1635d2a-643f-4246-8704-77815f21915e-metrics-client-ca\") pod \"node-exporter-tpkcp\" (UID: \"e1635d2a-643f-4246-8704-77815f21915e\") " pod="openshift-monitoring/node-exporter-tpkcp" Apr 24 22:32:34.597767 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:32:34.597394 2574 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 24 22:32:34.597767 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.597419 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e1635d2a-643f-4246-8704-77815f21915e-node-exporter-accelerators-collector-config\") pod \"node-exporter-tpkcp\" (UID: \"e1635d2a-643f-4246-8704-77815f21915e\") " pod="openshift-monitoring/node-exporter-tpkcp" Apr 24 22:32:34.597767 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:32:34.597446 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1635d2a-643f-4246-8704-77815f21915e-node-exporter-tls podName:e1635d2a-643f-4246-8704-77815f21915e nodeName:}" failed. No retries permitted until 2026-04-24 22:32:35.097428986 +0000 UTC m=+141.147284842 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/e1635d2a-643f-4246-8704-77815f21915e-node-exporter-tls") pod "node-exporter-tpkcp" (UID: "e1635d2a-643f-4246-8704-77815f21915e") : secret "node-exporter-tls" not found Apr 24 22:32:34.597767 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.597471 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fgh7h\" (UniqueName: \"kubernetes.io/projected/e1635d2a-643f-4246-8704-77815f21915e-kube-api-access-fgh7h\") pod \"node-exporter-tpkcp\" (UID: \"e1635d2a-643f-4246-8704-77815f21915e\") " pod="openshift-monitoring/node-exporter-tpkcp" Apr 24 22:32:34.597767 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.597505 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/867be6f7-bbb4-46de-b65e-2de56e6995cb-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-7jvfr\" (UID: \"867be6f7-bbb4-46de-b65e-2de56e6995cb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7jvfr" Apr 24 22:32:34.597767 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.597530 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kfmlc\" (UniqueName: \"kubernetes.io/projected/867be6f7-bbb4-46de-b65e-2de56e6995cb-kube-api-access-kfmlc\") pod \"kube-state-metrics-69db897b98-7jvfr\" (UID: \"867be6f7-bbb4-46de-b65e-2de56e6995cb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7jvfr" Apr 24 22:32:34.597767 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.597554 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e1635d2a-643f-4246-8704-77815f21915e-node-exporter-wtmp\") pod \"node-exporter-tpkcp\" (UID: \"e1635d2a-643f-4246-8704-77815f21915e\") " pod="openshift-monitoring/node-exporter-tpkcp" Apr 24 
22:32:34.597767 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.597613 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/867be6f7-bbb4-46de-b65e-2de56e6995cb-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-7jvfr\" (UID: \"867be6f7-bbb4-46de-b65e-2de56e6995cb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7jvfr" Apr 24 22:32:34.597767 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.597706 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e1635d2a-643f-4246-8704-77815f21915e-node-exporter-wtmp\") pod \"node-exporter-tpkcp\" (UID: \"e1635d2a-643f-4246-8704-77815f21915e\") " pod="openshift-monitoring/node-exporter-tpkcp" Apr 24 22:32:34.597767 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.597722 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e1635d2a-643f-4246-8704-77815f21915e-node-exporter-textfile\") pod \"node-exporter-tpkcp\" (UID: \"e1635d2a-643f-4246-8704-77815f21915e\") " pod="openshift-monitoring/node-exporter-tpkcp" Apr 24 22:32:34.598288 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.598083 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e1635d2a-643f-4246-8704-77815f21915e-node-exporter-accelerators-collector-config\") pod \"node-exporter-tpkcp\" (UID: \"e1635d2a-643f-4246-8704-77815f21915e\") " pod="openshift-monitoring/node-exporter-tpkcp" Apr 24 22:32:34.598619 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.598596 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: 
\"kubernetes.io/configmap/867be6f7-bbb4-46de-b65e-2de56e6995cb-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-7jvfr\" (UID: \"867be6f7-bbb4-46de-b65e-2de56e6995cb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7jvfr" Apr 24 22:32:34.598697 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.598676 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e1635d2a-643f-4246-8704-77815f21915e-metrics-client-ca\") pod \"node-exporter-tpkcp\" (UID: \"e1635d2a-643f-4246-8704-77815f21915e\") " pod="openshift-monitoring/node-exporter-tpkcp" Apr 24 22:32:34.599151 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.599128 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/867be6f7-bbb4-46de-b65e-2de56e6995cb-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-7jvfr\" (UID: \"867be6f7-bbb4-46de-b65e-2de56e6995cb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7jvfr" Apr 24 22:32:34.599952 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.599933 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/867be6f7-bbb4-46de-b65e-2de56e6995cb-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-7jvfr\" (UID: \"867be6f7-bbb4-46de-b65e-2de56e6995cb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7jvfr" Apr 24 22:32:34.600356 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.600340 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e1635d2a-643f-4246-8704-77815f21915e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tpkcp\" (UID: \"e1635d2a-643f-4246-8704-77815f21915e\") " 
pod="openshift-monitoring/node-exporter-tpkcp" Apr 24 22:32:34.600420 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.600398 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/867be6f7-bbb4-46de-b65e-2de56e6995cb-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-7jvfr\" (UID: \"867be6f7-bbb4-46de-b65e-2de56e6995cb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7jvfr" Apr 24 22:32:34.608212 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.608163 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgh7h\" (UniqueName: \"kubernetes.io/projected/e1635d2a-643f-4246-8704-77815f21915e-kube-api-access-fgh7h\") pod \"node-exporter-tpkcp\" (UID: \"e1635d2a-643f-4246-8704-77815f21915e\") " pod="openshift-monitoring/node-exporter-tpkcp" Apr 24 22:32:34.608631 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.608610 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfmlc\" (UniqueName: \"kubernetes.io/projected/867be6f7-bbb4-46de-b65e-2de56e6995cb-kube-api-access-kfmlc\") pod \"kube-state-metrics-69db897b98-7jvfr\" (UID: \"867be6f7-bbb4-46de-b65e-2de56e6995cb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7jvfr" Apr 24 22:32:34.670406 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.670385 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-7jvfr"
Apr 24 22:32:34.800909 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.800868 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-7jvfr"]
Apr 24 22:32:34.803227 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:32:34.803205 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod867be6f7_bbb4_46de_b65e_2de56e6995cb.slice/crio-648e61a72b7e31e4b61785ea4f43ca54910d7f166c9120fd1a83f942f7dde488 WatchSource:0}: Error finding container 648e61a72b7e31e4b61785ea4f43ca54910d7f166c9120fd1a83f942f7dde488: Status 404 returned error can't find the container with id 648e61a72b7e31e4b61785ea4f43ca54910d7f166c9120fd1a83f942f7dde488
Apr 24 22:32:34.985398 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:34.985330 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-7jvfr" event={"ID":"867be6f7-bbb4-46de-b65e-2de56e6995cb","Type":"ContainerStarted","Data":"648e61a72b7e31e4b61785ea4f43ca54910d7f166c9120fd1a83f942f7dde488"}
Apr 24 22:32:35.099883 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:35.099848 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e1635d2a-643f-4246-8704-77815f21915e-node-exporter-tls\") pod \"node-exporter-tpkcp\" (UID: \"e1635d2a-643f-4246-8704-77815f21915e\") " pod="openshift-monitoring/node-exporter-tpkcp"
Apr 24 22:32:35.101999 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:35.101980 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e1635d2a-643f-4246-8704-77815f21915e-node-exporter-tls\") pod \"node-exporter-tpkcp\" (UID: \"e1635d2a-643f-4246-8704-77815f21915e\") " pod="openshift-monitoring/node-exporter-tpkcp"
Apr 24 22:32:35.274313 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:35.274201 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-tpkcp"
Apr 24 22:32:35.286988 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:32:35.286956 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1635d2a_643f_4246_8704_77815f21915e.slice/crio-d2f977e38b6fe4455fc3ecb378c9f983036fdcdf86eeb5a2781156b2faf2213f WatchSource:0}: Error finding container d2f977e38b6fe4455fc3ecb378c9f983036fdcdf86eeb5a2781156b2faf2213f: Status 404 returned error can't find the container with id d2f977e38b6fe4455fc3ecb378c9f983036fdcdf86eeb5a2781156b2faf2213f
Apr 24 22:32:35.531124 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:35.531047 2574 scope.go:117] "RemoveContainer" containerID="10c85ade7a91bac987591bb4320e49ddec708909fbd021b53f90649fdd43487d"
Apr 24 22:32:35.990013 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:35.989950 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6nnq9_6aaf0fb0-1f4a-46f7-a1db-82394fa8792a/console-operator/2.log"
Apr 24 22:32:35.990153 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:35.990047 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-6nnq9" event={"ID":"6aaf0fb0-1f4a-46f7-a1db-82394fa8792a","Type":"ContainerStarted","Data":"18dac4cc8e89a671f3da43d5a1dd9ea90321b7e1f47a1cd82ca92eba23894015"}
Apr 24 22:32:35.990377 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:35.990353 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-6nnq9"
Apr 24 22:32:35.991224 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:35.991199 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tpkcp" event={"ID":"e1635d2a-643f-4246-8704-77815f21915e","Type":"ContainerStarted","Data":"d2f977e38b6fe4455fc3ecb378c9f983036fdcdf86eeb5a2781156b2faf2213f"}
Apr 24 22:32:36.011197 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:36.011144 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-6nnq9" podStartSLOduration=43.607224429 podStartE2EDuration="46.011132456s" podCreationTimestamp="2026-04-24 22:31:50 +0000 UTC" firstStartedPulling="2026-04-24 22:31:51.200323336 +0000 UTC m=+97.250179206" lastFinishedPulling="2026-04-24 22:31:53.604231374 +0000 UTC m=+99.654087233" observedRunningTime="2026-04-24 22:32:36.009731828 +0000 UTC m=+142.059587727" watchObservedRunningTime="2026-04-24 22:32:36.011132456 +0000 UTC m=+142.060988392"
Apr 24 22:32:36.048427 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:36.048400 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-6nnq9"
Apr 24 22:32:36.995014 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:36.994932 2574 generic.go:358] "Generic (PLEG): container finished" podID="e1635d2a-643f-4246-8704-77815f21915e" containerID="21ca93c6ff765e1e3784996ed4e8fbcb93576b40cc8b562ea7a7613a6aec0537" exitCode=0
Apr 24 22:32:36.995422 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:36.995017 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tpkcp" event={"ID":"e1635d2a-643f-4246-8704-77815f21915e","Type":"ContainerDied","Data":"21ca93c6ff765e1e3784996ed4e8fbcb93576b40cc8b562ea7a7613a6aec0537"}
Apr 24 22:32:36.996905 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:36.996863 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-7jvfr" event={"ID":"867be6f7-bbb4-46de-b65e-2de56e6995cb","Type":"ContainerStarted","Data":"c1698df3800303154e42d773442342dde8f06a1832257135a18475ee853a97cc"}
Apr 24 22:32:36.997014 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:36.996908 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-7jvfr" event={"ID":"867be6f7-bbb4-46de-b65e-2de56e6995cb","Type":"ContainerStarted","Data":"9ba68a8acd47856ba7dae04d30f1af777ee0ca40f1fecb7deffff4ff8a4e9dc2"}
Apr 24 22:32:36.997014 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:36.996918 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-7jvfr" event={"ID":"867be6f7-bbb4-46de-b65e-2de56e6995cb","Type":"ContainerStarted","Data":"74896587d6072e0d19869af1aeda60ada412cb27b04cfcb6ff449418f2d3badb"}
Apr 24 22:32:37.053974 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:37.053935 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-7jvfr" podStartSLOduration=1.6423497440000001 podStartE2EDuration="3.053921622s" podCreationTimestamp="2026-04-24 22:32:34 +0000 UTC" firstStartedPulling="2026-04-24 22:32:34.805051118 +0000 UTC m=+140.854906973" lastFinishedPulling="2026-04-24 22:32:36.216622993 +0000 UTC m=+142.266478851" observedRunningTime="2026-04-24 22:32:37.052612843 +0000 UTC m=+143.102468723" watchObservedRunningTime="2026-04-24 22:32:37.053921622 +0000 UTC m=+143.103777498"
Apr 24 22:32:38.001262 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:38.001223 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tpkcp" event={"ID":"e1635d2a-643f-4246-8704-77815f21915e","Type":"ContainerStarted","Data":"bdaaca6a4b6c147b9f070fef53c37fc333c7b635bb9af4c627c5a9fbe5344ece"}
Apr 24 22:32:38.001262 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:38.001267 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tpkcp" event={"ID":"e1635d2a-643f-4246-8704-77815f21915e","Type":"ContainerStarted","Data":"c3163cba10d1ecc5a37faededf08113433604e5ce9602cb29bb96bab2a3bc74a"}
Apr 24 22:32:38.022282 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:38.022235 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-tpkcp" podStartSLOduration=3.092573485 podStartE2EDuration="4.022223106s" podCreationTimestamp="2026-04-24 22:32:34 +0000 UTC" firstStartedPulling="2026-04-24 22:32:35.288814264 +0000 UTC m=+141.338670122" lastFinishedPulling="2026-04-24 22:32:36.218463885 +0000 UTC m=+142.268319743" observedRunningTime="2026-04-24 22:32:38.021266175 +0000 UTC m=+144.071122051" watchObservedRunningTime="2026-04-24 22:32:38.022223106 +0000 UTC m=+144.072078983"
Apr 24 22:32:39.125428 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.125394 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-nw82c"]
Apr 24 22:32:39.128596 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.128577 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nw82c"
Apr 24 22:32:39.130964 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.130944 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-5bn68\""
Apr 24 22:32:39.131055 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.131010 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 24 22:32:39.133370 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.133347 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c7a56dd6-0aa1-4189-9b4e-e176c8e9aece-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-nw82c\" (UID: \"c7a56dd6-0aa1-4189-9b4e-e176c8e9aece\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nw82c"
Apr 24 22:32:39.134507 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.134487 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-nw82c"]
Apr 24 22:32:39.234321 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.234290 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c7a56dd6-0aa1-4189-9b4e-e176c8e9aece-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-nw82c\" (UID: \"c7a56dd6-0aa1-4189-9b4e-e176c8e9aece\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nw82c"
Apr 24 22:32:39.234406 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:32:39.234374 2574 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found
Apr 24 22:32:39.234456 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:32:39.234426 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7a56dd6-0aa1-4189-9b4e-e176c8e9aece-monitoring-plugin-cert podName:c7a56dd6-0aa1-4189-9b4e-e176c8e9aece nodeName:}" failed. No retries permitted until 2026-04-24 22:32:39.73441122 +0000 UTC m=+145.784267075 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/c7a56dd6-0aa1-4189-9b4e-e176c8e9aece-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-nw82c" (UID: "c7a56dd6-0aa1-4189-9b4e-e176c8e9aece") : secret "monitoring-plugin-cert" not found
Apr 24 22:32:39.555039 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.555001 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-65c9867bbc-b7tk9"]
Apr 24 22:32:39.558805 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.558780 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-65c9867bbc-b7tk9"
Apr 24 22:32:39.561409 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.561384 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-hqrm9\""
Apr 24 22:32:39.561645 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.561626 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 24 22:32:39.561847 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.561822 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 24 22:32:39.562417 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.562400 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 24 22:32:39.562417 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.562410 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 24 22:32:39.562553 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.562451 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 24 22:32:39.569633 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.569608 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 24 22:32:39.572008 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.571986 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-65c9867bbc-b7tk9"]
Apr 24 22:32:39.636845 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.636822 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/e0147744-e2f5-44c3-87b7-4c69075afa43-telemeter-client-tls\") pod \"telemeter-client-65c9867bbc-b7tk9\" (UID: \"e0147744-e2f5-44c3-87b7-4c69075afa43\") " pod="openshift-monitoring/telemeter-client-65c9867bbc-b7tk9"
Apr 24 22:32:39.636964 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.636854 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e0147744-e2f5-44c3-87b7-4c69075afa43-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-65c9867bbc-b7tk9\" (UID: \"e0147744-e2f5-44c3-87b7-4c69075afa43\") " pod="openshift-monitoring/telemeter-client-65c9867bbc-b7tk9"
Apr 24 22:32:39.636964 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.636942 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/e0147744-e2f5-44c3-87b7-4c69075afa43-secret-telemeter-client\") pod \"telemeter-client-65c9867bbc-b7tk9\" (UID: \"e0147744-e2f5-44c3-87b7-4c69075afa43\") " pod="openshift-monitoring/telemeter-client-65c9867bbc-b7tk9"
Apr 24 22:32:39.637041 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.637011 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nv9l\" (UniqueName: \"kubernetes.io/projected/e0147744-e2f5-44c3-87b7-4c69075afa43-kube-api-access-2nv9l\") pod \"telemeter-client-65c9867bbc-b7tk9\" (UID: \"e0147744-e2f5-44c3-87b7-4c69075afa43\") " pod="openshift-monitoring/telemeter-client-65c9867bbc-b7tk9"
Apr 24 22:32:39.637079 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.637050 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/e0147744-e2f5-44c3-87b7-4c69075afa43-federate-client-tls\") pod \"telemeter-client-65c9867bbc-b7tk9\" (UID: \"e0147744-e2f5-44c3-87b7-4c69075afa43\") " pod="openshift-monitoring/telemeter-client-65c9867bbc-b7tk9"
Apr 24 22:32:39.637079 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.637068 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0147744-e2f5-44c3-87b7-4c69075afa43-telemeter-trusted-ca-bundle\") pod \"telemeter-client-65c9867bbc-b7tk9\" (UID: \"e0147744-e2f5-44c3-87b7-4c69075afa43\") " pod="openshift-monitoring/telemeter-client-65c9867bbc-b7tk9"
Apr 24 22:32:39.637140 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.637107 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e0147744-e2f5-44c3-87b7-4c69075afa43-metrics-client-ca\") pod \"telemeter-client-65c9867bbc-b7tk9\" (UID: \"e0147744-e2f5-44c3-87b7-4c69075afa43\") " pod="openshift-monitoring/telemeter-client-65c9867bbc-b7tk9"
Apr 24 22:32:39.637175 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.637148 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0147744-e2f5-44c3-87b7-4c69075afa43-serving-certs-ca-bundle\") pod \"telemeter-client-65c9867bbc-b7tk9\" (UID: \"e0147744-e2f5-44c3-87b7-4c69075afa43\") " pod="openshift-monitoring/telemeter-client-65c9867bbc-b7tk9"
Apr 24 22:32:39.737484 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.737456 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/e0147744-e2f5-44c3-87b7-4c69075afa43-federate-client-tls\") pod \"telemeter-client-65c9867bbc-b7tk9\" (UID: \"e0147744-e2f5-44c3-87b7-4c69075afa43\") " pod="openshift-monitoring/telemeter-client-65c9867bbc-b7tk9"
Apr 24 22:32:39.737484 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.737486 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0147744-e2f5-44c3-87b7-4c69075afa43-telemeter-trusted-ca-bundle\") pod \"telemeter-client-65c9867bbc-b7tk9\" (UID: \"e0147744-e2f5-44c3-87b7-4c69075afa43\") " pod="openshift-monitoring/telemeter-client-65c9867bbc-b7tk9"
Apr 24 22:32:39.737668 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.737511 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e0147744-e2f5-44c3-87b7-4c69075afa43-metrics-client-ca\") pod \"telemeter-client-65c9867bbc-b7tk9\" (UID: \"e0147744-e2f5-44c3-87b7-4c69075afa43\") " pod="openshift-monitoring/telemeter-client-65c9867bbc-b7tk9"
Apr 24 22:32:39.737668 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.737603 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c7a56dd6-0aa1-4189-9b4e-e176c8e9aece-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-nw82c\" (UID: \"c7a56dd6-0aa1-4189-9b4e-e176c8e9aece\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nw82c"
Apr 24 22:32:39.737668 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.737632 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0147744-e2f5-44c3-87b7-4c69075afa43-serving-certs-ca-bundle\") pod \"telemeter-client-65c9867bbc-b7tk9\" (UID: \"e0147744-e2f5-44c3-87b7-4c69075afa43\") " pod="openshift-monitoring/telemeter-client-65c9867bbc-b7tk9"
Apr 24 22:32:39.737668 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.737651 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/e0147744-e2f5-44c3-87b7-4c69075afa43-telemeter-client-tls\") pod \"telemeter-client-65c9867bbc-b7tk9\" (UID: \"e0147744-e2f5-44c3-87b7-4c69075afa43\") " pod="openshift-monitoring/telemeter-client-65c9867bbc-b7tk9"
Apr 24 22:32:39.737810 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.737673 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e0147744-e2f5-44c3-87b7-4c69075afa43-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-65c9867bbc-b7tk9\" (UID: \"e0147744-e2f5-44c3-87b7-4c69075afa43\") " pod="openshift-monitoring/telemeter-client-65c9867bbc-b7tk9"
Apr 24 22:32:39.737989 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.737959 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/e0147744-e2f5-44c3-87b7-4c69075afa43-secret-telemeter-client\") pod \"telemeter-client-65c9867bbc-b7tk9\" (UID: \"e0147744-e2f5-44c3-87b7-4c69075afa43\") " pod="openshift-monitoring/telemeter-client-65c9867bbc-b7tk9"
Apr 24 22:32:39.738121 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.738044 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2nv9l\" (UniqueName: \"kubernetes.io/projected/e0147744-e2f5-44c3-87b7-4c69075afa43-kube-api-access-2nv9l\") pod \"telemeter-client-65c9867bbc-b7tk9\" (UID: \"e0147744-e2f5-44c3-87b7-4c69075afa43\") " pod="openshift-monitoring/telemeter-client-65c9867bbc-b7tk9"
Apr 24 22:32:39.738474 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.738361 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e0147744-e2f5-44c3-87b7-4c69075afa43-metrics-client-ca\") pod \"telemeter-client-65c9867bbc-b7tk9\" (UID: \"e0147744-e2f5-44c3-87b7-4c69075afa43\") " pod="openshift-monitoring/telemeter-client-65c9867bbc-b7tk9"
Apr 24 22:32:39.738474 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.738416 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0147744-e2f5-44c3-87b7-4c69075afa43-telemeter-trusted-ca-bundle\") pod \"telemeter-client-65c9867bbc-b7tk9\" (UID: \"e0147744-e2f5-44c3-87b7-4c69075afa43\") " pod="openshift-monitoring/telemeter-client-65c9867bbc-b7tk9"
Apr 24 22:32:39.738817 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.738796 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0147744-e2f5-44c3-87b7-4c69075afa43-serving-certs-ca-bundle\") pod \"telemeter-client-65c9867bbc-b7tk9\" (UID: \"e0147744-e2f5-44c3-87b7-4c69075afa43\") " pod="openshift-monitoring/telemeter-client-65c9867bbc-b7tk9"
Apr 24 22:32:39.740163 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.740137 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e0147744-e2f5-44c3-87b7-4c69075afa43-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-65c9867bbc-b7tk9\" (UID: \"e0147744-e2f5-44c3-87b7-4c69075afa43\") " pod="openshift-monitoring/telemeter-client-65c9867bbc-b7tk9"
Apr 24 22:32:39.740417 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.740397 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/e0147744-e2f5-44c3-87b7-4c69075afa43-telemeter-client-tls\") pod \"telemeter-client-65c9867bbc-b7tk9\" (UID: \"e0147744-e2f5-44c3-87b7-4c69075afa43\") " pod="openshift-monitoring/telemeter-client-65c9867bbc-b7tk9"
Apr 24 22:32:39.740471 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.740417 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/e0147744-e2f5-44c3-87b7-4c69075afa43-federate-client-tls\") pod \"telemeter-client-65c9867bbc-b7tk9\" (UID: \"e0147744-e2f5-44c3-87b7-4c69075afa43\") " pod="openshift-monitoring/telemeter-client-65c9867bbc-b7tk9"
Apr 24 22:32:39.740631 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.740617 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/e0147744-e2f5-44c3-87b7-4c69075afa43-secret-telemeter-client\") pod \"telemeter-client-65c9867bbc-b7tk9\" (UID: \"e0147744-e2f5-44c3-87b7-4c69075afa43\") " pod="openshift-monitoring/telemeter-client-65c9867bbc-b7tk9"
Apr 24 22:32:39.740674 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.740632 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c7a56dd6-0aa1-4189-9b4e-e176c8e9aece-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-nw82c\" (UID: \"c7a56dd6-0aa1-4189-9b4e-e176c8e9aece\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nw82c"
Apr 24 22:32:39.746141 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.746122 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nv9l\" (UniqueName: \"kubernetes.io/projected/e0147744-e2f5-44c3-87b7-4c69075afa43-kube-api-access-2nv9l\") pod \"telemeter-client-65c9867bbc-b7tk9\" (UID: \"e0147744-e2f5-44c3-87b7-4c69075afa43\") " pod="openshift-monitoring/telemeter-client-65c9867bbc-b7tk9"
Apr 24 22:32:39.872600 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.872549 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-65c9867bbc-b7tk9"
Apr 24 22:32:39.990615 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:39.990586 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-65c9867bbc-b7tk9"]
Apr 24 22:32:39.993748 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:32:39.993708 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0147744_e2f5_44c3_87b7_4c69075afa43.slice/crio-34abbb6ee3c621669100e95c8894d86a4147f2b3310a07925a2517012ac7932d WatchSource:0}: Error finding container 34abbb6ee3c621669100e95c8894d86a4147f2b3310a07925a2517012ac7932d: Status 404 returned error can't find the container with id 34abbb6ee3c621669100e95c8894d86a4147f2b3310a07925a2517012ac7932d
Apr 24 22:32:40.006729 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:40.006699 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-65c9867bbc-b7tk9" event={"ID":"e0147744-e2f5-44c3-87b7-4c69075afa43","Type":"ContainerStarted","Data":"34abbb6ee3c621669100e95c8894d86a4147f2b3310a07925a2517012ac7932d"}
Apr 24 22:32:40.040035 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:40.040012 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nw82c"
Apr 24 22:32:40.151514 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:40.151350 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-nw82c"]
Apr 24 22:32:40.154325 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:32:40.154298 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7a56dd6_0aa1_4189_9b4e_e176c8e9aece.slice/crio-fdbf943b136ecebe4857148d9decc51492992f7563933d555d675d8cad5c2c28 WatchSource:0}: Error finding container fdbf943b136ecebe4857148d9decc51492992f7563933d555d675d8cad5c2c28: Status 404 returned error can't find the container with id fdbf943b136ecebe4857148d9decc51492992f7563933d555d675d8cad5c2c28
Apr 24 22:32:41.014405 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:41.014334 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nw82c" event={"ID":"c7a56dd6-0aa1-4189-9b4e-e176c8e9aece","Type":"ContainerStarted","Data":"fdbf943b136ecebe4857148d9decc51492992f7563933d555d675d8cad5c2c28"}
Apr 24 22:32:42.018596 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:42.018562 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-65c9867bbc-b7tk9" event={"ID":"e0147744-e2f5-44c3-87b7-4c69075afa43","Type":"ContainerStarted","Data":"7f654a580864448bc44368930d4a91dbcfb12ab44b4689276fe3ca5e2e1fbb21"}
Apr 24 22:32:42.019940 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:42.019910 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nw82c" event={"ID":"c7a56dd6-0aa1-4189-9b4e-e176c8e9aece","Type":"ContainerStarted","Data":"24185efd5e4576e3858447657819351e2b05557c29e2e1c640e474d71cddad32"}
Apr 24 22:32:42.020130 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:42.020109 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nw82c"
Apr 24 22:32:42.024922 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:42.024903 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nw82c"
Apr 24 22:32:42.036991 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:42.036953 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nw82c" podStartSLOduration=1.359230176 podStartE2EDuration="3.036938774s" podCreationTimestamp="2026-04-24 22:32:39 +0000 UTC" firstStartedPulling="2026-04-24 22:32:40.15607625 +0000 UTC m=+146.205932106" lastFinishedPulling="2026-04-24 22:32:41.833784835 +0000 UTC m=+147.883640704" observedRunningTime="2026-04-24 22:32:42.036128448 +0000 UTC m=+148.085984326" watchObservedRunningTime="2026-04-24 22:32:42.036938774 +0000 UTC m=+148.086794652"
Apr 24 22:32:42.147190 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:42.147112 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-85745c4dfd-ts96w"]
Apr 24 22:32:42.150370 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:42.150347 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85745c4dfd-ts96w"
Apr 24 22:32:42.154966 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:42.154945 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 24 22:32:42.155071 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:42.154974 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 24 22:32:42.155255 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:42.155220 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 24 22:32:42.155255 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:42.155221 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 24 22:32:42.155255 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:42.155252 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 24 22:32:42.155496 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:42.155223 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 24 22:32:42.155567 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:42.155553 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-49xfc\""
Apr 24 22:32:42.156041 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:42.156026 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 24 22:32:42.159469 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:42.159443 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d8e23003-a6c4-4082-b3fd-0eb9f4c1e291-console-oauth-config\") pod \"console-85745c4dfd-ts96w\" (UID: \"d8e23003-a6c4-4082-b3fd-0eb9f4c1e291\") " pod="openshift-console/console-85745c4dfd-ts96w"
Apr 24 22:32:42.159469 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:42.159467 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d8e23003-a6c4-4082-b3fd-0eb9f4c1e291-console-config\") pod \"console-85745c4dfd-ts96w\" (UID: \"d8e23003-a6c4-4082-b3fd-0eb9f4c1e291\") " pod="openshift-console/console-85745c4dfd-ts96w"
Apr 24 22:32:42.159593 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:42.159499 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpjhv\" (UniqueName: \"kubernetes.io/projected/d8e23003-a6c4-4082-b3fd-0eb9f4c1e291-kube-api-access-xpjhv\") pod \"console-85745c4dfd-ts96w\" (UID: \"d8e23003-a6c4-4082-b3fd-0eb9f4c1e291\") " pod="openshift-console/console-85745c4dfd-ts96w"
Apr 24 22:32:42.159593 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:42.159565 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d8e23003-a6c4-4082-b3fd-0eb9f4c1e291-console-serving-cert\") pod \"console-85745c4dfd-ts96w\" (UID: \"d8e23003-a6c4-4082-b3fd-0eb9f4c1e291\") " pod="openshift-console/console-85745c4dfd-ts96w"
Apr 24 22:32:42.159670 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:42.159597 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d8e23003-a6c4-4082-b3fd-0eb9f4c1e291-service-ca\") pod \"console-85745c4dfd-ts96w\" (UID: \"d8e23003-a6c4-4082-b3fd-0eb9f4c1e291\") " pod="openshift-console/console-85745c4dfd-ts96w"
Apr 24 22:32:42.159670 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:42.159619 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d8e23003-a6c4-4082-b3fd-0eb9f4c1e291-oauth-serving-cert\") pod \"console-85745c4dfd-ts96w\" (UID: \"d8e23003-a6c4-4082-b3fd-0eb9f4c1e291\") " pod="openshift-console/console-85745c4dfd-ts96w"
Apr 24 22:32:42.164948 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:42.164928 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85745c4dfd-ts96w"]
Apr 24 22:32:42.260748 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:42.260718 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d8e23003-a6c4-4082-b3fd-0eb9f4c1e291-console-serving-cert\") pod \"console-85745c4dfd-ts96w\" (UID: \"d8e23003-a6c4-4082-b3fd-0eb9f4c1e291\") " pod="openshift-console/console-85745c4dfd-ts96w"
Apr 24 22:32:42.260937 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:42.260755 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d8e23003-a6c4-4082-b3fd-0eb9f4c1e291-service-ca\") pod \"console-85745c4dfd-ts96w\" (UID: \"d8e23003-a6c4-4082-b3fd-0eb9f4c1e291\") " pod="openshift-console/console-85745c4dfd-ts96w"
Apr 24 22:32:42.260937 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:42.260781 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d8e23003-a6c4-4082-b3fd-0eb9f4c1e291-oauth-serving-cert\") pod \"console-85745c4dfd-ts96w\" (UID: \"d8e23003-a6c4-4082-b3fd-0eb9f4c1e291\") " pod="openshift-console/console-85745c4dfd-ts96w"
Apr 24 22:32:42.260937 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:42.260856 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d8e23003-a6c4-4082-b3fd-0eb9f4c1e291-console-oauth-config\") pod \"console-85745c4dfd-ts96w\" (UID: \"d8e23003-a6c4-4082-b3fd-0eb9f4c1e291\") " pod="openshift-console/console-85745c4dfd-ts96w"
Apr 24 22:32:42.260937 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:42.260896 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d8e23003-a6c4-4082-b3fd-0eb9f4c1e291-console-config\") pod \"console-85745c4dfd-ts96w\" (UID: \"d8e23003-a6c4-4082-b3fd-0eb9f4c1e291\") " pod="openshift-console/console-85745c4dfd-ts96w"
Apr 24 22:32:42.261143 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:42.260953 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xpjhv\" (UniqueName: \"kubernetes.io/projected/d8e23003-a6c4-4082-b3fd-0eb9f4c1e291-kube-api-access-xpjhv\") pod \"console-85745c4dfd-ts96w\" (UID: \"d8e23003-a6c4-4082-b3fd-0eb9f4c1e291\") " pod="openshift-console/console-85745c4dfd-ts96w"
Apr 24 22:32:42.261547 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:42.261517 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d8e23003-a6c4-4082-b3fd-0eb9f4c1e291-service-ca\") pod \"console-85745c4dfd-ts96w\" (UID: \"d8e23003-a6c4-4082-b3fd-0eb9f4c1e291\") " pod="openshift-console/console-85745c4dfd-ts96w"
Apr 24 22:32:42.261857 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:42.261836 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d8e23003-a6c4-4082-b3fd-0eb9f4c1e291-oauth-serving-cert\") pod \"console-85745c4dfd-ts96w\" (UID: \"d8e23003-a6c4-4082-b3fd-0eb9f4c1e291\") " pod="openshift-console/console-85745c4dfd-ts96w"
Apr 24 22:32:42.261857 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:42.261848 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d8e23003-a6c4-4082-b3fd-0eb9f4c1e291-console-config\") pod \"console-85745c4dfd-ts96w\" (UID: \"d8e23003-a6c4-4082-b3fd-0eb9f4c1e291\") " pod="openshift-console/console-85745c4dfd-ts96w" Apr 24 22:32:42.263744 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:42.263722 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d8e23003-a6c4-4082-b3fd-0eb9f4c1e291-console-oauth-config\") pod \"console-85745c4dfd-ts96w\" (UID: \"d8e23003-a6c4-4082-b3fd-0eb9f4c1e291\") " pod="openshift-console/console-85745c4dfd-ts96w" Apr 24 22:32:42.264219 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:42.264197 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d8e23003-a6c4-4082-b3fd-0eb9f4c1e291-console-serving-cert\") pod \"console-85745c4dfd-ts96w\" (UID: \"d8e23003-a6c4-4082-b3fd-0eb9f4c1e291\") " pod="openshift-console/console-85745c4dfd-ts96w" Apr 24 22:32:42.269157 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:42.269134 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpjhv\" (UniqueName: \"kubernetes.io/projected/d8e23003-a6c4-4082-b3fd-0eb9f4c1e291-kube-api-access-xpjhv\") pod \"console-85745c4dfd-ts96w\" (UID: \"d8e23003-a6c4-4082-b3fd-0eb9f4c1e291\") " pod="openshift-console/console-85745c4dfd-ts96w" Apr 24 22:32:42.460139 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:42.460055 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-85745c4dfd-ts96w" Apr 24 22:32:42.507890 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:42.507833 2574 patch_prober.go:28] interesting pod/image-registry-846bc59fcf-t65jg container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 22:32:42.508062 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:42.507913 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-846bc59fcf-t65jg" podUID="9bb13040-efc6-4698-8a0a-b1270f5d0998" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:32:42.589927 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:42.589861 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85745c4dfd-ts96w"] Apr 24 22:32:42.753477 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:32:42.753401 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8e23003_a6c4_4082_b3fd_0eb9f4c1e291.slice/crio-a460fff69100b65a67bac2a7e84e451f800e72488c945d7fc08af09a4956cc31 WatchSource:0}: Error finding container a460fff69100b65a67bac2a7e84e451f800e72488c945d7fc08af09a4956cc31: Status 404 returned error can't find the container with id a460fff69100b65a67bac2a7e84e451f800e72488c945d7fc08af09a4956cc31 Apr 24 22:32:43.027733 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:43.027695 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85745c4dfd-ts96w" event={"ID":"d8e23003-a6c4-4082-b3fd-0eb9f4c1e291","Type":"ContainerStarted","Data":"a460fff69100b65a67bac2a7e84e451f800e72488c945d7fc08af09a4956cc31"} Apr 24 22:32:43.029626 ip-10-0-129-176 kubenswrapper[2574]: I0424 
22:32:43.029596 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-65c9867bbc-b7tk9" event={"ID":"e0147744-e2f5-44c3-87b7-4c69075afa43","Type":"ContainerStarted","Data":"33d3d4ea0c141b3a876a0fe50eb9d880a40ac3c0ff00b39132c46a8e75503ac9"} Apr 24 22:32:43.029743 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:43.029633 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-65c9867bbc-b7tk9" event={"ID":"e0147744-e2f5-44c3-87b7-4c69075afa43","Type":"ContainerStarted","Data":"51fffacbf05671fb5ddf2ba580aa0e05f22549b1d0ba7be67c79444a333855ae"} Apr 24 22:32:43.053404 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:43.053366 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-65c9867bbc-b7tk9" podStartSLOduration=1.251761033 podStartE2EDuration="4.053355715s" podCreationTimestamp="2026-04-24 22:32:39 +0000 UTC" firstStartedPulling="2026-04-24 22:32:39.995750272 +0000 UTC m=+146.045606134" lastFinishedPulling="2026-04-24 22:32:42.797344962 +0000 UTC m=+148.847200816" observedRunningTime="2026-04-24 22:32:43.051334795 +0000 UTC m=+149.101190673" watchObservedRunningTime="2026-04-24 22:32:43.053355715 +0000 UTC m=+149.103211591" Apr 24 22:32:44.943646 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:44.943597 2574 patch_prober.go:28] interesting pod/image-registry-846bc59fcf-t65jg container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 22:32:44.944150 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:44.943660 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-846bc59fcf-t65jg" podUID="9bb13040-efc6-4698-8a0a-b1270f5d0998" containerName="registry" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:32:46.039719 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:46.039685 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85745c4dfd-ts96w" event={"ID":"d8e23003-a6c4-4082-b3fd-0eb9f4c1e291","Type":"ContainerStarted","Data":"2a2039b52259a78a7cb66cd034ec51cafa0571315a7345a43edfdc569b65bc99"} Apr 24 22:32:46.058568 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:46.058513 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-85745c4dfd-ts96w" podStartSLOduration=1.43707496 podStartE2EDuration="4.058497178s" podCreationTimestamp="2026-04-24 22:32:42 +0000 UTC" firstStartedPulling="2026-04-24 22:32:42.755422519 +0000 UTC m=+148.805278374" lastFinishedPulling="2026-04-24 22:32:45.376844724 +0000 UTC m=+151.426700592" observedRunningTime="2026-04-24 22:32:46.056896874 +0000 UTC m=+152.106752753" watchObservedRunningTime="2026-04-24 22:32:46.058497178 +0000 UTC m=+152.108353054" Apr 24 22:32:50.355387 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:32:50.355346 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-mfpb9" podUID="b3ad74e7-fb86-475b-88a4-5b2f7848cd68" Apr 24 22:32:50.371508 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:32:50.371481 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-q7rg6" podUID="6f622f7a-7d90-4dd1-af3f-3f27ebad181a" Apr 24 22:32:51.052678 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:51.052644 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-q7rg6" Apr 24 22:32:51.052817 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:51.052692 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mfpb9" Apr 24 22:32:52.460932 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:52.460902 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-85745c4dfd-ts96w" Apr 24 22:32:52.460932 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:52.460937 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-85745c4dfd-ts96w" Apr 24 22:32:52.465232 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:52.465209 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-85745c4dfd-ts96w" Apr 24 22:32:52.506943 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:52.506913 2574 patch_prober.go:28] interesting pod/image-registry-846bc59fcf-t65jg container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 22:32:52.507053 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:52.506961 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-846bc59fcf-t65jg" podUID="9bb13040-efc6-4698-8a0a-b1270f5d0998" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:32:53.062459 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:53.062430 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-85745c4dfd-ts96w" Apr 24 22:32:53.428531 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:53.428444 2574 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-86d8f6f677-gt98f"] Apr 24 22:32:53.433294 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:53.433269 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86d8f6f677-gt98f" Apr 24 22:32:53.441414 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:53.441393 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 24 22:32:53.442721 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:53.442700 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86d8f6f677-gt98f"] Apr 24 22:32:53.541968 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:53.541939 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/580a1a04-5f29-43ba-910a-0c6d221ead8e-console-config\") pod \"console-86d8f6f677-gt98f\" (UID: \"580a1a04-5f29-43ba-910a-0c6d221ead8e\") " pod="openshift-console/console-86d8f6f677-gt98f" Apr 24 22:32:53.541968 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:53.541970 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/580a1a04-5f29-43ba-910a-0c6d221ead8e-oauth-serving-cert\") pod \"console-86d8f6f677-gt98f\" (UID: \"580a1a04-5f29-43ba-910a-0c6d221ead8e\") " pod="openshift-console/console-86d8f6f677-gt98f" Apr 24 22:32:53.542360 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:53.541989 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/580a1a04-5f29-43ba-910a-0c6d221ead8e-console-oauth-config\") pod \"console-86d8f6f677-gt98f\" (UID: \"580a1a04-5f29-43ba-910a-0c6d221ead8e\") " pod="openshift-console/console-86d8f6f677-gt98f" Apr 24 22:32:53.542360 ip-10-0-129-176 
kubenswrapper[2574]: I0424 22:32:53.542034 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/580a1a04-5f29-43ba-910a-0c6d221ead8e-trusted-ca-bundle\") pod \"console-86d8f6f677-gt98f\" (UID: \"580a1a04-5f29-43ba-910a-0c6d221ead8e\") " pod="openshift-console/console-86d8f6f677-gt98f" Apr 24 22:32:53.542360 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:53.542062 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qb9n\" (UniqueName: \"kubernetes.io/projected/580a1a04-5f29-43ba-910a-0c6d221ead8e-kube-api-access-9qb9n\") pod \"console-86d8f6f677-gt98f\" (UID: \"580a1a04-5f29-43ba-910a-0c6d221ead8e\") " pod="openshift-console/console-86d8f6f677-gt98f" Apr 24 22:32:53.542360 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:53.542102 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/580a1a04-5f29-43ba-910a-0c6d221ead8e-console-serving-cert\") pod \"console-86d8f6f677-gt98f\" (UID: \"580a1a04-5f29-43ba-910a-0c6d221ead8e\") " pod="openshift-console/console-86d8f6f677-gt98f" Apr 24 22:32:53.542360 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:53.542125 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/580a1a04-5f29-43ba-910a-0c6d221ead8e-service-ca\") pod \"console-86d8f6f677-gt98f\" (UID: \"580a1a04-5f29-43ba-910a-0c6d221ead8e\") " pod="openshift-console/console-86d8f6f677-gt98f" Apr 24 22:32:53.643462 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:53.643436 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/580a1a04-5f29-43ba-910a-0c6d221ead8e-console-serving-cert\") pod 
\"console-86d8f6f677-gt98f\" (UID: \"580a1a04-5f29-43ba-910a-0c6d221ead8e\") " pod="openshift-console/console-86d8f6f677-gt98f" Apr 24 22:32:53.643578 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:53.643475 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/580a1a04-5f29-43ba-910a-0c6d221ead8e-service-ca\") pod \"console-86d8f6f677-gt98f\" (UID: \"580a1a04-5f29-43ba-910a-0c6d221ead8e\") " pod="openshift-console/console-86d8f6f677-gt98f" Apr 24 22:32:53.643578 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:53.643504 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/580a1a04-5f29-43ba-910a-0c6d221ead8e-console-config\") pod \"console-86d8f6f677-gt98f\" (UID: \"580a1a04-5f29-43ba-910a-0c6d221ead8e\") " pod="openshift-console/console-86d8f6f677-gt98f" Apr 24 22:32:53.643578 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:53.643521 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/580a1a04-5f29-43ba-910a-0c6d221ead8e-oauth-serving-cert\") pod \"console-86d8f6f677-gt98f\" (UID: \"580a1a04-5f29-43ba-910a-0c6d221ead8e\") " pod="openshift-console/console-86d8f6f677-gt98f" Apr 24 22:32:53.643578 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:53.643543 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/580a1a04-5f29-43ba-910a-0c6d221ead8e-console-oauth-config\") pod \"console-86d8f6f677-gt98f\" (UID: \"580a1a04-5f29-43ba-910a-0c6d221ead8e\") " pod="openshift-console/console-86d8f6f677-gt98f" Apr 24 22:32:53.643767 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:53.643586 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/580a1a04-5f29-43ba-910a-0c6d221ead8e-trusted-ca-bundle\") pod \"console-86d8f6f677-gt98f\" (UID: \"580a1a04-5f29-43ba-910a-0c6d221ead8e\") " pod="openshift-console/console-86d8f6f677-gt98f" Apr 24 22:32:53.643767 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:53.643617 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qb9n\" (UniqueName: \"kubernetes.io/projected/580a1a04-5f29-43ba-910a-0c6d221ead8e-kube-api-access-9qb9n\") pod \"console-86d8f6f677-gt98f\" (UID: \"580a1a04-5f29-43ba-910a-0c6d221ead8e\") " pod="openshift-console/console-86d8f6f677-gt98f" Apr 24 22:32:53.644297 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:53.644273 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/580a1a04-5f29-43ba-910a-0c6d221ead8e-service-ca\") pod \"console-86d8f6f677-gt98f\" (UID: \"580a1a04-5f29-43ba-910a-0c6d221ead8e\") " pod="openshift-console/console-86d8f6f677-gt98f" Apr 24 22:32:53.644400 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:53.644280 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/580a1a04-5f29-43ba-910a-0c6d221ead8e-console-config\") pod \"console-86d8f6f677-gt98f\" (UID: \"580a1a04-5f29-43ba-910a-0c6d221ead8e\") " pod="openshift-console/console-86d8f6f677-gt98f" Apr 24 22:32:53.644443 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:53.644403 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/580a1a04-5f29-43ba-910a-0c6d221ead8e-oauth-serving-cert\") pod \"console-86d8f6f677-gt98f\" (UID: \"580a1a04-5f29-43ba-910a-0c6d221ead8e\") " pod="openshift-console/console-86d8f6f677-gt98f" Apr 24 22:32:53.644443 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:53.644431 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/580a1a04-5f29-43ba-910a-0c6d221ead8e-trusted-ca-bundle\") pod \"console-86d8f6f677-gt98f\" (UID: \"580a1a04-5f29-43ba-910a-0c6d221ead8e\") " pod="openshift-console/console-86d8f6f677-gt98f" Apr 24 22:32:53.645950 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:53.645923 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/580a1a04-5f29-43ba-910a-0c6d221ead8e-console-serving-cert\") pod \"console-86d8f6f677-gt98f\" (UID: \"580a1a04-5f29-43ba-910a-0c6d221ead8e\") " pod="openshift-console/console-86d8f6f677-gt98f" Apr 24 22:32:53.646042 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:53.645979 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/580a1a04-5f29-43ba-910a-0c6d221ead8e-console-oauth-config\") pod \"console-86d8f6f677-gt98f\" (UID: \"580a1a04-5f29-43ba-910a-0c6d221ead8e\") " pod="openshift-console/console-86d8f6f677-gt98f" Apr 24 22:32:53.652466 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:53.652445 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qb9n\" (UniqueName: \"kubernetes.io/projected/580a1a04-5f29-43ba-910a-0c6d221ead8e-kube-api-access-9qb9n\") pod \"console-86d8f6f677-gt98f\" (UID: \"580a1a04-5f29-43ba-910a-0c6d221ead8e\") " pod="openshift-console/console-86d8f6f677-gt98f" Apr 24 22:32:53.744544 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:53.744487 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-86d8f6f677-gt98f" Apr 24 22:32:53.862944 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:53.862851 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86d8f6f677-gt98f"] Apr 24 22:32:53.864973 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:32:53.864946 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod580a1a04_5f29_43ba_910a_0c6d221ead8e.slice/crio-48c86b87cb525737c6716920c7f3804a2845be04eda0bb98f4a21d95cf758de9 WatchSource:0}: Error finding container 48c86b87cb525737c6716920c7f3804a2845be04eda0bb98f4a21d95cf758de9: Status 404 returned error can't find the container with id 48c86b87cb525737c6716920c7f3804a2845be04eda0bb98f4a21d95cf758de9 Apr 24 22:32:54.063453 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:54.063419 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86d8f6f677-gt98f" event={"ID":"580a1a04-5f29-43ba-910a-0c6d221ead8e","Type":"ContainerStarted","Data":"09a4ac6a728e2769c295057fbb25dbc7c387667e5da322cef08a84f4b7fe5d66"} Apr 24 22:32:54.063600 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:54.063461 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86d8f6f677-gt98f" event={"ID":"580a1a04-5f29-43ba-910a-0c6d221ead8e","Type":"ContainerStarted","Data":"48c86b87cb525737c6716920c7f3804a2845be04eda0bb98f4a21d95cf758de9"} Apr 24 22:32:54.081307 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:54.081258 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-86d8f6f677-gt98f" podStartSLOduration=1.081242209 podStartE2EDuration="1.081242209s" podCreationTimestamp="2026-04-24 22:32:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:32:54.07963409 +0000 UTC 
m=+160.129489981" watchObservedRunningTime="2026-04-24 22:32:54.081242209 +0000 UTC m=+160.131098085" Apr 24 22:32:54.942573 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:54.942549 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-846bc59fcf-t65jg" Apr 24 22:32:55.258612 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:55.258540 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3ad74e7-fb86-475b-88a4-5b2f7848cd68-metrics-tls\") pod \"dns-default-mfpb9\" (UID: \"b3ad74e7-fb86-475b-88a4-5b2f7848cd68\") " pod="openshift-dns/dns-default-mfpb9" Apr 24 22:32:55.258612 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:55.258601 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f622f7a-7d90-4dd1-af3f-3f27ebad181a-cert\") pod \"ingress-canary-q7rg6\" (UID: \"6f622f7a-7d90-4dd1-af3f-3f27ebad181a\") " pod="openshift-ingress-canary/ingress-canary-q7rg6" Apr 24 22:32:55.260902 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:55.260848 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3ad74e7-fb86-475b-88a4-5b2f7848cd68-metrics-tls\") pod \"dns-default-mfpb9\" (UID: \"b3ad74e7-fb86-475b-88a4-5b2f7848cd68\") " pod="openshift-dns/dns-default-mfpb9" Apr 24 22:32:55.261007 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:55.260915 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f622f7a-7d90-4dd1-af3f-3f27ebad181a-cert\") pod \"ingress-canary-q7rg6\" (UID: \"6f622f7a-7d90-4dd1-af3f-3f27ebad181a\") " pod="openshift-ingress-canary/ingress-canary-q7rg6" Apr 24 22:32:55.557650 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:55.557624 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tdthp\"" Apr 24 22:32:55.557807 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:55.557650 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-svq4d\"" Apr 24 22:32:55.564951 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:55.564934 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mfpb9" Apr 24 22:32:55.565067 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:55.565001 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-q7rg6" Apr 24 22:32:55.692055 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:55.691806 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mfpb9"] Apr 24 22:32:55.694552 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:32:55.694513 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3ad74e7_fb86_475b_88a4_5b2f7848cd68.slice/crio-14fc01d16a7eb0a3f838d15c14bdf6e05162192c5b1d7aa853835e0c33c17497 WatchSource:0}: Error finding container 14fc01d16a7eb0a3f838d15c14bdf6e05162192c5b1d7aa853835e0c33c17497: Status 404 returned error can't find the container with id 14fc01d16a7eb0a3f838d15c14bdf6e05162192c5b1d7aa853835e0c33c17497 Apr 24 22:32:55.707636 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:55.707616 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-q7rg6"] Apr 24 22:32:55.710121 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:32:55.710096 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f622f7a_7d90_4dd1_af3f_3f27ebad181a.slice/crio-d1816c0b51f952a7f02bd37e6e4d652f88508957493849364d5584157a882ed6 WatchSource:0}: Error finding container 
d1816c0b51f952a7f02bd37e6e4d652f88508957493849364d5584157a882ed6: Status 404 returned error can't find the container with id d1816c0b51f952a7f02bd37e6e4d652f88508957493849364d5584157a882ed6
Apr 24 22:32:56.070105 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:56.070076 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-q7rg6" event={"ID":"6f622f7a-7d90-4dd1-af3f-3f27ebad181a","Type":"ContainerStarted","Data":"d1816c0b51f952a7f02bd37e6e4d652f88508957493849364d5584157a882ed6"}
Apr 24 22:32:56.071372 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:56.071340 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mfpb9" event={"ID":"b3ad74e7-fb86-475b-88a4-5b2f7848cd68","Type":"ContainerStarted","Data":"14fc01d16a7eb0a3f838d15c14bdf6e05162192c5b1d7aa853835e0c33c17497"}
Apr 24 22:32:58.077936 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:58.077870 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-q7rg6" event={"ID":"6f622f7a-7d90-4dd1-af3f-3f27ebad181a","Type":"ContainerStarted","Data":"233dba55644737e311a9c44f83beb7c5442a903e85ee34d322488803d14ea837"}
Apr 24 22:32:58.079564 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:58.079539 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mfpb9" event={"ID":"b3ad74e7-fb86-475b-88a4-5b2f7848cd68","Type":"ContainerStarted","Data":"ae6dd292ac3fc391f18b1826d68dd3304828ef391e7d3af279338fbd8643e18b"}
Apr 24 22:32:58.079564 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:58.079565 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mfpb9" event={"ID":"b3ad74e7-fb86-475b-88a4-5b2f7848cd68","Type":"ContainerStarted","Data":"e80fe639cab09a6dcc0ea01dd3b0ba31215332b3055c6e5cf683fe24bfe7e0c5"}
Apr 24 22:32:58.079758 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:58.079691 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-mfpb9"
Apr 24 22:32:58.094225 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:58.094184 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-q7rg6" podStartSLOduration=129.208003946 podStartE2EDuration="2m11.094173294s" podCreationTimestamp="2026-04-24 22:30:47 +0000 UTC" firstStartedPulling="2026-04-24 22:32:55.714392712 +0000 UTC m=+161.764248566" lastFinishedPulling="2026-04-24 22:32:57.600562055 +0000 UTC m=+163.650417914" observedRunningTime="2026-04-24 22:32:58.094005228 +0000 UTC m=+164.143861104" watchObservedRunningTime="2026-04-24 22:32:58.094173294 +0000 UTC m=+164.144029162"
Apr 24 22:32:58.111022 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:32:58.110931 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-mfpb9" podStartSLOduration=129.211543549 podStartE2EDuration="2m11.110918476s" podCreationTimestamp="2026-04-24 22:30:47 +0000 UTC" firstStartedPulling="2026-04-24 22:32:55.697361884 +0000 UTC m=+161.747217743" lastFinishedPulling="2026-04-24 22:32:57.596736802 +0000 UTC m=+163.646592670" observedRunningTime="2026-04-24 22:32:58.109676993 +0000 UTC m=+164.159532872" watchObservedRunningTime="2026-04-24 22:32:58.110918476 +0000 UTC m=+164.160774566"
Apr 24 22:33:01.903304 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:01.903269 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-mfpb9_b3ad74e7-fb86-475b-88a4-5b2f7848cd68/dns/0.log"
Apr 24 22:33:02.104458 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:02.104431 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-mfpb9_b3ad74e7-fb86-475b-88a4-5b2f7848cd68/kube-rbac-proxy/0.log"
Apr 24 22:33:02.707829 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:02.707801 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-t25xz_31be179c-c441-48a4-8779-593458646c77/dns-node-resolver/0.log"
Apr 24 22:33:02.904145 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:02.904116 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-6898f6cd9b-c68zq_b3432e60-bf25-4ee8-876a-a1ee3c4b5846/router/0.log"
Apr 24 22:33:03.303932 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:03.303845 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-q7rg6_6f622f7a-7d90-4dd1-af3f-3f27ebad181a/serve-healthcheck-canary/0.log"
Apr 24 22:33:03.705529 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:03.705436 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-bzh27_5469c5ef-21d3-4db2-8b90-fe6fca022351/cluster-samples-operator/0.log"
Apr 24 22:33:03.745597 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:03.745562 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-86d8f6f677-gt98f"
Apr 24 22:33:03.745597 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:03.745603 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-86d8f6f677-gt98f"
Apr 24 22:33:03.750103 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:03.750083 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-86d8f6f677-gt98f"
Apr 24 22:33:03.903663 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:03.903630 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-bzh27_5469c5ef-21d3-4db2-8b90-fe6fca022351/cluster-samples-operator-watch/0.log"
Apr 24 22:33:04.101170 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:04.101138 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-86d8f6f677-gt98f"
Apr 24 22:33:04.149865 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:04.149833 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-85745c4dfd-ts96w"]
Apr 24 22:33:08.085026 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:08.084995 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-mfpb9"
Apr 24 22:33:10.119253 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:10.119217 2574 generic.go:358] "Generic (PLEG): container finished" podID="9f025f22-b0e0-48ff-8928-ed22d22ab622" containerID="aea56cdb8d55ea5e2c24edb99ac1e576201133af7053e1703730266352d065ba" exitCode=0
Apr 24 22:33:10.119649 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:10.119293 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v29tm" event={"ID":"9f025f22-b0e0-48ff-8928-ed22d22ab622","Type":"ContainerDied","Data":"aea56cdb8d55ea5e2c24edb99ac1e576201133af7053e1703730266352d065ba"}
Apr 24 22:33:10.119649 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:10.119597 2574 scope.go:117] "RemoveContainer" containerID="aea56cdb8d55ea5e2c24edb99ac1e576201133af7053e1703730266352d065ba"
Apr 24 22:33:11.123646 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:11.123611 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v29tm" event={"ID":"9f025f22-b0e0-48ff-8928-ed22d22ab622","Type":"ContainerStarted","Data":"5d9dd8cc50686060f57dda387fe7f1a29e9174b223e41e341d7287f80424afe0"}
Apr 24 22:33:29.170709 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:29.170652 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-85745c4dfd-ts96w" podUID="d8e23003-a6c4-4082-b3fd-0eb9f4c1e291" containerName="console" containerID="cri-o://2a2039b52259a78a7cb66cd034ec51cafa0571315a7345a43edfdc569b65bc99" gracePeriod=15
Apr 24 22:33:29.408329 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:29.408307 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-85745c4dfd-ts96w_d8e23003-a6c4-4082-b3fd-0eb9f4c1e291/console/0.log"
Apr 24 22:33:29.408438 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:29.408379 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85745c4dfd-ts96w"
Apr 24 22:33:29.538329 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:29.538303 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d8e23003-a6c4-4082-b3fd-0eb9f4c1e291-console-oauth-config\") pod \"d8e23003-a6c4-4082-b3fd-0eb9f4c1e291\" (UID: \"d8e23003-a6c4-4082-b3fd-0eb9f4c1e291\") "
Apr 24 22:33:29.538520 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:29.538359 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d8e23003-a6c4-4082-b3fd-0eb9f4c1e291-oauth-serving-cert\") pod \"d8e23003-a6c4-4082-b3fd-0eb9f4c1e291\" (UID: \"d8e23003-a6c4-4082-b3fd-0eb9f4c1e291\") "
Apr 24 22:33:29.538520 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:29.538393 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d8e23003-a6c4-4082-b3fd-0eb9f4c1e291-service-ca\") pod \"d8e23003-a6c4-4082-b3fd-0eb9f4c1e291\" (UID: \"d8e23003-a6c4-4082-b3fd-0eb9f4c1e291\") "
Apr 24 22:33:29.538520 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:29.538423 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d8e23003-a6c4-4082-b3fd-0eb9f4c1e291-console-config\") pod \"d8e23003-a6c4-4082-b3fd-0eb9f4c1e291\" (UID: \"d8e23003-a6c4-4082-b3fd-0eb9f4c1e291\") "
Apr 24 22:33:29.538520 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:29.538460 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d8e23003-a6c4-4082-b3fd-0eb9f4c1e291-console-serving-cert\") pod \"d8e23003-a6c4-4082-b3fd-0eb9f4c1e291\" (UID: \"d8e23003-a6c4-4082-b3fd-0eb9f4c1e291\") "
Apr 24 22:33:29.538520 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:29.538496 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpjhv\" (UniqueName: \"kubernetes.io/projected/d8e23003-a6c4-4082-b3fd-0eb9f4c1e291-kube-api-access-xpjhv\") pod \"d8e23003-a6c4-4082-b3fd-0eb9f4c1e291\" (UID: \"d8e23003-a6c4-4082-b3fd-0eb9f4c1e291\") "
Apr 24 22:33:29.538841 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:29.538817 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8e23003-a6c4-4082-b3fd-0eb9f4c1e291-service-ca" (OuterVolumeSpecName: "service-ca") pod "d8e23003-a6c4-4082-b3fd-0eb9f4c1e291" (UID: "d8e23003-a6c4-4082-b3fd-0eb9f4c1e291"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:33:29.538917 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:29.538817 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8e23003-a6c4-4082-b3fd-0eb9f4c1e291-console-config" (OuterVolumeSpecName: "console-config") pod "d8e23003-a6c4-4082-b3fd-0eb9f4c1e291" (UID: "d8e23003-a6c4-4082-b3fd-0eb9f4c1e291"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:33:29.538917 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:29.538824 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8e23003-a6c4-4082-b3fd-0eb9f4c1e291-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d8e23003-a6c4-4082-b3fd-0eb9f4c1e291" (UID: "d8e23003-a6c4-4082-b3fd-0eb9f4c1e291"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:33:29.540601 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:29.540567 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8e23003-a6c4-4082-b3fd-0eb9f4c1e291-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d8e23003-a6c4-4082-b3fd-0eb9f4c1e291" (UID: "d8e23003-a6c4-4082-b3fd-0eb9f4c1e291"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:33:29.540709 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:29.540633 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8e23003-a6c4-4082-b3fd-0eb9f4c1e291-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d8e23003-a6c4-4082-b3fd-0eb9f4c1e291" (UID: "d8e23003-a6c4-4082-b3fd-0eb9f4c1e291"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:33:29.540757 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:29.540713 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8e23003-a6c4-4082-b3fd-0eb9f4c1e291-kube-api-access-xpjhv" (OuterVolumeSpecName: "kube-api-access-xpjhv") pod "d8e23003-a6c4-4082-b3fd-0eb9f4c1e291" (UID: "d8e23003-a6c4-4082-b3fd-0eb9f4c1e291"). InnerVolumeSpecName "kube-api-access-xpjhv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:33:29.639094 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:29.639057 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d8e23003-a6c4-4082-b3fd-0eb9f4c1e291-console-serving-cert\") on node \"ip-10-0-129-176.ec2.internal\" DevicePath \"\""
Apr 24 22:33:29.639094 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:29.639086 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xpjhv\" (UniqueName: \"kubernetes.io/projected/d8e23003-a6c4-4082-b3fd-0eb9f4c1e291-kube-api-access-xpjhv\") on node \"ip-10-0-129-176.ec2.internal\" DevicePath \"\""
Apr 24 22:33:29.639094 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:29.639097 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d8e23003-a6c4-4082-b3fd-0eb9f4c1e291-console-oauth-config\") on node \"ip-10-0-129-176.ec2.internal\" DevicePath \"\""
Apr 24 22:33:29.639315 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:29.639107 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d8e23003-a6c4-4082-b3fd-0eb9f4c1e291-oauth-serving-cert\") on node \"ip-10-0-129-176.ec2.internal\" DevicePath \"\""
Apr 24 22:33:29.639315 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:29.639117 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d8e23003-a6c4-4082-b3fd-0eb9f4c1e291-service-ca\") on node \"ip-10-0-129-176.ec2.internal\" DevicePath \"\""
Apr 24 22:33:29.639315 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:29.639126 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d8e23003-a6c4-4082-b3fd-0eb9f4c1e291-console-config\") on node \"ip-10-0-129-176.ec2.internal\" DevicePath \"\""
Apr 24 22:33:30.180373 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:30.180344 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-85745c4dfd-ts96w_d8e23003-a6c4-4082-b3fd-0eb9f4c1e291/console/0.log"
Apr 24 22:33:30.180783 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:30.180386 2574 generic.go:358] "Generic (PLEG): container finished" podID="d8e23003-a6c4-4082-b3fd-0eb9f4c1e291" containerID="2a2039b52259a78a7cb66cd034ec51cafa0571315a7345a43edfdc569b65bc99" exitCode=2
Apr 24 22:33:30.180783 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:30.180424 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85745c4dfd-ts96w" event={"ID":"d8e23003-a6c4-4082-b3fd-0eb9f4c1e291","Type":"ContainerDied","Data":"2a2039b52259a78a7cb66cd034ec51cafa0571315a7345a43edfdc569b65bc99"}
Apr 24 22:33:30.180783 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:30.180445 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85745c4dfd-ts96w"
Apr 24 22:33:30.180783 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:30.180452 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85745c4dfd-ts96w" event={"ID":"d8e23003-a6c4-4082-b3fd-0eb9f4c1e291","Type":"ContainerDied","Data":"a460fff69100b65a67bac2a7e84e451f800e72488c945d7fc08af09a4956cc31"}
Apr 24 22:33:30.180783 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:30.180473 2574 scope.go:117] "RemoveContainer" containerID="2a2039b52259a78a7cb66cd034ec51cafa0571315a7345a43edfdc569b65bc99"
Apr 24 22:33:30.188743 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:30.188724 2574 scope.go:117] "RemoveContainer" containerID="2a2039b52259a78a7cb66cd034ec51cafa0571315a7345a43edfdc569b65bc99"
Apr 24 22:33:30.189024 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:33:30.189001 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a2039b52259a78a7cb66cd034ec51cafa0571315a7345a43edfdc569b65bc99\": container with ID starting with 2a2039b52259a78a7cb66cd034ec51cafa0571315a7345a43edfdc569b65bc99 not found: ID does not exist" containerID="2a2039b52259a78a7cb66cd034ec51cafa0571315a7345a43edfdc569b65bc99"
Apr 24 22:33:30.189093 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:30.189036 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a2039b52259a78a7cb66cd034ec51cafa0571315a7345a43edfdc569b65bc99"} err="failed to get container status \"2a2039b52259a78a7cb66cd034ec51cafa0571315a7345a43edfdc569b65bc99\": rpc error: code = NotFound desc = could not find container \"2a2039b52259a78a7cb66cd034ec51cafa0571315a7345a43edfdc569b65bc99\": container with ID starting with 2a2039b52259a78a7cb66cd034ec51cafa0571315a7345a43edfdc569b65bc99 not found: ID does not exist"
Apr 24 22:33:30.201675 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:30.201649 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-85745c4dfd-ts96w"]
Apr 24 22:33:30.204959 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:30.204940 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-85745c4dfd-ts96w"]
Apr 24 22:33:30.535277 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:30.535245 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8e23003-a6c4-4082-b3fd-0eb9f4c1e291" path="/var/lib/kubelet/pods/d8e23003-a6c4-4082-b3fd-0eb9f4c1e291/volumes"
Apr 24 22:33:57.720161 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:57.720124 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5685b685df-f9lgc"]
Apr 24 22:33:57.720618 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:57.720457 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d8e23003-a6c4-4082-b3fd-0eb9f4c1e291" containerName="console"
Apr 24 22:33:57.720618 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:57.720497 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8e23003-a6c4-4082-b3fd-0eb9f4c1e291" containerName="console"
Apr 24 22:33:57.720618 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:57.720596 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="d8e23003-a6c4-4082-b3fd-0eb9f4c1e291" containerName="console"
Apr 24 22:33:57.725075 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:57.725052 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5685b685df-f9lgc"
Apr 24 22:33:57.734568 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:57.734544 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5685b685df-f9lgc"]
Apr 24 22:33:57.754401 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:57.754379 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f9af5106-aa0e-4042-8609-9a787ce03588-service-ca\") pod \"console-5685b685df-f9lgc\" (UID: \"f9af5106-aa0e-4042-8609-9a787ce03588\") " pod="openshift-console/console-5685b685df-f9lgc"
Apr 24 22:33:57.754537 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:57.754413 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f9af5106-aa0e-4042-8609-9a787ce03588-oauth-serving-cert\") pod \"console-5685b685df-f9lgc\" (UID: \"f9af5106-aa0e-4042-8609-9a787ce03588\") " pod="openshift-console/console-5685b685df-f9lgc"
Apr 24 22:33:57.754537 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:57.754442 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f9af5106-aa0e-4042-8609-9a787ce03588-console-oauth-config\") pod \"console-5685b685df-f9lgc\" (UID: \"f9af5106-aa0e-4042-8609-9a787ce03588\") " pod="openshift-console/console-5685b685df-f9lgc"
Apr 24 22:33:57.754537 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:57.754518 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f9af5106-aa0e-4042-8609-9a787ce03588-console-config\") pod \"console-5685b685df-f9lgc\" (UID: \"f9af5106-aa0e-4042-8609-9a787ce03588\") " pod="openshift-console/console-5685b685df-f9lgc"
Apr 24 22:33:57.754636 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:57.754554 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9af5106-aa0e-4042-8609-9a787ce03588-trusted-ca-bundle\") pod \"console-5685b685df-f9lgc\" (UID: \"f9af5106-aa0e-4042-8609-9a787ce03588\") " pod="openshift-console/console-5685b685df-f9lgc"
Apr 24 22:33:57.754636 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:57.754578 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f9af5106-aa0e-4042-8609-9a787ce03588-console-serving-cert\") pod \"console-5685b685df-f9lgc\" (UID: \"f9af5106-aa0e-4042-8609-9a787ce03588\") " pod="openshift-console/console-5685b685df-f9lgc"
Apr 24 22:33:57.754636 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:57.754594 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2959\" (UniqueName: \"kubernetes.io/projected/f9af5106-aa0e-4042-8609-9a787ce03588-kube-api-access-z2959\") pod \"console-5685b685df-f9lgc\" (UID: \"f9af5106-aa0e-4042-8609-9a787ce03588\") " pod="openshift-console/console-5685b685df-f9lgc"
Apr 24 22:33:57.856048 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:57.856009 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f9af5106-aa0e-4042-8609-9a787ce03588-service-ca\") pod \"console-5685b685df-f9lgc\" (UID: \"f9af5106-aa0e-4042-8609-9a787ce03588\") " pod="openshift-console/console-5685b685df-f9lgc"
Apr 24 22:33:57.856048 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:57.856053 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f9af5106-aa0e-4042-8609-9a787ce03588-oauth-serving-cert\") pod \"console-5685b685df-f9lgc\" (UID: \"f9af5106-aa0e-4042-8609-9a787ce03588\") " pod="openshift-console/console-5685b685df-f9lgc"
Apr 24 22:33:57.856241 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:57.856178 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f9af5106-aa0e-4042-8609-9a787ce03588-console-oauth-config\") pod \"console-5685b685df-f9lgc\" (UID: \"f9af5106-aa0e-4042-8609-9a787ce03588\") " pod="openshift-console/console-5685b685df-f9lgc"
Apr 24 22:33:57.856411 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:57.856387 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f9af5106-aa0e-4042-8609-9a787ce03588-console-config\") pod \"console-5685b685df-f9lgc\" (UID: \"f9af5106-aa0e-4042-8609-9a787ce03588\") " pod="openshift-console/console-5685b685df-f9lgc"
Apr 24 22:33:57.856466 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:57.856431 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9af5106-aa0e-4042-8609-9a787ce03588-trusted-ca-bundle\") pod \"console-5685b685df-f9lgc\" (UID: \"f9af5106-aa0e-4042-8609-9a787ce03588\") " pod="openshift-console/console-5685b685df-f9lgc"
Apr 24 22:33:57.856466 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:57.856454 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f9af5106-aa0e-4042-8609-9a787ce03588-console-serving-cert\") pod \"console-5685b685df-f9lgc\" (UID: \"f9af5106-aa0e-4042-8609-9a787ce03588\") " pod="openshift-console/console-5685b685df-f9lgc"
Apr 24 22:33:57.856559 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:57.856476 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2959\" (UniqueName: \"kubernetes.io/projected/f9af5106-aa0e-4042-8609-9a787ce03588-kube-api-access-z2959\") pod \"console-5685b685df-f9lgc\" (UID: \"f9af5106-aa0e-4042-8609-9a787ce03588\") " pod="openshift-console/console-5685b685df-f9lgc"
Apr 24 22:33:57.856834 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:57.856809 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f9af5106-aa0e-4042-8609-9a787ce03588-service-ca\") pod \"console-5685b685df-f9lgc\" (UID: \"f9af5106-aa0e-4042-8609-9a787ce03588\") " pod="openshift-console/console-5685b685df-f9lgc"
Apr 24 22:33:57.856997 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:57.856970 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f9af5106-aa0e-4042-8609-9a787ce03588-oauth-serving-cert\") pod \"console-5685b685df-f9lgc\" (UID: \"f9af5106-aa0e-4042-8609-9a787ce03588\") " pod="openshift-console/console-5685b685df-f9lgc"
Apr 24 22:33:57.857121 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:57.857072 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f9af5106-aa0e-4042-8609-9a787ce03588-console-config\") pod \"console-5685b685df-f9lgc\" (UID: \"f9af5106-aa0e-4042-8609-9a787ce03588\") " pod="openshift-console/console-5685b685df-f9lgc"
Apr 24 22:33:57.857249 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:57.857227 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9af5106-aa0e-4042-8609-9a787ce03588-trusted-ca-bundle\") pod \"console-5685b685df-f9lgc\" (UID: \"f9af5106-aa0e-4042-8609-9a787ce03588\") " pod="openshift-console/console-5685b685df-f9lgc"
Apr 24 22:33:57.858676 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:57.858657 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f9af5106-aa0e-4042-8609-9a787ce03588-console-oauth-config\") pod \"console-5685b685df-f9lgc\" (UID: \"f9af5106-aa0e-4042-8609-9a787ce03588\") " pod="openshift-console/console-5685b685df-f9lgc"
Apr 24 22:33:57.858865 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:57.858848 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f9af5106-aa0e-4042-8609-9a787ce03588-console-serving-cert\") pod \"console-5685b685df-f9lgc\" (UID: \"f9af5106-aa0e-4042-8609-9a787ce03588\") " pod="openshift-console/console-5685b685df-f9lgc"
Apr 24 22:33:57.864748 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:57.864725 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2959\" (UniqueName: \"kubernetes.io/projected/f9af5106-aa0e-4042-8609-9a787ce03588-kube-api-access-z2959\") pod \"console-5685b685df-f9lgc\" (UID: \"f9af5106-aa0e-4042-8609-9a787ce03588\") " pod="openshift-console/console-5685b685df-f9lgc"
Apr 24 22:33:58.036453 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:58.036422 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5685b685df-f9lgc"
Apr 24 22:33:58.158790 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:58.158754 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5685b685df-f9lgc"]
Apr 24 22:33:58.162407 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:33:58.162380 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9af5106_aa0e_4042_8609_9a787ce03588.slice/crio-4cdba999ff802d11d3b6d5d5febc1f6bd31b9a5dd4162cb3aee0f9b6b3e0c747 WatchSource:0}: Error finding container 4cdba999ff802d11d3b6d5d5febc1f6bd31b9a5dd4162cb3aee0f9b6b3e0c747: Status 404 returned error can't find the container with id 4cdba999ff802d11d3b6d5d5febc1f6bd31b9a5dd4162cb3aee0f9b6b3e0c747
Apr 24 22:33:58.259917 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:58.259854 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5685b685df-f9lgc" event={"ID":"f9af5106-aa0e-4042-8609-9a787ce03588","Type":"ContainerStarted","Data":"fa1f72788b57c7e5d200183f2c91d302d34562ed274b3e00bbae9e5c50c80739"}
Apr 24 22:33:58.259917 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:58.259911 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5685b685df-f9lgc" event={"ID":"f9af5106-aa0e-4042-8609-9a787ce03588","Type":"ContainerStarted","Data":"4cdba999ff802d11d3b6d5d5febc1f6bd31b9a5dd4162cb3aee0f9b6b3e0c747"}
Apr 24 22:33:58.278088 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:33:58.278030 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5685b685df-f9lgc" podStartSLOduration=1.278014716 podStartE2EDuration="1.278014716s" podCreationTimestamp="2026-04-24 22:33:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:33:58.276945753 +0000 UTC m=+224.326801631" watchObservedRunningTime="2026-04-24 22:33:58.278014716 +0000 UTC m=+224.327870592"
Apr 24 22:34:08.037536 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:08.037501 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5685b685df-f9lgc"
Apr 24 22:34:08.037999 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:08.037550 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5685b685df-f9lgc"
Apr 24 22:34:08.042297 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:08.042272 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5685b685df-f9lgc"
Apr 24 22:34:08.296608 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:08.296530 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5685b685df-f9lgc"
Apr 24 22:34:08.343085 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:08.343050 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-86d8f6f677-gt98f"]
Apr 24 22:34:29.724978 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:29.724940 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-7z4fb"]
Apr 24 22:34:29.729434 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:29.729416 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7z4fb"
Apr 24 22:34:29.732621 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:29.732598 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 24 22:34:29.744678 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:29.744656 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7z4fb"]
Apr 24 22:34:29.813371 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:29.813342 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/49978182-365f-4022-9727-6a7530dcbc1e-dbus\") pod \"global-pull-secret-syncer-7z4fb\" (UID: \"49978182-365f-4022-9727-6a7530dcbc1e\") " pod="kube-system/global-pull-secret-syncer-7z4fb"
Apr 24 22:34:29.813530 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:29.813391 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/49978182-365f-4022-9727-6a7530dcbc1e-original-pull-secret\") pod \"global-pull-secret-syncer-7z4fb\" (UID: \"49978182-365f-4022-9727-6a7530dcbc1e\") " pod="kube-system/global-pull-secret-syncer-7z4fb"
Apr 24 22:34:29.813530 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:29.813501 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/49978182-365f-4022-9727-6a7530dcbc1e-kubelet-config\") pod \"global-pull-secret-syncer-7z4fb\" (UID: \"49978182-365f-4022-9727-6a7530dcbc1e\") " pod="kube-system/global-pull-secret-syncer-7z4fb"
Apr 24 22:34:29.914168 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:29.914132 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/49978182-365f-4022-9727-6a7530dcbc1e-original-pull-secret\") pod \"global-pull-secret-syncer-7z4fb\" (UID: \"49978182-365f-4022-9727-6a7530dcbc1e\") " pod="kube-system/global-pull-secret-syncer-7z4fb"
Apr 24 22:34:29.914353 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:29.914192 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/49978182-365f-4022-9727-6a7530dcbc1e-kubelet-config\") pod \"global-pull-secret-syncer-7z4fb\" (UID: \"49978182-365f-4022-9727-6a7530dcbc1e\") " pod="kube-system/global-pull-secret-syncer-7z4fb"
Apr 24 22:34:29.914353 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:29.914223 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/49978182-365f-4022-9727-6a7530dcbc1e-dbus\") pod \"global-pull-secret-syncer-7z4fb\" (UID: \"49978182-365f-4022-9727-6a7530dcbc1e\") " pod="kube-system/global-pull-secret-syncer-7z4fb"
Apr 24 22:34:29.914353 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:29.914339 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/49978182-365f-4022-9727-6a7530dcbc1e-kubelet-config\") pod \"global-pull-secret-syncer-7z4fb\" (UID: \"49978182-365f-4022-9727-6a7530dcbc1e\") " pod="kube-system/global-pull-secret-syncer-7z4fb"
Apr 24 22:34:29.914472 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:29.914363 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/49978182-365f-4022-9727-6a7530dcbc1e-dbus\") pod \"global-pull-secret-syncer-7z4fb\" (UID: \"49978182-365f-4022-9727-6a7530dcbc1e\") " pod="kube-system/global-pull-secret-syncer-7z4fb"
Apr 24 22:34:29.916479 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:29.916447 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/49978182-365f-4022-9727-6a7530dcbc1e-original-pull-secret\") pod \"global-pull-secret-syncer-7z4fb\" (UID: \"49978182-365f-4022-9727-6a7530dcbc1e\") " pod="kube-system/global-pull-secret-syncer-7z4fb"
Apr 24 22:34:30.039191 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:30.039162 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7z4fb"
Apr 24 22:34:30.156348 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:30.156325 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7z4fb"]
Apr 24 22:34:30.159028 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:34:30.159002 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49978182_365f_4022_9727_6a7530dcbc1e.slice/crio-e674f380a8445b1df511dbbb4324c5663ef5890adca49c47eaf0c9ca2b41c933 WatchSource:0}: Error finding container e674f380a8445b1df511dbbb4324c5663ef5890adca49c47eaf0c9ca2b41c933: Status 404 returned error can't find the container with id e674f380a8445b1df511dbbb4324c5663ef5890adca49c47eaf0c9ca2b41c933
Apr 24 22:34:30.352349 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:30.352265 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7z4fb" event={"ID":"49978182-365f-4022-9727-6a7530dcbc1e","Type":"ContainerStarted","Data":"e674f380a8445b1df511dbbb4324c5663ef5890adca49c47eaf0c9ca2b41c933"}
Apr 24 22:34:33.369045 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:33.368941 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-86d8f6f677-gt98f" podUID="580a1a04-5f29-43ba-910a-0c6d221ead8e" containerName="console" containerID="cri-o://09a4ac6a728e2769c295057fbb25dbc7c387667e5da322cef08a84f4b7fe5d66" gracePeriod=15
Apr 24 22:33:33.864023 ip-10-0-129-176 kubenswrapper[2574]: I0424
22:34:33.863996 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-86d8f6f677-gt98f_580a1a04-5f29-43ba-910a-0c6d221ead8e/console/0.log" Apr 24 22:34:33.864125 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:33.864074 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86d8f6f677-gt98f" Apr 24 22:34:33.951631 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:33.951548 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/580a1a04-5f29-43ba-910a-0c6d221ead8e-service-ca\") pod \"580a1a04-5f29-43ba-910a-0c6d221ead8e\" (UID: \"580a1a04-5f29-43ba-910a-0c6d221ead8e\") " Apr 24 22:34:33.951631 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:33.951590 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/580a1a04-5f29-43ba-910a-0c6d221ead8e-console-oauth-config\") pod \"580a1a04-5f29-43ba-910a-0c6d221ead8e\" (UID: \"580a1a04-5f29-43ba-910a-0c6d221ead8e\") " Apr 24 22:34:33.951631 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:33.951632 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/580a1a04-5f29-43ba-910a-0c6d221ead8e-oauth-serving-cert\") pod \"580a1a04-5f29-43ba-910a-0c6d221ead8e\" (UID: \"580a1a04-5f29-43ba-910a-0c6d221ead8e\") " Apr 24 22:34:33.951922 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:33.951655 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/580a1a04-5f29-43ba-910a-0c6d221ead8e-console-config\") pod \"580a1a04-5f29-43ba-910a-0c6d221ead8e\" (UID: \"580a1a04-5f29-43ba-910a-0c6d221ead8e\") " Apr 24 22:34:33.951922 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:33.951686 2574 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/580a1a04-5f29-43ba-910a-0c6d221ead8e-trusted-ca-bundle\") pod \"580a1a04-5f29-43ba-910a-0c6d221ead8e\" (UID: \"580a1a04-5f29-43ba-910a-0c6d221ead8e\") " Apr 24 22:34:33.951922 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:33.951730 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qb9n\" (UniqueName: \"kubernetes.io/projected/580a1a04-5f29-43ba-910a-0c6d221ead8e-kube-api-access-9qb9n\") pod \"580a1a04-5f29-43ba-910a-0c6d221ead8e\" (UID: \"580a1a04-5f29-43ba-910a-0c6d221ead8e\") " Apr 24 22:34:33.951922 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:33.951753 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/580a1a04-5f29-43ba-910a-0c6d221ead8e-console-serving-cert\") pod \"580a1a04-5f29-43ba-910a-0c6d221ead8e\" (UID: \"580a1a04-5f29-43ba-910a-0c6d221ead8e\") " Apr 24 22:34:33.952142 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:33.952062 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/580a1a04-5f29-43ba-910a-0c6d221ead8e-service-ca" (OuterVolumeSpecName: "service-ca") pod "580a1a04-5f29-43ba-910a-0c6d221ead8e" (UID: "580a1a04-5f29-43ba-910a-0c6d221ead8e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:34:33.952251 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:33.952212 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/580a1a04-5f29-43ba-910a-0c6d221ead8e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "580a1a04-5f29-43ba-910a-0c6d221ead8e" (UID: "580a1a04-5f29-43ba-910a-0c6d221ead8e"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:34:33.952251 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:33.952226 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/580a1a04-5f29-43ba-910a-0c6d221ead8e-console-config" (OuterVolumeSpecName: "console-config") pod "580a1a04-5f29-43ba-910a-0c6d221ead8e" (UID: "580a1a04-5f29-43ba-910a-0c6d221ead8e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:34:33.952491 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:33.952443 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/580a1a04-5f29-43ba-910a-0c6d221ead8e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "580a1a04-5f29-43ba-910a-0c6d221ead8e" (UID: "580a1a04-5f29-43ba-910a-0c6d221ead8e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:34:33.955008 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:33.954668 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/580a1a04-5f29-43ba-910a-0c6d221ead8e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "580a1a04-5f29-43ba-910a-0c6d221ead8e" (UID: "580a1a04-5f29-43ba-910a-0c6d221ead8e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:34:33.955408 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:33.955375 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/580a1a04-5f29-43ba-910a-0c6d221ead8e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "580a1a04-5f29-43ba-910a-0c6d221ead8e" (UID: "580a1a04-5f29-43ba-910a-0c6d221ead8e"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:34:33.955510 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:33.955477 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/580a1a04-5f29-43ba-910a-0c6d221ead8e-kube-api-access-9qb9n" (OuterVolumeSpecName: "kube-api-access-9qb9n") pod "580a1a04-5f29-43ba-910a-0c6d221ead8e" (UID: "580a1a04-5f29-43ba-910a-0c6d221ead8e"). InnerVolumeSpecName "kube-api-access-9qb9n". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:34:34.052506 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:34.052474 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/580a1a04-5f29-43ba-910a-0c6d221ead8e-service-ca\") on node \"ip-10-0-129-176.ec2.internal\" DevicePath \"\"" Apr 24 22:34:34.052506 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:34.052501 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/580a1a04-5f29-43ba-910a-0c6d221ead8e-console-oauth-config\") on node \"ip-10-0-129-176.ec2.internal\" DevicePath \"\"" Apr 24 22:34:34.052506 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:34.052511 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/580a1a04-5f29-43ba-910a-0c6d221ead8e-oauth-serving-cert\") on node \"ip-10-0-129-176.ec2.internal\" DevicePath \"\"" Apr 24 22:34:34.052711 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:34.052522 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/580a1a04-5f29-43ba-910a-0c6d221ead8e-console-config\") on node \"ip-10-0-129-176.ec2.internal\" DevicePath \"\"" Apr 24 22:34:34.052711 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:34.052532 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/580a1a04-5f29-43ba-910a-0c6d221ead8e-trusted-ca-bundle\") on node \"ip-10-0-129-176.ec2.internal\" DevicePath \"\"" Apr 24 22:34:34.052711 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:34.052549 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9qb9n\" (UniqueName: \"kubernetes.io/projected/580a1a04-5f29-43ba-910a-0c6d221ead8e-kube-api-access-9qb9n\") on node \"ip-10-0-129-176.ec2.internal\" DevicePath \"\"" Apr 24 22:34:34.052711 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:34.052558 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/580a1a04-5f29-43ba-910a-0c6d221ead8e-console-serving-cert\") on node \"ip-10-0-129-176.ec2.internal\" DevicePath \"\"" Apr 24 22:34:34.366969 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:34.366939 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-86d8f6f677-gt98f_580a1a04-5f29-43ba-910a-0c6d221ead8e/console/0.log" Apr 24 22:34:34.367159 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:34.366984 2574 generic.go:358] "Generic (PLEG): container finished" podID="580a1a04-5f29-43ba-910a-0c6d221ead8e" containerID="09a4ac6a728e2769c295057fbb25dbc7c387667e5da322cef08a84f4b7fe5d66" exitCode=2 Apr 24 22:34:34.367159 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:34.367043 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-86d8f6f677-gt98f" Apr 24 22:34:34.367159 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:34.367060 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86d8f6f677-gt98f" event={"ID":"580a1a04-5f29-43ba-910a-0c6d221ead8e","Type":"ContainerDied","Data":"09a4ac6a728e2769c295057fbb25dbc7c387667e5da322cef08a84f4b7fe5d66"} Apr 24 22:34:34.367159 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:34.367095 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86d8f6f677-gt98f" event={"ID":"580a1a04-5f29-43ba-910a-0c6d221ead8e","Type":"ContainerDied","Data":"48c86b87cb525737c6716920c7f3804a2845be04eda0bb98f4a21d95cf758de9"} Apr 24 22:34:34.367159 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:34.367113 2574 scope.go:117] "RemoveContainer" containerID="09a4ac6a728e2769c295057fbb25dbc7c387667e5da322cef08a84f4b7fe5d66" Apr 24 22:34:34.368326 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:34.368299 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7z4fb" event={"ID":"49978182-365f-4022-9727-6a7530dcbc1e","Type":"ContainerStarted","Data":"adf96f091c374d464d4791e42315d28e9d118fb2e665af85cc267f91c5e5992f"} Apr 24 22:34:34.375736 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:34.375595 2574 scope.go:117] "RemoveContainer" containerID="09a4ac6a728e2769c295057fbb25dbc7c387667e5da322cef08a84f4b7fe5d66" Apr 24 22:34:34.375969 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:34:34.375824 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09a4ac6a728e2769c295057fbb25dbc7c387667e5da322cef08a84f4b7fe5d66\": container with ID starting with 09a4ac6a728e2769c295057fbb25dbc7c387667e5da322cef08a84f4b7fe5d66 not found: ID does not exist" containerID="09a4ac6a728e2769c295057fbb25dbc7c387667e5da322cef08a84f4b7fe5d66" Apr 24 22:34:34.375969 
ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:34.375846 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09a4ac6a728e2769c295057fbb25dbc7c387667e5da322cef08a84f4b7fe5d66"} err="failed to get container status \"09a4ac6a728e2769c295057fbb25dbc7c387667e5da322cef08a84f4b7fe5d66\": rpc error: code = NotFound desc = could not find container \"09a4ac6a728e2769c295057fbb25dbc7c387667e5da322cef08a84f4b7fe5d66\": container with ID starting with 09a4ac6a728e2769c295057fbb25dbc7c387667e5da322cef08a84f4b7fe5d66 not found: ID does not exist" Apr 24 22:34:34.388540 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:34.388505 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-7z4fb" podStartSLOduration=1.839769698 podStartE2EDuration="5.388495474s" podCreationTimestamp="2026-04-24 22:34:29 +0000 UTC" firstStartedPulling="2026-04-24 22:34:30.160569866 +0000 UTC m=+256.210425722" lastFinishedPulling="2026-04-24 22:34:33.709295642 +0000 UTC m=+259.759151498" observedRunningTime="2026-04-24 22:34:34.387137116 +0000 UTC m=+260.436992992" watchObservedRunningTime="2026-04-24 22:34:34.388495474 +0000 UTC m=+260.438351381" Apr 24 22:34:34.401739 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:34.401719 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-86d8f6f677-gt98f"] Apr 24 22:34:34.404957 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:34.404939 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-86d8f6f677-gt98f"] Apr 24 22:34:34.534731 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:34:34.534702 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="580a1a04-5f29-43ba-910a-0c6d221ead8e" path="/var/lib/kubelet/pods/580a1a04-5f29-43ba-910a-0c6d221ead8e/volumes" Apr 24 22:35:14.419683 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:35:14.419652 2574 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6nnq9_6aaf0fb0-1f4a-46f7-a1db-82394fa8792a/console-operator/2.log" Apr 24 22:35:14.420277 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:35:14.419701 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6nnq9_6aaf0fb0-1f4a-46f7-a1db-82394fa8792a/console-operator/2.log" Apr 24 22:35:14.429649 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:35:14.429627 2574 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 22:35:15.474109 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:35:15.474081 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-z2whz"] Apr 24 22:35:15.476261 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:35:15.474383 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="580a1a04-5f29-43ba-910a-0c6d221ead8e" containerName="console" Apr 24 22:35:15.476261 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:35:15.474394 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="580a1a04-5f29-43ba-910a-0c6d221ead8e" containerName="console" Apr 24 22:35:15.476261 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:35:15.474449 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="580a1a04-5f29-43ba-910a-0c6d221ead8e" containerName="console" Apr 24 22:35:15.477098 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:35:15.477083 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-z2whz" Apr 24 22:35:15.480547 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:35:15.480522 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 24 22:35:15.480668 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:35:15.480525 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 24 22:35:15.481579 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:35:15.481555 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 24 22:35:15.481661 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:35:15.481582 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 24 22:35:15.481661 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:35:15.481628 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 24 22:35:15.481661 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:35:15.481650 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-t2khr\"" Apr 24 22:35:15.490598 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:35:15.490578 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-z2whz"] Apr 24 22:35:15.559927 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:35:15.559905 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/c1ec943b-aed4-4743-a627-fc68924ecaa2-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-z2whz\" (UID: \"c1ec943b-aed4-4743-a627-fc68924ecaa2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-z2whz" Apr 24 22:35:15.560052 
ip-10-0-129-176 kubenswrapper[2574]: I0424 22:35:15.559988 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwfk2\" (UniqueName: \"kubernetes.io/projected/c1ec943b-aed4-4743-a627-fc68924ecaa2-kube-api-access-wwfk2\") pod \"keda-metrics-apiserver-7c9f485588-z2whz\" (UID: \"c1ec943b-aed4-4743-a627-fc68924ecaa2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-z2whz" Apr 24 22:35:15.560100 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:35:15.560068 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c1ec943b-aed4-4743-a627-fc68924ecaa2-certificates\") pod \"keda-metrics-apiserver-7c9f485588-z2whz\" (UID: \"c1ec943b-aed4-4743-a627-fc68924ecaa2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-z2whz" Apr 24 22:35:15.661140 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:35:15.661114 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wwfk2\" (UniqueName: \"kubernetes.io/projected/c1ec943b-aed4-4743-a627-fc68924ecaa2-kube-api-access-wwfk2\") pod \"keda-metrics-apiserver-7c9f485588-z2whz\" (UID: \"c1ec943b-aed4-4743-a627-fc68924ecaa2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-z2whz" Apr 24 22:35:15.661297 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:35:15.661156 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c1ec943b-aed4-4743-a627-fc68924ecaa2-certificates\") pod \"keda-metrics-apiserver-7c9f485588-z2whz\" (UID: \"c1ec943b-aed4-4743-a627-fc68924ecaa2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-z2whz" Apr 24 22:35:15.661297 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:35:15.661254 2574 secret.go:281] references non-existent secret key: tls.crt Apr 24 22:35:15.661297 ip-10-0-129-176 kubenswrapper[2574]: E0424 
22:35:15.661265 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 22:35:15.661297 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:35:15.661281 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-z2whz: references non-existent secret key: tls.crt Apr 24 22:35:15.661499 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:35:15.661330 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c1ec943b-aed4-4743-a627-fc68924ecaa2-certificates podName:c1ec943b-aed4-4743-a627-fc68924ecaa2 nodeName:}" failed. No retries permitted until 2026-04-24 22:35:16.161311337 +0000 UTC m=+302.211167192 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c1ec943b-aed4-4743-a627-fc68924ecaa2-certificates") pod "keda-metrics-apiserver-7c9f485588-z2whz" (UID: "c1ec943b-aed4-4743-a627-fc68924ecaa2") : references non-existent secret key: tls.crt Apr 24 22:35:15.661499 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:35:15.661342 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/c1ec943b-aed4-4743-a627-fc68924ecaa2-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-z2whz\" (UID: \"c1ec943b-aed4-4743-a627-fc68924ecaa2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-z2whz" Apr 24 22:35:15.661787 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:35:15.661743 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/c1ec943b-aed4-4743-a627-fc68924ecaa2-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-z2whz\" (UID: \"c1ec943b-aed4-4743-a627-fc68924ecaa2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-z2whz" Apr 24 22:35:15.670435 ip-10-0-129-176 kubenswrapper[2574]: I0424 
22:35:15.670407 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwfk2\" (UniqueName: \"kubernetes.io/projected/c1ec943b-aed4-4743-a627-fc68924ecaa2-kube-api-access-wwfk2\") pod \"keda-metrics-apiserver-7c9f485588-z2whz\" (UID: \"c1ec943b-aed4-4743-a627-fc68924ecaa2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-z2whz" Apr 24 22:35:16.165139 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:35:16.165099 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c1ec943b-aed4-4743-a627-fc68924ecaa2-certificates\") pod \"keda-metrics-apiserver-7c9f485588-z2whz\" (UID: \"c1ec943b-aed4-4743-a627-fc68924ecaa2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-z2whz" Apr 24 22:35:16.167706 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:35:16.167679 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c1ec943b-aed4-4743-a627-fc68924ecaa2-certificates\") pod \"keda-metrics-apiserver-7c9f485588-z2whz\" (UID: \"c1ec943b-aed4-4743-a627-fc68924ecaa2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-z2whz" Apr 24 22:35:16.386959 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:35:16.386927 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-z2whz" Apr 24 22:35:16.521455 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:35:16.521351 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-z2whz"] Apr 24 22:35:16.524108 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:35:16.524073 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1ec943b_aed4_4743_a627_fc68924ecaa2.slice/crio-fadc965218bf15c24979264b875d9c0a7dbac25c59d15bd9f99ee46fe11635f3 WatchSource:0}: Error finding container fadc965218bf15c24979264b875d9c0a7dbac25c59d15bd9f99ee46fe11635f3: Status 404 returned error can't find the container with id fadc965218bf15c24979264b875d9c0a7dbac25c59d15bd9f99ee46fe11635f3 Apr 24 22:35:16.525589 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:35:16.525567 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:35:17.494913 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:35:17.494842 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-z2whz" event={"ID":"c1ec943b-aed4-4743-a627-fc68924ecaa2","Type":"ContainerStarted","Data":"fadc965218bf15c24979264b875d9c0a7dbac25c59d15bd9f99ee46fe11635f3"} Apr 24 22:35:19.507168 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:35:19.507069 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-z2whz" event={"ID":"c1ec943b-aed4-4743-a627-fc68924ecaa2","Type":"ContainerStarted","Data":"154e0103a15af8980bfaa0ed07cd41973f7d289251fa9ff2810bfd4e44ced305"} Apr 24 22:35:19.507576 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:35:19.507170 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-z2whz" Apr 24 22:35:19.529193 ip-10-0-129-176 
kubenswrapper[2574]: I0424 22:35:19.529144 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-z2whz" podStartSLOduration=1.946916689 podStartE2EDuration="4.529131496s" podCreationTimestamp="2026-04-24 22:35:15 +0000 UTC" firstStartedPulling="2026-04-24 22:35:16.525726242 +0000 UTC m=+302.575582098" lastFinishedPulling="2026-04-24 22:35:19.107941047 +0000 UTC m=+305.157796905" observedRunningTime="2026-04-24 22:35:19.52878547 +0000 UTC m=+305.578641346" watchObservedRunningTime="2026-04-24 22:35:19.529131496 +0000 UTC m=+305.578987408" Apr 24 22:35:30.515227 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:35:30.515197 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-z2whz" Apr 24 22:36:21.539616 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:21.539525 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-flmk9"] Apr 24 22:36:21.543119 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:21.543099 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-flmk9" Apr 24 22:36:21.546793 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:21.546754 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 24 22:36:21.546945 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:21.546926 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 24 22:36:21.547499 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:21.547475 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-jw8cv\"" Apr 24 22:36:21.573963 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:21.573939 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-flmk9"] Apr 24 22:36:21.575160 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:21.575133 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s7l4\" (UniqueName: \"kubernetes.io/projected/9254b60a-4f47-40d7-8c83-211726f6e9c8-kube-api-access-8s7l4\") pod \"cert-manager-operator-controller-manager-54b9655956-flmk9\" (UID: \"9254b60a-4f47-40d7-8c83-211726f6e9c8\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-flmk9" Apr 24 22:36:21.575255 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:21.575196 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9254b60a-4f47-40d7-8c83-211726f6e9c8-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-flmk9\" (UID: \"9254b60a-4f47-40d7-8c83-211726f6e9c8\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-flmk9" 
Apr 24 22:36:21.675609 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:21.675583 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8s7l4\" (UniqueName: \"kubernetes.io/projected/9254b60a-4f47-40d7-8c83-211726f6e9c8-kube-api-access-8s7l4\") pod \"cert-manager-operator-controller-manager-54b9655956-flmk9\" (UID: \"9254b60a-4f47-40d7-8c83-211726f6e9c8\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-flmk9" Apr 24 22:36:21.675770 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:21.675638 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9254b60a-4f47-40d7-8c83-211726f6e9c8-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-flmk9\" (UID: \"9254b60a-4f47-40d7-8c83-211726f6e9c8\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-flmk9" Apr 24 22:36:21.676016 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:21.675997 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9254b60a-4f47-40d7-8c83-211726f6e9c8-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-flmk9\" (UID: \"9254b60a-4f47-40d7-8c83-211726f6e9c8\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-flmk9" Apr 24 22:36:21.686007 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:21.685990 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s7l4\" (UniqueName: \"kubernetes.io/projected/9254b60a-4f47-40d7-8c83-211726f6e9c8-kube-api-access-8s7l4\") pod \"cert-manager-operator-controller-manager-54b9655956-flmk9\" (UID: \"9254b60a-4f47-40d7-8c83-211726f6e9c8\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-flmk9" Apr 24 22:36:21.853064 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:21.852987 2574 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-flmk9" Apr 24 22:36:21.973750 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:21.973662 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-flmk9"] Apr 24 22:36:21.976198 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:36:21.976168 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9254b60a_4f47_40d7_8c83_211726f6e9c8.slice/crio-e0f2946c42a5532feeb00719b53eafdc5d0c552ab3cb113561eb2e07694e6605 WatchSource:0}: Error finding container e0f2946c42a5532feeb00719b53eafdc5d0c552ab3cb113561eb2e07694e6605: Status 404 returned error can't find the container with id e0f2946c42a5532feeb00719b53eafdc5d0c552ab3cb113561eb2e07694e6605 Apr 24 22:36:22.681435 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:22.681398 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-flmk9" event={"ID":"9254b60a-4f47-40d7-8c83-211726f6e9c8","Type":"ContainerStarted","Data":"e0f2946c42a5532feeb00719b53eafdc5d0c552ab3cb113561eb2e07694e6605"} Apr 24 22:36:24.690165 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:24.690136 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-flmk9" event={"ID":"9254b60a-4f47-40d7-8c83-211726f6e9c8","Type":"ContainerStarted","Data":"b65df4877c1e315a5b03b2ab3ed6414778b0b8dcfee8642d623cf15fdabaee62"} Apr 24 22:36:24.709719 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:24.709672 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-flmk9" podStartSLOduration=1.116104932 podStartE2EDuration="3.709659312s" podCreationTimestamp="2026-04-24 
22:36:21 +0000 UTC" firstStartedPulling="2026-04-24 22:36:21.978523136 +0000 UTC m=+368.028378991" lastFinishedPulling="2026-04-24 22:36:24.572077512 +0000 UTC m=+370.621933371" observedRunningTime="2026-04-24 22:36:24.70856492 +0000 UTC m=+370.758420797" watchObservedRunningTime="2026-04-24 22:36:24.709659312 +0000 UTC m=+370.759515188" Apr 24 22:36:44.040291 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:44.040241 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-prjrt"] Apr 24 22:36:44.042563 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:44.042547 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-prjrt" Apr 24 22:36:44.045148 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:44.045123 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 24 22:36:44.045267 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:44.045234 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 24 22:36:44.046230 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:44.046205 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-gdsj2\"" Apr 24 22:36:44.050739 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:44.050714 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-prjrt"] Apr 24 22:36:44.151159 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:44.151134 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/535c70f5-87b5-4a78-8666-36f332fe14fc-tmp\") pod \"openshift-lws-operator-bfc7f696d-prjrt\" (UID: 
\"535c70f5-87b5-4a78-8666-36f332fe14fc\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-prjrt" Apr 24 22:36:44.151254 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:44.151173 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz7xl\" (UniqueName: \"kubernetes.io/projected/535c70f5-87b5-4a78-8666-36f332fe14fc-kube-api-access-xz7xl\") pod \"openshift-lws-operator-bfc7f696d-prjrt\" (UID: \"535c70f5-87b5-4a78-8666-36f332fe14fc\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-prjrt" Apr 24 22:36:44.251547 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:44.251519 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xz7xl\" (UniqueName: \"kubernetes.io/projected/535c70f5-87b5-4a78-8666-36f332fe14fc-kube-api-access-xz7xl\") pod \"openshift-lws-operator-bfc7f696d-prjrt\" (UID: \"535c70f5-87b5-4a78-8666-36f332fe14fc\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-prjrt" Apr 24 22:36:44.251697 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:44.251584 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/535c70f5-87b5-4a78-8666-36f332fe14fc-tmp\") pod \"openshift-lws-operator-bfc7f696d-prjrt\" (UID: \"535c70f5-87b5-4a78-8666-36f332fe14fc\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-prjrt" Apr 24 22:36:44.251888 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:44.251861 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/535c70f5-87b5-4a78-8666-36f332fe14fc-tmp\") pod \"openshift-lws-operator-bfc7f696d-prjrt\" (UID: \"535c70f5-87b5-4a78-8666-36f332fe14fc\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-prjrt" Apr 24 22:36:44.260339 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:44.260314 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xz7xl\" (UniqueName: \"kubernetes.io/projected/535c70f5-87b5-4a78-8666-36f332fe14fc-kube-api-access-xz7xl\") pod \"openshift-lws-operator-bfc7f696d-prjrt\" (UID: \"535c70f5-87b5-4a78-8666-36f332fe14fc\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-prjrt" Apr 24 22:36:44.359606 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:44.359538 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-prjrt" Apr 24 22:36:44.474305 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:44.474265 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-prjrt"] Apr 24 22:36:44.477213 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:36:44.477188 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod535c70f5_87b5_4a78_8666_36f332fe14fc.slice/crio-ac567709825dfc884e293e7d7cf125df6c9a5c3bba781c2cd6d685c3d5bbccd2 WatchSource:0}: Error finding container ac567709825dfc884e293e7d7cf125df6c9a5c3bba781c2cd6d685c3d5bbccd2: Status 404 returned error can't find the container with id ac567709825dfc884e293e7d7cf125df6c9a5c3bba781c2cd6d685c3d5bbccd2 Apr 24 22:36:44.753839 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:44.753760 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-prjrt" event={"ID":"535c70f5-87b5-4a78-8666-36f332fe14fc","Type":"ContainerStarted","Data":"ac567709825dfc884e293e7d7cf125df6c9a5c3bba781c2cd6d685c3d5bbccd2"} Apr 24 22:36:45.920605 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:45.920564 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-k56v2"] Apr 24 22:36:45.923718 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:45.923695 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-k56v2" Apr 24 22:36:45.926218 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:45.926192 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 24 22:36:45.926218 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:45.926208 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 24 22:36:45.926388 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:45.926305 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-qmc6r\"" Apr 24 22:36:45.933187 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:45.932856 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-k56v2"] Apr 24 22:36:46.066617 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:46.066500 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh2dn\" (UniqueName: \"kubernetes.io/projected/df64f144-42ef-4aa8-800c-17359687f6c5-kube-api-access-nh2dn\") pod \"cert-manager-79c8d999ff-k56v2\" (UID: \"df64f144-42ef-4aa8-800c-17359687f6c5\") " pod="cert-manager/cert-manager-79c8d999ff-k56v2" Apr 24 22:36:46.066617 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:46.066569 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/df64f144-42ef-4aa8-800c-17359687f6c5-bound-sa-token\") pod \"cert-manager-79c8d999ff-k56v2\" (UID: \"df64f144-42ef-4aa8-800c-17359687f6c5\") " pod="cert-manager/cert-manager-79c8d999ff-k56v2" Apr 24 22:36:46.167721 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:46.167691 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nh2dn\" (UniqueName: 
\"kubernetes.io/projected/df64f144-42ef-4aa8-800c-17359687f6c5-kube-api-access-nh2dn\") pod \"cert-manager-79c8d999ff-k56v2\" (UID: \"df64f144-42ef-4aa8-800c-17359687f6c5\") " pod="cert-manager/cert-manager-79c8d999ff-k56v2" Apr 24 22:36:46.167883 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:46.167736 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/df64f144-42ef-4aa8-800c-17359687f6c5-bound-sa-token\") pod \"cert-manager-79c8d999ff-k56v2\" (UID: \"df64f144-42ef-4aa8-800c-17359687f6c5\") " pod="cert-manager/cert-manager-79c8d999ff-k56v2" Apr 24 22:36:46.176921 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:46.176841 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/df64f144-42ef-4aa8-800c-17359687f6c5-bound-sa-token\") pod \"cert-manager-79c8d999ff-k56v2\" (UID: \"df64f144-42ef-4aa8-800c-17359687f6c5\") " pod="cert-manager/cert-manager-79c8d999ff-k56v2" Apr 24 22:36:46.177087 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:46.177065 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh2dn\" (UniqueName: \"kubernetes.io/projected/df64f144-42ef-4aa8-800c-17359687f6c5-kube-api-access-nh2dn\") pod \"cert-manager-79c8d999ff-k56v2\" (UID: \"df64f144-42ef-4aa8-800c-17359687f6c5\") " pod="cert-manager/cert-manager-79c8d999ff-k56v2" Apr 24 22:36:46.237020 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:46.236989 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-k56v2" Apr 24 22:36:47.292344 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:47.292316 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-k56v2"] Apr 24 22:36:47.295200 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:36:47.295179 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf64f144_42ef_4aa8_800c_17359687f6c5.slice/crio-c1f078cb609278eb80767fa5d6bc91f632879dbcdc4c8eb77ffc889885bf0f22 WatchSource:0}: Error finding container c1f078cb609278eb80767fa5d6bc91f632879dbcdc4c8eb77ffc889885bf0f22: Status 404 returned error can't find the container with id c1f078cb609278eb80767fa5d6bc91f632879dbcdc4c8eb77ffc889885bf0f22 Apr 24 22:36:47.765640 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:47.765602 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-prjrt" event={"ID":"535c70f5-87b5-4a78-8666-36f332fe14fc","Type":"ContainerStarted","Data":"41fb9d45907413ab086c56c9241122852012b1b609dce5b40929a235930eefde"} Apr 24 22:36:47.766681 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:47.766648 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-k56v2" event={"ID":"df64f144-42ef-4aa8-800c-17359687f6c5","Type":"ContainerStarted","Data":"c1f078cb609278eb80767fa5d6bc91f632879dbcdc4c8eb77ffc889885bf0f22"} Apr 24 22:36:47.788749 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:47.788703 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-prjrt" podStartSLOduration=1.01716499 podStartE2EDuration="3.788688886s" podCreationTimestamp="2026-04-24 22:36:44 +0000 UTC" firstStartedPulling="2026-04-24 22:36:44.4786572 +0000 UTC m=+390.528513055" lastFinishedPulling="2026-04-24 22:36:47.250181081 +0000 UTC 
m=+393.300036951" observedRunningTime="2026-04-24 22:36:47.785569279 +0000 UTC m=+393.835425156" watchObservedRunningTime="2026-04-24 22:36:47.788688886 +0000 UTC m=+393.838544764" Apr 24 22:36:49.776096 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:49.776063 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-k56v2" event={"ID":"df64f144-42ef-4aa8-800c-17359687f6c5","Type":"ContainerStarted","Data":"8bc807fac33d2196a923c95d715fb1e0c9b390f07dfe0203251470cad2d85a41"} Apr 24 22:36:49.791523 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:49.791477 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-k56v2" podStartSLOduration=2.5016278830000003 podStartE2EDuration="4.791464416s" podCreationTimestamp="2026-04-24 22:36:45 +0000 UTC" firstStartedPulling="2026-04-24 22:36:47.29707431 +0000 UTC m=+393.346930165" lastFinishedPulling="2026-04-24 22:36:49.586910838 +0000 UTC m=+395.636766698" observedRunningTime="2026-04-24 22:36:49.790658781 +0000 UTC m=+395.840514658" watchObservedRunningTime="2026-04-24 22:36:49.791464416 +0000 UTC m=+395.841320293" Apr 24 22:36:57.935575 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:57.935542 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-859c5c9fd7-2flfv"] Apr 24 22:36:57.939007 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:57.938991 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-859c5c9fd7-2flfv" Apr 24 22:36:57.947008 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:57.946984 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-jq65j\"" Apr 24 22:36:57.947008 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:57.946989 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 24 22:36:57.947168 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:57.947116 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 24 22:36:57.947557 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:57.947537 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 24 22:36:57.968701 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:57.968677 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-859c5c9fd7-2flfv"] Apr 24 22:36:58.065360 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:58.065323 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/92c004d9-1387-4f01-a97b-ce4eb2153936-manager-config\") pod \"lws-controller-manager-859c5c9fd7-2flfv\" (UID: \"92c004d9-1387-4f01-a97b-ce4eb2153936\") " pod="openshift-lws-operator/lws-controller-manager-859c5c9fd7-2flfv" Apr 24 22:36:58.065514 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:58.065400 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92c004d9-1387-4f01-a97b-ce4eb2153936-cert\") pod \"lws-controller-manager-859c5c9fd7-2flfv\" (UID: \"92c004d9-1387-4f01-a97b-ce4eb2153936\") 
" pod="openshift-lws-operator/lws-controller-manager-859c5c9fd7-2flfv" Apr 24 22:36:58.065514 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:58.065420 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk2fl\" (UniqueName: \"kubernetes.io/projected/92c004d9-1387-4f01-a97b-ce4eb2153936-kube-api-access-dk2fl\") pod \"lws-controller-manager-859c5c9fd7-2flfv\" (UID: \"92c004d9-1387-4f01-a97b-ce4eb2153936\") " pod="openshift-lws-operator/lws-controller-manager-859c5c9fd7-2flfv" Apr 24 22:36:58.065514 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:58.065443 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/92c004d9-1387-4f01-a97b-ce4eb2153936-metrics-cert\") pod \"lws-controller-manager-859c5c9fd7-2flfv\" (UID: \"92c004d9-1387-4f01-a97b-ce4eb2153936\") " pod="openshift-lws-operator/lws-controller-manager-859c5c9fd7-2flfv" Apr 24 22:36:58.166144 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:58.166106 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92c004d9-1387-4f01-a97b-ce4eb2153936-cert\") pod \"lws-controller-manager-859c5c9fd7-2flfv\" (UID: \"92c004d9-1387-4f01-a97b-ce4eb2153936\") " pod="openshift-lws-operator/lws-controller-manager-859c5c9fd7-2flfv" Apr 24 22:36:58.166144 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:58.166149 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dk2fl\" (UniqueName: \"kubernetes.io/projected/92c004d9-1387-4f01-a97b-ce4eb2153936-kube-api-access-dk2fl\") pod \"lws-controller-manager-859c5c9fd7-2flfv\" (UID: \"92c004d9-1387-4f01-a97b-ce4eb2153936\") " pod="openshift-lws-operator/lws-controller-manager-859c5c9fd7-2flfv" Apr 24 22:36:58.166341 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:58.166179 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/92c004d9-1387-4f01-a97b-ce4eb2153936-metrics-cert\") pod \"lws-controller-manager-859c5c9fd7-2flfv\" (UID: \"92c004d9-1387-4f01-a97b-ce4eb2153936\") " pod="openshift-lws-operator/lws-controller-manager-859c5c9fd7-2flfv" Apr 24 22:36:58.166341 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:58.166206 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/92c004d9-1387-4f01-a97b-ce4eb2153936-manager-config\") pod \"lws-controller-manager-859c5c9fd7-2flfv\" (UID: \"92c004d9-1387-4f01-a97b-ce4eb2153936\") " pod="openshift-lws-operator/lws-controller-manager-859c5c9fd7-2flfv" Apr 24 22:36:58.166926 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:58.166901 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/92c004d9-1387-4f01-a97b-ce4eb2153936-manager-config\") pod \"lws-controller-manager-859c5c9fd7-2flfv\" (UID: \"92c004d9-1387-4f01-a97b-ce4eb2153936\") " pod="openshift-lws-operator/lws-controller-manager-859c5c9fd7-2flfv" Apr 24 22:36:58.168705 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:58.168686 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/92c004d9-1387-4f01-a97b-ce4eb2153936-metrics-cert\") pod \"lws-controller-manager-859c5c9fd7-2flfv\" (UID: \"92c004d9-1387-4f01-a97b-ce4eb2153936\") " pod="openshift-lws-operator/lws-controller-manager-859c5c9fd7-2flfv" Apr 24 22:36:58.168705 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:58.168698 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92c004d9-1387-4f01-a97b-ce4eb2153936-cert\") pod \"lws-controller-manager-859c5c9fd7-2flfv\" (UID: \"92c004d9-1387-4f01-a97b-ce4eb2153936\") " 
pod="openshift-lws-operator/lws-controller-manager-859c5c9fd7-2flfv" Apr 24 22:36:58.177022 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:58.177002 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk2fl\" (UniqueName: \"kubernetes.io/projected/92c004d9-1387-4f01-a97b-ce4eb2153936-kube-api-access-dk2fl\") pod \"lws-controller-manager-859c5c9fd7-2flfv\" (UID: \"92c004d9-1387-4f01-a97b-ce4eb2153936\") " pod="openshift-lws-operator/lws-controller-manager-859c5c9fd7-2flfv" Apr 24 22:36:58.248116 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:58.248041 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-859c5c9fd7-2flfv" Apr 24 22:36:58.387371 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:58.387340 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-859c5c9fd7-2flfv"] Apr 24 22:36:58.390676 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:36:58.390648 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92c004d9_1387_4f01_a97b_ce4eb2153936.slice/crio-ed2e8ed8f59775b69c22c2423b8e6a04f0879f74e590504e3e6f0aa79fdcba46 WatchSource:0}: Error finding container ed2e8ed8f59775b69c22c2423b8e6a04f0879f74e590504e3e6f0aa79fdcba46: Status 404 returned error can't find the container with id ed2e8ed8f59775b69c22c2423b8e6a04f0879f74e590504e3e6f0aa79fdcba46 Apr 24 22:36:58.803689 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:36:58.803655 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-859c5c9fd7-2flfv" event={"ID":"92c004d9-1387-4f01-a97b-ce4eb2153936","Type":"ContainerStarted","Data":"ed2e8ed8f59775b69c22c2423b8e6a04f0879f74e590504e3e6f0aa79fdcba46"} Apr 24 22:37:00.812417 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:00.812379 2574 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-lws-operator/lws-controller-manager-859c5c9fd7-2flfv" event={"ID":"92c004d9-1387-4f01-a97b-ce4eb2153936","Type":"ContainerStarted","Data":"0493ec796b985b3d1af1454991fde1c198dd0c3041ca3955de44eb59debe289a"} Apr 24 22:37:00.812799 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:00.812439 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-859c5c9fd7-2flfv" Apr 24 22:37:00.831807 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:00.831755 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-859c5c9fd7-2flfv" podStartSLOduration=2.345844939 podStartE2EDuration="3.831739901s" podCreationTimestamp="2026-04-24 22:36:57 +0000 UTC" firstStartedPulling="2026-04-24 22:36:58.392365132 +0000 UTC m=+404.442220987" lastFinishedPulling="2026-04-24 22:36:59.878260095 +0000 UTC m=+405.928115949" observedRunningTime="2026-04-24 22:37:00.830054842 +0000 UTC m=+406.879910734" watchObservedRunningTime="2026-04-24 22:37:00.831739901 +0000 UTC m=+406.881595775" Apr 24 22:37:11.821277 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:11.821246 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-859c5c9fd7-2flfv" Apr 24 22:37:54.782441 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:54.782403 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-b4d6b6964-qfdj4"] Apr 24 22:37:54.784842 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:54.784822 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-b4d6b6964-qfdj4" Apr 24 22:37:54.872179 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:54.872149 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b4d6b6964-qfdj4"] Apr 24 22:37:54.923947 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:54.923913 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/30ab53a3-5d80-4449-af21-7cb13656bbb7-console-serving-cert\") pod \"console-b4d6b6964-qfdj4\" (UID: \"30ab53a3-5d80-4449-af21-7cb13656bbb7\") " pod="openshift-console/console-b4d6b6964-qfdj4" Apr 24 22:37:54.924099 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:54.923957 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/30ab53a3-5d80-4449-af21-7cb13656bbb7-console-config\") pod \"console-b4d6b6964-qfdj4\" (UID: \"30ab53a3-5d80-4449-af21-7cb13656bbb7\") " pod="openshift-console/console-b4d6b6964-qfdj4" Apr 24 22:37:54.924099 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:54.923985 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/30ab53a3-5d80-4449-af21-7cb13656bbb7-service-ca\") pod \"console-b4d6b6964-qfdj4\" (UID: \"30ab53a3-5d80-4449-af21-7cb13656bbb7\") " pod="openshift-console/console-b4d6b6964-qfdj4" Apr 24 22:37:54.924099 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:54.924013 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30ab53a3-5d80-4449-af21-7cb13656bbb7-trusted-ca-bundle\") pod \"console-b4d6b6964-qfdj4\" (UID: \"30ab53a3-5d80-4449-af21-7cb13656bbb7\") " pod="openshift-console/console-b4d6b6964-qfdj4" Apr 24 22:37:54.924099 
ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:54.924065 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/30ab53a3-5d80-4449-af21-7cb13656bbb7-oauth-serving-cert\") pod \"console-b4d6b6964-qfdj4\" (UID: \"30ab53a3-5d80-4449-af21-7cb13656bbb7\") " pod="openshift-console/console-b4d6b6964-qfdj4" Apr 24 22:37:54.924249 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:54.924117 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7jgx\" (UniqueName: \"kubernetes.io/projected/30ab53a3-5d80-4449-af21-7cb13656bbb7-kube-api-access-m7jgx\") pod \"console-b4d6b6964-qfdj4\" (UID: \"30ab53a3-5d80-4449-af21-7cb13656bbb7\") " pod="openshift-console/console-b4d6b6964-qfdj4" Apr 24 22:37:54.924249 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:54.924189 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/30ab53a3-5d80-4449-af21-7cb13656bbb7-console-oauth-config\") pod \"console-b4d6b6964-qfdj4\" (UID: \"30ab53a3-5d80-4449-af21-7cb13656bbb7\") " pod="openshift-console/console-b4d6b6964-qfdj4" Apr 24 22:37:55.025575 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:55.025549 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/30ab53a3-5d80-4449-af21-7cb13656bbb7-console-serving-cert\") pod \"console-b4d6b6964-qfdj4\" (UID: \"30ab53a3-5d80-4449-af21-7cb13656bbb7\") " pod="openshift-console/console-b4d6b6964-qfdj4" Apr 24 22:37:55.025575 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:55.025578 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/30ab53a3-5d80-4449-af21-7cb13656bbb7-console-config\") pod 
\"console-b4d6b6964-qfdj4\" (UID: \"30ab53a3-5d80-4449-af21-7cb13656bbb7\") " pod="openshift-console/console-b4d6b6964-qfdj4" Apr 24 22:37:55.025738 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:55.025603 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/30ab53a3-5d80-4449-af21-7cb13656bbb7-service-ca\") pod \"console-b4d6b6964-qfdj4\" (UID: \"30ab53a3-5d80-4449-af21-7cb13656bbb7\") " pod="openshift-console/console-b4d6b6964-qfdj4" Apr 24 22:37:55.025738 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:55.025628 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30ab53a3-5d80-4449-af21-7cb13656bbb7-trusted-ca-bundle\") pod \"console-b4d6b6964-qfdj4\" (UID: \"30ab53a3-5d80-4449-af21-7cb13656bbb7\") " pod="openshift-console/console-b4d6b6964-qfdj4" Apr 24 22:37:55.025738 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:55.025655 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/30ab53a3-5d80-4449-af21-7cb13656bbb7-oauth-serving-cert\") pod \"console-b4d6b6964-qfdj4\" (UID: \"30ab53a3-5d80-4449-af21-7cb13656bbb7\") " pod="openshift-console/console-b4d6b6964-qfdj4" Apr 24 22:37:55.025738 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:55.025692 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m7jgx\" (UniqueName: \"kubernetes.io/projected/30ab53a3-5d80-4449-af21-7cb13656bbb7-kube-api-access-m7jgx\") pod \"console-b4d6b6964-qfdj4\" (UID: \"30ab53a3-5d80-4449-af21-7cb13656bbb7\") " pod="openshift-console/console-b4d6b6964-qfdj4" Apr 24 22:37:55.025738 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:55.025726 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/30ab53a3-5d80-4449-af21-7cb13656bbb7-console-oauth-config\") pod \"console-b4d6b6964-qfdj4\" (UID: \"30ab53a3-5d80-4449-af21-7cb13656bbb7\") " pod="openshift-console/console-b4d6b6964-qfdj4" Apr 24 22:37:55.026356 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:55.026328 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/30ab53a3-5d80-4449-af21-7cb13656bbb7-service-ca\") pod \"console-b4d6b6964-qfdj4\" (UID: \"30ab53a3-5d80-4449-af21-7cb13656bbb7\") " pod="openshift-console/console-b4d6b6964-qfdj4" Apr 24 22:37:55.026462 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:55.026362 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/30ab53a3-5d80-4449-af21-7cb13656bbb7-console-config\") pod \"console-b4d6b6964-qfdj4\" (UID: \"30ab53a3-5d80-4449-af21-7cb13656bbb7\") " pod="openshift-console/console-b4d6b6964-qfdj4" Apr 24 22:37:55.026520 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:55.026452 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/30ab53a3-5d80-4449-af21-7cb13656bbb7-oauth-serving-cert\") pod \"console-b4d6b6964-qfdj4\" (UID: \"30ab53a3-5d80-4449-af21-7cb13656bbb7\") " pod="openshift-console/console-b4d6b6964-qfdj4" Apr 24 22:37:55.026578 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:55.026548 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30ab53a3-5d80-4449-af21-7cb13656bbb7-trusted-ca-bundle\") pod \"console-b4d6b6964-qfdj4\" (UID: \"30ab53a3-5d80-4449-af21-7cb13656bbb7\") " pod="openshift-console/console-b4d6b6964-qfdj4" Apr 24 22:37:55.028149 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:55.028122 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/30ab53a3-5d80-4449-af21-7cb13656bbb7-console-oauth-config\") pod \"console-b4d6b6964-qfdj4\" (UID: \"30ab53a3-5d80-4449-af21-7cb13656bbb7\") " pod="openshift-console/console-b4d6b6964-qfdj4" Apr 24 22:37:55.028315 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:55.028296 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/30ab53a3-5d80-4449-af21-7cb13656bbb7-console-serving-cert\") pod \"console-b4d6b6964-qfdj4\" (UID: \"30ab53a3-5d80-4449-af21-7cb13656bbb7\") " pod="openshift-console/console-b4d6b6964-qfdj4" Apr 24 22:37:55.034328 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:55.034273 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7jgx\" (UniqueName: \"kubernetes.io/projected/30ab53a3-5d80-4449-af21-7cb13656bbb7-kube-api-access-m7jgx\") pod \"console-b4d6b6964-qfdj4\" (UID: \"30ab53a3-5d80-4449-af21-7cb13656bbb7\") " pod="openshift-console/console-b4d6b6964-qfdj4" Apr 24 22:37:55.093911 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:55.093867 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-b4d6b6964-qfdj4" Apr 24 22:37:55.232419 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:55.232395 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b4d6b6964-qfdj4"] Apr 24 22:37:55.235753 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:37:55.235725 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30ab53a3_5d80_4449_af21_7cb13656bbb7.slice/crio-cd3efc7fafb6147589f30b2e02623c949f926ec61c579ad73773d32598a43b55 WatchSource:0}: Error finding container cd3efc7fafb6147589f30b2e02623c949f926ec61c579ad73773d32598a43b55: Status 404 returned error can't find the container with id cd3efc7fafb6147589f30b2e02623c949f926ec61c579ad73773d32598a43b55 Apr 24 22:37:55.992015 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:55.991981 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b4d6b6964-qfdj4" event={"ID":"30ab53a3-5d80-4449-af21-7cb13656bbb7","Type":"ContainerStarted","Data":"95c9b85de209182e74f5ac55d9d83bab0e58c40c3ef68444d9320af781f88488"} Apr 24 22:37:55.992015 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:55.992015 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b4d6b6964-qfdj4" event={"ID":"30ab53a3-5d80-4449-af21-7cb13656bbb7","Type":"ContainerStarted","Data":"cd3efc7fafb6147589f30b2e02623c949f926ec61c579ad73773d32598a43b55"} Apr 24 22:37:56.035502 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:56.035456 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-b4d6b6964-qfdj4" podStartSLOduration=2.035439543 podStartE2EDuration="2.035439543s" podCreationTimestamp="2026-04-24 22:37:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:37:56.034185332 +0000 UTC m=+462.084041209" 
watchObservedRunningTime="2026-04-24 22:37:56.035439543 +0000 UTC m=+462.085295421" Apr 24 22:37:57.530862 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:57.530819 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-g2zkf"] Apr 24 22:37:57.533490 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:57.533463 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-g2zkf" Apr 24 22:37:57.536999 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:57.536976 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 24 22:37:57.538339 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:57.538315 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 24 22:37:57.538445 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:57.538339 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 24 22:37:57.538445 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:57.538367 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 24 22:37:57.538445 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:57.538437 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-hg677\"" Apr 24 22:37:57.546504 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:57.546479 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-g2zkf"] Apr 24 22:37:57.649699 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:57.649669 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/0a617d1b-ced0-42bf-a30e-385b14d42d96-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-g2zkf\" (UID: \"0a617d1b-ced0-42bf-a30e-385b14d42d96\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-g2zkf" Apr 24 22:37:57.649850 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:57.649712 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8nxc\" (UniqueName: \"kubernetes.io/projected/0a617d1b-ced0-42bf-a30e-385b14d42d96-kube-api-access-z8nxc\") pod \"kuadrant-console-plugin-6c886788f8-g2zkf\" (UID: \"0a617d1b-ced0-42bf-a30e-385b14d42d96\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-g2zkf" Apr 24 22:37:57.649850 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:57.649831 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a617d1b-ced0-42bf-a30e-385b14d42d96-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-g2zkf\" (UID: \"0a617d1b-ced0-42bf-a30e-385b14d42d96\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-g2zkf" Apr 24 22:37:57.751150 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:57.751124 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0a617d1b-ced0-42bf-a30e-385b14d42d96-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-g2zkf\" (UID: \"0a617d1b-ced0-42bf-a30e-385b14d42d96\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-g2zkf" Apr 24 22:37:57.751250 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:57.751165 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8nxc\" (UniqueName: \"kubernetes.io/projected/0a617d1b-ced0-42bf-a30e-385b14d42d96-kube-api-access-z8nxc\") pod \"kuadrant-console-plugin-6c886788f8-g2zkf\" (UID: \"0a617d1b-ced0-42bf-a30e-385b14d42d96\") " 
pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-g2zkf" Apr 24 22:37:57.751250 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:57.751197 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a617d1b-ced0-42bf-a30e-385b14d42d96-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-g2zkf\" (UID: \"0a617d1b-ced0-42bf-a30e-385b14d42d96\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-g2zkf" Apr 24 22:37:57.751373 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:37:57.751284 2574 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 24 22:37:57.751373 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:37:57.751342 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a617d1b-ced0-42bf-a30e-385b14d42d96-plugin-serving-cert podName:0a617d1b-ced0-42bf-a30e-385b14d42d96 nodeName:}" failed. No retries permitted until 2026-04-24 22:37:58.251326844 +0000 UTC m=+464.301182698 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/0a617d1b-ced0-42bf-a30e-385b14d42d96-plugin-serving-cert") pod "kuadrant-console-plugin-6c886788f8-g2zkf" (UID: "0a617d1b-ced0-42bf-a30e-385b14d42d96") : secret "plugin-serving-cert" not found Apr 24 22:37:57.751700 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:57.751680 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0a617d1b-ced0-42bf-a30e-385b14d42d96-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-g2zkf\" (UID: \"0a617d1b-ced0-42bf-a30e-385b14d42d96\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-g2zkf" Apr 24 22:37:57.760831 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:57.760810 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8nxc\" (UniqueName: \"kubernetes.io/projected/0a617d1b-ced0-42bf-a30e-385b14d42d96-kube-api-access-z8nxc\") pod \"kuadrant-console-plugin-6c886788f8-g2zkf\" (UID: \"0a617d1b-ced0-42bf-a30e-385b14d42d96\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-g2zkf" Apr 24 22:37:58.256748 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:58.256717 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a617d1b-ced0-42bf-a30e-385b14d42d96-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-g2zkf\" (UID: \"0a617d1b-ced0-42bf-a30e-385b14d42d96\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-g2zkf" Apr 24 22:37:58.259163 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:58.259142 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a617d1b-ced0-42bf-a30e-385b14d42d96-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-g2zkf\" (UID: \"0a617d1b-ced0-42bf-a30e-385b14d42d96\") " 
pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-g2zkf" Apr 24 22:37:58.443472 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:58.443435 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-g2zkf" Apr 24 22:37:58.565489 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:58.565468 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-g2zkf"] Apr 24 22:37:58.567422 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:37:58.567393 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a617d1b_ced0_42bf_a30e_385b14d42d96.slice/crio-7821fcfed5a3a3fe479d110056360ef113ad229d8e25de920674ef7ac773991c WatchSource:0}: Error finding container 7821fcfed5a3a3fe479d110056360ef113ad229d8e25de920674ef7ac773991c: Status 404 returned error can't find the container with id 7821fcfed5a3a3fe479d110056360ef113ad229d8e25de920674ef7ac773991c Apr 24 22:37:59.003948 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:37:59.003906 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-g2zkf" event={"ID":"0a617d1b-ced0-42bf-a30e-385b14d42d96","Type":"ContainerStarted","Data":"7821fcfed5a3a3fe479d110056360ef113ad229d8e25de920674ef7ac773991c"} Apr 24 22:38:04.025201 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:04.025154 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-g2zkf" event={"ID":"0a617d1b-ced0-42bf-a30e-385b14d42d96","Type":"ContainerStarted","Data":"58c534d33df0f1f2ae518751044f3ebef889cd34588c50396cefd1f2856b4e4b"} Apr 24 22:38:04.041223 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:04.041179 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-g2zkf" podStartSLOduration=2.499188955 
podStartE2EDuration="7.041167325s" podCreationTimestamp="2026-04-24 22:37:57 +0000 UTC" firstStartedPulling="2026-04-24 22:37:58.568713476 +0000 UTC m=+464.618569331" lastFinishedPulling="2026-04-24 22:38:03.110691843 +0000 UTC m=+469.160547701" observedRunningTime="2026-04-24 22:38:04.039711051 +0000 UTC m=+470.089566929" watchObservedRunningTime="2026-04-24 22:38:04.041167325 +0000 UTC m=+470.091023201" Apr 24 22:38:05.094778 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:05.094745 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-b4d6b6964-qfdj4" Apr 24 22:38:05.095171 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:05.094791 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-b4d6b6964-qfdj4" Apr 24 22:38:05.099212 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:05.099189 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-b4d6b6964-qfdj4" Apr 24 22:38:06.035191 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:06.035164 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-b4d6b6964-qfdj4" Apr 24 22:38:06.085400 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:06.085367 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5685b685df-f9lgc"] Apr 24 22:38:31.111943 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:31.111889 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5685b685df-f9lgc" podUID="f9af5106-aa0e-4042-8609-9a787ce03588" containerName="console" containerID="cri-o://fa1f72788b57c7e5d200183f2c91d302d34562ed274b3e00bbae9e5c50c80739" gracePeriod=15 Apr 24 22:38:31.359000 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:31.358978 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-5685b685df-f9lgc_f9af5106-aa0e-4042-8609-9a787ce03588/console/0.log" Apr 24 22:38:31.359110 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:31.359037 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5685b685df-f9lgc" Apr 24 22:38:31.431224 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:31.431147 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2959\" (UniqueName: \"kubernetes.io/projected/f9af5106-aa0e-4042-8609-9a787ce03588-kube-api-access-z2959\") pod \"f9af5106-aa0e-4042-8609-9a787ce03588\" (UID: \"f9af5106-aa0e-4042-8609-9a787ce03588\") " Apr 24 22:38:31.431224 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:31.431183 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f9af5106-aa0e-4042-8609-9a787ce03588-console-config\") pod \"f9af5106-aa0e-4042-8609-9a787ce03588\" (UID: \"f9af5106-aa0e-4042-8609-9a787ce03588\") " Apr 24 22:38:31.431224 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:31.431216 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f9af5106-aa0e-4042-8609-9a787ce03588-service-ca\") pod \"f9af5106-aa0e-4042-8609-9a787ce03588\" (UID: \"f9af5106-aa0e-4042-8609-9a787ce03588\") " Apr 24 22:38:31.431500 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:31.431234 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f9af5106-aa0e-4042-8609-9a787ce03588-oauth-serving-cert\") pod \"f9af5106-aa0e-4042-8609-9a787ce03588\" (UID: \"f9af5106-aa0e-4042-8609-9a787ce03588\") " Apr 24 22:38:31.431500 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:31.431449 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f9af5106-aa0e-4042-8609-9a787ce03588-console-serving-cert\") pod \"f9af5106-aa0e-4042-8609-9a787ce03588\" (UID: \"f9af5106-aa0e-4042-8609-9a787ce03588\") " Apr 24 22:38:31.431604 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:31.431505 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9af5106-aa0e-4042-8609-9a787ce03588-trusted-ca-bundle\") pod \"f9af5106-aa0e-4042-8609-9a787ce03588\" (UID: \"f9af5106-aa0e-4042-8609-9a787ce03588\") " Apr 24 22:38:31.431604 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:31.431566 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f9af5106-aa0e-4042-8609-9a787ce03588-console-oauth-config\") pod \"f9af5106-aa0e-4042-8609-9a787ce03588\" (UID: \"f9af5106-aa0e-4042-8609-9a787ce03588\") " Apr 24 22:38:31.431604 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:31.431582 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9af5106-aa0e-4042-8609-9a787ce03588-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f9af5106-aa0e-4042-8609-9a787ce03588" (UID: "f9af5106-aa0e-4042-8609-9a787ce03588"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:38:31.431727 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:31.431598 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9af5106-aa0e-4042-8609-9a787ce03588-service-ca" (OuterVolumeSpecName: "service-ca") pod "f9af5106-aa0e-4042-8609-9a787ce03588" (UID: "f9af5106-aa0e-4042-8609-9a787ce03588"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:38:31.431727 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:31.431583 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9af5106-aa0e-4042-8609-9a787ce03588-console-config" (OuterVolumeSpecName: "console-config") pod "f9af5106-aa0e-4042-8609-9a787ce03588" (UID: "f9af5106-aa0e-4042-8609-9a787ce03588"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:38:31.431986 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:31.431959 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9af5106-aa0e-4042-8609-9a787ce03588-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f9af5106-aa0e-4042-8609-9a787ce03588" (UID: "f9af5106-aa0e-4042-8609-9a787ce03588"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:38:31.432108 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:31.432044 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9af5106-aa0e-4042-8609-9a787ce03588-trusted-ca-bundle\") on node \"ip-10-0-129-176.ec2.internal\" DevicePath \"\"" Apr 24 22:38:31.432108 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:31.432062 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f9af5106-aa0e-4042-8609-9a787ce03588-console-config\") on node \"ip-10-0-129-176.ec2.internal\" DevicePath \"\"" Apr 24 22:38:31.432108 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:31.432075 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f9af5106-aa0e-4042-8609-9a787ce03588-service-ca\") on node \"ip-10-0-129-176.ec2.internal\" DevicePath \"\"" Apr 24 22:38:31.432108 ip-10-0-129-176 kubenswrapper[2574]: I0424 
22:38:31.432088 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f9af5106-aa0e-4042-8609-9a787ce03588-oauth-serving-cert\") on node \"ip-10-0-129-176.ec2.internal\" DevicePath \"\"" Apr 24 22:38:31.433752 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:31.433716 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9af5106-aa0e-4042-8609-9a787ce03588-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f9af5106-aa0e-4042-8609-9a787ce03588" (UID: "f9af5106-aa0e-4042-8609-9a787ce03588"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:38:31.433752 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:31.433740 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9af5106-aa0e-4042-8609-9a787ce03588-kube-api-access-z2959" (OuterVolumeSpecName: "kube-api-access-z2959") pod "f9af5106-aa0e-4042-8609-9a787ce03588" (UID: "f9af5106-aa0e-4042-8609-9a787ce03588"). InnerVolumeSpecName "kube-api-access-z2959". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:38:31.433919 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:31.433770 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9af5106-aa0e-4042-8609-9a787ce03588-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f9af5106-aa0e-4042-8609-9a787ce03588" (UID: "f9af5106-aa0e-4042-8609-9a787ce03588"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:38:31.533279 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:31.533239 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f9af5106-aa0e-4042-8609-9a787ce03588-console-oauth-config\") on node \"ip-10-0-129-176.ec2.internal\" DevicePath \"\"" Apr 24 22:38:31.533279 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:31.533272 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z2959\" (UniqueName: \"kubernetes.io/projected/f9af5106-aa0e-4042-8609-9a787ce03588-kube-api-access-z2959\") on node \"ip-10-0-129-176.ec2.internal\" DevicePath \"\"" Apr 24 22:38:31.533279 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:31.533282 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f9af5106-aa0e-4042-8609-9a787ce03588-console-serving-cert\") on node \"ip-10-0-129-176.ec2.internal\" DevicePath \"\"" Apr 24 22:38:32.120059 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:32.120028 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5685b685df-f9lgc_f9af5106-aa0e-4042-8609-9a787ce03588/console/0.log" Apr 24 22:38:32.120493 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:32.120067 2574 generic.go:358] "Generic (PLEG): container finished" podID="f9af5106-aa0e-4042-8609-9a787ce03588" containerID="fa1f72788b57c7e5d200183f2c91d302d34562ed274b3e00bbae9e5c50c80739" exitCode=2 Apr 24 22:38:32.120493 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:32.120096 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5685b685df-f9lgc" event={"ID":"f9af5106-aa0e-4042-8609-9a787ce03588","Type":"ContainerDied","Data":"fa1f72788b57c7e5d200183f2c91d302d34562ed274b3e00bbae9e5c50c80739"} Apr 24 22:38:32.120493 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:32.120125 2574 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-console/console-5685b685df-f9lgc" Apr 24 22:38:32.120493 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:32.120144 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5685b685df-f9lgc" event={"ID":"f9af5106-aa0e-4042-8609-9a787ce03588","Type":"ContainerDied","Data":"4cdba999ff802d11d3b6d5d5febc1f6bd31b9a5dd4162cb3aee0f9b6b3e0c747"} Apr 24 22:38:32.120493 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:32.120158 2574 scope.go:117] "RemoveContainer" containerID="fa1f72788b57c7e5d200183f2c91d302d34562ed274b3e00bbae9e5c50c80739" Apr 24 22:38:32.128959 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:32.128943 2574 scope.go:117] "RemoveContainer" containerID="fa1f72788b57c7e5d200183f2c91d302d34562ed274b3e00bbae9e5c50c80739" Apr 24 22:38:32.129205 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:38:32.129187 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa1f72788b57c7e5d200183f2c91d302d34562ed274b3e00bbae9e5c50c80739\": container with ID starting with fa1f72788b57c7e5d200183f2c91d302d34562ed274b3e00bbae9e5c50c80739 not found: ID does not exist" containerID="fa1f72788b57c7e5d200183f2c91d302d34562ed274b3e00bbae9e5c50c80739" Apr 24 22:38:32.129250 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:32.129213 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa1f72788b57c7e5d200183f2c91d302d34562ed274b3e00bbae9e5c50c80739"} err="failed to get container status \"fa1f72788b57c7e5d200183f2c91d302d34562ed274b3e00bbae9e5c50c80739\": rpc error: code = NotFound desc = could not find container \"fa1f72788b57c7e5d200183f2c91d302d34562ed274b3e00bbae9e5c50c80739\": container with ID starting with fa1f72788b57c7e5d200183f2c91d302d34562ed274b3e00bbae9e5c50c80739 not found: ID does not exist" Apr 24 22:38:32.140763 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:32.140741 
2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5685b685df-f9lgc"] Apr 24 22:38:32.146085 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:32.146066 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5685b685df-f9lgc"] Apr 24 22:38:32.541944 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:38:32.541911 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9af5106-aa0e-4042-8609-9a787ce03588" path="/var/lib/kubelet/pods/f9af5106-aa0e-4042-8609-9a787ce03588/volumes" Apr 24 22:40:14.450399 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:40:14.450370 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6nnq9_6aaf0fb0-1f4a-46f7-a1db-82394fa8792a/console-operator/2.log" Apr 24 22:40:14.451160 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:40:14.451142 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6nnq9_6aaf0fb0-1f4a-46f7-a1db-82394fa8792a/console-operator/2.log" Apr 24 22:45:14.474829 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:45:14.474756 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6nnq9_6aaf0fb0-1f4a-46f7-a1db-82394fa8792a/console-operator/2.log" Apr 24 22:45:14.476944 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:45:14.476918 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6nnq9_6aaf0fb0-1f4a-46f7-a1db-82394fa8792a/console-operator/2.log" Apr 24 22:48:58.094941 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:48:58.094900 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pxmb8/must-gather-rkvn4"] Apr 24 22:48:58.097190 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:48:58.095220 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="f9af5106-aa0e-4042-8609-9a787ce03588" containerName="console" Apr 24 22:48:58.097190 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:48:58.095232 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9af5106-aa0e-4042-8609-9a787ce03588" containerName="console" Apr 24 22:48:58.097190 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:48:58.095293 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="f9af5106-aa0e-4042-8609-9a787ce03588" containerName="console" Apr 24 22:48:58.098038 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:48:58.098023 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pxmb8/must-gather-rkvn4" Apr 24 22:48:58.100587 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:48:58.100563 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-pxmb8\"/\"openshift-service-ca.crt\"" Apr 24 22:48:58.101722 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:48:58.101701 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-pxmb8\"/\"kube-root-ca.crt\"" Apr 24 22:48:58.101834 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:48:58.101704 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-pxmb8\"/\"default-dockercfg-zgbnd\"" Apr 24 22:48:58.113719 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:48:58.113696 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pxmb8/must-gather-rkvn4"] Apr 24 22:48:58.165372 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:48:58.165343 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d5fk\" (UniqueName: \"kubernetes.io/projected/42fb01e1-e838-4921-99ab-4917ff3b07c0-kube-api-access-9d5fk\") pod \"must-gather-rkvn4\" (UID: \"42fb01e1-e838-4921-99ab-4917ff3b07c0\") " pod="openshift-must-gather-pxmb8/must-gather-rkvn4" Apr 24 
22:48:58.165512 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:48:58.165391 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/42fb01e1-e838-4921-99ab-4917ff3b07c0-must-gather-output\") pod \"must-gather-rkvn4\" (UID: \"42fb01e1-e838-4921-99ab-4917ff3b07c0\") " pod="openshift-must-gather-pxmb8/must-gather-rkvn4" Apr 24 22:48:58.266111 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:48:58.266081 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9d5fk\" (UniqueName: \"kubernetes.io/projected/42fb01e1-e838-4921-99ab-4917ff3b07c0-kube-api-access-9d5fk\") pod \"must-gather-rkvn4\" (UID: \"42fb01e1-e838-4921-99ab-4917ff3b07c0\") " pod="openshift-must-gather-pxmb8/must-gather-rkvn4" Apr 24 22:48:58.266305 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:48:58.266133 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/42fb01e1-e838-4921-99ab-4917ff3b07c0-must-gather-output\") pod \"must-gather-rkvn4\" (UID: \"42fb01e1-e838-4921-99ab-4917ff3b07c0\") " pod="openshift-must-gather-pxmb8/must-gather-rkvn4" Apr 24 22:48:58.266487 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:48:58.266464 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/42fb01e1-e838-4921-99ab-4917ff3b07c0-must-gather-output\") pod \"must-gather-rkvn4\" (UID: \"42fb01e1-e838-4921-99ab-4917ff3b07c0\") " pod="openshift-must-gather-pxmb8/must-gather-rkvn4" Apr 24 22:48:58.274401 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:48:58.274377 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d5fk\" (UniqueName: \"kubernetes.io/projected/42fb01e1-e838-4921-99ab-4917ff3b07c0-kube-api-access-9d5fk\") pod \"must-gather-rkvn4\" (UID: 
\"42fb01e1-e838-4921-99ab-4917ff3b07c0\") " pod="openshift-must-gather-pxmb8/must-gather-rkvn4" Apr 24 22:48:58.407923 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:48:58.407792 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pxmb8/must-gather-rkvn4" Apr 24 22:48:58.546909 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:48:58.546860 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pxmb8/must-gather-rkvn4"] Apr 24 22:48:58.549887 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:48:58.549844 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42fb01e1_e838_4921_99ab_4917ff3b07c0.slice/crio-d7f2df4d134eab7e187ecff37720c178a0beaaebbffeafaa993ab48f501acb24 WatchSource:0}: Error finding container d7f2df4d134eab7e187ecff37720c178a0beaaebbffeafaa993ab48f501acb24: Status 404 returned error can't find the container with id d7f2df4d134eab7e187ecff37720c178a0beaaebbffeafaa993ab48f501acb24 Apr 24 22:48:58.551450 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:48:58.551436 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:48:59.185373 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:48:59.185329 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pxmb8/must-gather-rkvn4" event={"ID":"42fb01e1-e838-4921-99ab-4917ff3b07c0","Type":"ContainerStarted","Data":"d7f2df4d134eab7e187ecff37720c178a0beaaebbffeafaa993ab48f501acb24"} Apr 24 22:49:03.201392 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:03.201296 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pxmb8/must-gather-rkvn4" event={"ID":"42fb01e1-e838-4921-99ab-4917ff3b07c0","Type":"ContainerStarted","Data":"3a7ba071f942950c1aff6f020857332313af2c014773c500d3f65959c32de049"} Apr 24 22:49:03.201392 ip-10-0-129-176 kubenswrapper[2574]: I0424 
22:49:03.201338 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pxmb8/must-gather-rkvn4" event={"ID":"42fb01e1-e838-4921-99ab-4917ff3b07c0","Type":"ContainerStarted","Data":"ca060330384a5a1ff803ca23dee85e6f384b1cc9e488ce502fe30070b790fc9c"} Apr 24 22:49:11.774278 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:11.774244 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-6898f6cd9b-c68zq_b3432e60-bf25-4ee8-876a-a1ee3c4b5846/router/0.log" Apr 24 22:49:15.244843 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:15.244806 2574 generic.go:358] "Generic (PLEG): container finished" podID="42fb01e1-e838-4921-99ab-4917ff3b07c0" containerID="ca060330384a5a1ff803ca23dee85e6f384b1cc9e488ce502fe30070b790fc9c" exitCode=0 Apr 24 22:49:15.245287 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:15.244852 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pxmb8/must-gather-rkvn4" event={"ID":"42fb01e1-e838-4921-99ab-4917ff3b07c0","Type":"ContainerDied","Data":"ca060330384a5a1ff803ca23dee85e6f384b1cc9e488ce502fe30070b790fc9c"} Apr 24 22:49:15.245287 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:15.245151 2574 scope.go:117] "RemoveContainer" containerID="ca060330384a5a1ff803ca23dee85e6f384b1cc9e488ce502fe30070b790fc9c" Apr 24 22:49:15.362265 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:15.362232 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pxmb8_must-gather-rkvn4_42fb01e1-e838-4921-99ab-4917ff3b07c0/gather/0.log" Apr 24 22:49:15.852169 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:15.852085 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5lfh5/must-gather-f9hcz"] Apr 24 22:49:15.855957 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:15.855923 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5lfh5/must-gather-f9hcz" Apr 24 22:49:15.858476 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:15.858453 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5lfh5\"/\"openshift-service-ca.crt\"" Apr 24 22:49:15.858609 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:15.858485 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-5lfh5\"/\"default-dockercfg-p7s87\"" Apr 24 22:49:15.859516 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:15.859497 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5lfh5\"/\"kube-root-ca.crt\"" Apr 24 22:49:15.862195 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:15.862170 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5lfh5/must-gather-f9hcz"] Apr 24 22:49:16.018166 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:16.018131 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6b9b55f6-d069-44e3-8799-435a9381ea57-must-gather-output\") pod \"must-gather-f9hcz\" (UID: \"6b9b55f6-d069-44e3-8799-435a9381ea57\") " pod="openshift-must-gather-5lfh5/must-gather-f9hcz" Apr 24 22:49:16.018166 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:16.018168 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbv7d\" (UniqueName: \"kubernetes.io/projected/6b9b55f6-d069-44e3-8799-435a9381ea57-kube-api-access-qbv7d\") pod \"must-gather-f9hcz\" (UID: \"6b9b55f6-d069-44e3-8799-435a9381ea57\") " pod="openshift-must-gather-5lfh5/must-gather-f9hcz" Apr 24 22:49:16.119253 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:16.119154 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/6b9b55f6-d069-44e3-8799-435a9381ea57-must-gather-output\") pod \"must-gather-f9hcz\" (UID: \"6b9b55f6-d069-44e3-8799-435a9381ea57\") " pod="openshift-must-gather-5lfh5/must-gather-f9hcz" Apr 24 22:49:16.119253 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:16.119206 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qbv7d\" (UniqueName: \"kubernetes.io/projected/6b9b55f6-d069-44e3-8799-435a9381ea57-kube-api-access-qbv7d\") pod \"must-gather-f9hcz\" (UID: \"6b9b55f6-d069-44e3-8799-435a9381ea57\") " pod="openshift-must-gather-5lfh5/must-gather-f9hcz" Apr 24 22:49:16.119582 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:16.119560 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6b9b55f6-d069-44e3-8799-435a9381ea57-must-gather-output\") pod \"must-gather-f9hcz\" (UID: \"6b9b55f6-d069-44e3-8799-435a9381ea57\") " pod="openshift-must-gather-5lfh5/must-gather-f9hcz" Apr 24 22:49:16.127531 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:16.127494 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbv7d\" (UniqueName: \"kubernetes.io/projected/6b9b55f6-d069-44e3-8799-435a9381ea57-kube-api-access-qbv7d\") pod \"must-gather-f9hcz\" (UID: \"6b9b55f6-d069-44e3-8799-435a9381ea57\") " pod="openshift-must-gather-5lfh5/must-gather-f9hcz" Apr 24 22:49:16.166496 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:16.166466 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5lfh5/must-gather-f9hcz" Apr 24 22:49:16.287112 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:16.287081 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5lfh5/must-gather-f9hcz"] Apr 24 22:49:16.290005 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:49:16.289976 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b9b55f6_d069_44e3_8799_435a9381ea57.slice/crio-27f73e660015cfc7248e9164959628ab06653f668102676734094e845928a5d8 WatchSource:0}: Error finding container 27f73e660015cfc7248e9164959628ab06653f668102676734094e845928a5d8: Status 404 returned error can't find the container with id 27f73e660015cfc7248e9164959628ab06653f668102676734094e845928a5d8 Apr 24 22:49:17.254513 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:17.254414 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5lfh5/must-gather-f9hcz" event={"ID":"6b9b55f6-d069-44e3-8799-435a9381ea57","Type":"ContainerStarted","Data":"e47732da3170880d61831d14022286eda99782f0e54a089323a26198bf0a87be"} Apr 24 22:49:17.254513 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:17.254469 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5lfh5/must-gather-f9hcz" event={"ID":"6b9b55f6-d069-44e3-8799-435a9381ea57","Type":"ContainerStarted","Data":"27f73e660015cfc7248e9164959628ab06653f668102676734094e845928a5d8"} Apr 24 22:49:18.261073 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:18.261033 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5lfh5/must-gather-f9hcz" event={"ID":"6b9b55f6-d069-44e3-8799-435a9381ea57","Type":"ContainerStarted","Data":"b8a775a5a977e6140984c4de4e281917d9098e13af54e952fa4712e921cf3750"} Apr 24 22:49:18.277060 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:18.277000 2574 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-must-gather-5lfh5/must-gather-f9hcz" podStartSLOduration=2.497894608 podStartE2EDuration="3.276979853s" podCreationTimestamp="2026-04-24 22:49:15 +0000 UTC" firstStartedPulling="2026-04-24 22:49:16.291849713 +0000 UTC m=+1142.341705567" lastFinishedPulling="2026-04-24 22:49:17.070934956 +0000 UTC m=+1143.120790812" observedRunningTime="2026-04-24 22:49:18.275747013 +0000 UTC m=+1144.325602917" watchObservedRunningTime="2026-04-24 22:49:18.276979853 +0000 UTC m=+1144.326835732" Apr 24 22:49:18.459829 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:18.459795 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-7z4fb_49978182-365f-4022-9727-6a7530dcbc1e/global-pull-secret-syncer/0.log" Apr 24 22:49:18.705504 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:18.705474 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-vqh8g_10e80548-a374-4b56-8428-91554c6f203b/konnectivity-agent/0.log" Apr 24 22:49:18.737540 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:18.737513 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-176.ec2.internal_b3281f0d425c69d03b02f901cc1387c8/haproxy/0.log" Apr 24 22:49:20.689730 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:20.689687 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pxmb8/must-gather-rkvn4"] Apr 24 22:49:20.693064 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:20.693038 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pxmb8/must-gather-rkvn4"] Apr 24 22:49:20.693961 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:20.693467 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-pxmb8/must-gather-rkvn4" podUID="42fb01e1-e838-4921-99ab-4917ff3b07c0" containerName="copy" 
containerID="cri-o://3a7ba071f942950c1aff6f020857332313af2c014773c500d3f65959c32de049" gracePeriod=2 Apr 24 22:49:20.700068 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:20.700037 2574 status_manager.go:895] "Failed to get status for pod" podUID="42fb01e1-e838-4921-99ab-4917ff3b07c0" pod="openshift-must-gather-pxmb8/must-gather-rkvn4" err="pods \"must-gather-rkvn4\" is forbidden: User \"system:node:ip-10-0-129-176.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-pxmb8\": no relationship found between node 'ip-10-0-129-176.ec2.internal' and this object" Apr 24 22:49:21.144496 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:21.144408 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pxmb8_must-gather-rkvn4_42fb01e1-e838-4921-99ab-4917ff3b07c0/copy/0.log" Apr 24 22:49:21.145416 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:21.145153 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pxmb8/must-gather-rkvn4" Apr 24 22:49:21.149161 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:21.149119 2574 status_manager.go:895] "Failed to get status for pod" podUID="42fb01e1-e838-4921-99ab-4917ff3b07c0" pod="openshift-must-gather-pxmb8/must-gather-rkvn4" err="pods \"must-gather-rkvn4\" is forbidden: User \"system:node:ip-10-0-129-176.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-pxmb8\": no relationship found between node 'ip-10-0-129-176.ec2.internal' and this object" Apr 24 22:49:21.277417 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:21.275253 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9d5fk\" (UniqueName: \"kubernetes.io/projected/42fb01e1-e838-4921-99ab-4917ff3b07c0-kube-api-access-9d5fk\") pod \"42fb01e1-e838-4921-99ab-4917ff3b07c0\" (UID: \"42fb01e1-e838-4921-99ab-4917ff3b07c0\") " Apr 24 22:49:21.277417 
ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:21.275313 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/42fb01e1-e838-4921-99ab-4917ff3b07c0-must-gather-output\") pod \"42fb01e1-e838-4921-99ab-4917ff3b07c0\" (UID: \"42fb01e1-e838-4921-99ab-4917ff3b07c0\") " Apr 24 22:49:21.277417 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:21.276965 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42fb01e1-e838-4921-99ab-4917ff3b07c0-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "42fb01e1-e838-4921-99ab-4917ff3b07c0" (UID: "42fb01e1-e838-4921-99ab-4917ff3b07c0"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:49:21.283355 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:21.283262 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pxmb8_must-gather-rkvn4_42fb01e1-e838-4921-99ab-4917ff3b07c0/copy/0.log" Apr 24 22:49:21.284542 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:21.283830 2574 generic.go:358] "Generic (PLEG): container finished" podID="42fb01e1-e838-4921-99ab-4917ff3b07c0" containerID="3a7ba071f942950c1aff6f020857332313af2c014773c500d3f65959c32de049" exitCode=143 Apr 24 22:49:21.284542 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:21.283998 2574 scope.go:117] "RemoveContainer" containerID="3a7ba071f942950c1aff6f020857332313af2c014773c500d3f65959c32de049" Apr 24 22:49:21.284542 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:21.284141 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pxmb8/must-gather-rkvn4" Apr 24 22:49:21.291147 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:21.288069 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42fb01e1-e838-4921-99ab-4917ff3b07c0-kube-api-access-9d5fk" (OuterVolumeSpecName: "kube-api-access-9d5fk") pod "42fb01e1-e838-4921-99ab-4917ff3b07c0" (UID: "42fb01e1-e838-4921-99ab-4917ff3b07c0"). InnerVolumeSpecName "kube-api-access-9d5fk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:49:21.291147 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:21.288077 2574 status_manager.go:895] "Failed to get status for pod" podUID="42fb01e1-e838-4921-99ab-4917ff3b07c0" pod="openshift-must-gather-pxmb8/must-gather-rkvn4" err="pods \"must-gather-rkvn4\" is forbidden: User \"system:node:ip-10-0-129-176.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-pxmb8\": no relationship found between node 'ip-10-0-129-176.ec2.internal' and this object" Apr 24 22:49:21.300038 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:21.300007 2574 scope.go:117] "RemoveContainer" containerID="ca060330384a5a1ff803ca23dee85e6f384b1cc9e488ce502fe30070b790fc9c" Apr 24 22:49:21.326402 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:21.326316 2574 scope.go:117] "RemoveContainer" containerID="3a7ba071f942950c1aff6f020857332313af2c014773c500d3f65959c32de049" Apr 24 22:49:21.327000 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:49:21.326838 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a7ba071f942950c1aff6f020857332313af2c014773c500d3f65959c32de049\": container with ID starting with 3a7ba071f942950c1aff6f020857332313af2c014773c500d3f65959c32de049 not found: ID does not exist" containerID="3a7ba071f942950c1aff6f020857332313af2c014773c500d3f65959c32de049" Apr 24 22:49:21.327000 ip-10-0-129-176 
kubenswrapper[2574]: I0424 22:49:21.326906 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a7ba071f942950c1aff6f020857332313af2c014773c500d3f65959c32de049"} err="failed to get container status \"3a7ba071f942950c1aff6f020857332313af2c014773c500d3f65959c32de049\": rpc error: code = NotFound desc = could not find container \"3a7ba071f942950c1aff6f020857332313af2c014773c500d3f65959c32de049\": container with ID starting with 3a7ba071f942950c1aff6f020857332313af2c014773c500d3f65959c32de049 not found: ID does not exist" Apr 24 22:49:21.327000 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:21.326933 2574 scope.go:117] "RemoveContainer" containerID="ca060330384a5a1ff803ca23dee85e6f384b1cc9e488ce502fe30070b790fc9c" Apr 24 22:49:21.327582 ip-10-0-129-176 kubenswrapper[2574]: E0424 22:49:21.327553 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca060330384a5a1ff803ca23dee85e6f384b1cc9e488ce502fe30070b790fc9c\": container with ID starting with ca060330384a5a1ff803ca23dee85e6f384b1cc9e488ce502fe30070b790fc9c not found: ID does not exist" containerID="ca060330384a5a1ff803ca23dee85e6f384b1cc9e488ce502fe30070b790fc9c" Apr 24 22:49:21.327695 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:21.327596 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca060330384a5a1ff803ca23dee85e6f384b1cc9e488ce502fe30070b790fc9c"} err="failed to get container status \"ca060330384a5a1ff803ca23dee85e6f384b1cc9e488ce502fe30070b790fc9c\": rpc error: code = NotFound desc = could not find container \"ca060330384a5a1ff803ca23dee85e6f384b1cc9e488ce502fe30070b790fc9c\": container with ID starting with ca060330384a5a1ff803ca23dee85e6f384b1cc9e488ce502fe30070b790fc9c not found: ID does not exist" Apr 24 22:49:21.376333 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:21.376294 2574 reconciler_common.go:299] "Volume detached 
for volume \"kube-api-access-9d5fk\" (UniqueName: \"kubernetes.io/projected/42fb01e1-e838-4921-99ab-4917ff3b07c0-kube-api-access-9d5fk\") on node \"ip-10-0-129-176.ec2.internal\" DevicePath \"\"" Apr 24 22:49:21.376333 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:21.376333 2574 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/42fb01e1-e838-4921-99ab-4917ff3b07c0-must-gather-output\") on node \"ip-10-0-129-176.ec2.internal\" DevicePath \"\"" Apr 24 22:49:21.600295 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:21.600246 2574 status_manager.go:895] "Failed to get status for pod" podUID="42fb01e1-e838-4921-99ab-4917ff3b07c0" pod="openshift-must-gather-pxmb8/must-gather-rkvn4" err="pods \"must-gather-rkvn4\" is forbidden: User \"system:node:ip-10-0-129-176.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-pxmb8\": no relationship found between node 'ip-10-0-129-176.ec2.internal' and this object" Apr 24 22:49:22.538336 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:22.538303 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42fb01e1-e838-4921-99ab-4917ff3b07c0" path="/var/lib/kubelet/pods/42fb01e1-e838-4921-99ab-4917ff3b07c0/volumes" Apr 24 22:49:22.815682 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:22.815592 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-g2zkf_0a617d1b-ced0-42bf-a30e-385b14d42d96/kuadrant-console-plugin/0.log" Apr 24 22:49:23.868384 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:23.868348 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-22dqh_b2412ae3-7a69-4505-b419-5d96e93a567c/cluster-monitoring-operator/0.log" Apr 24 22:49:23.891298 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:23.891257 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-7jvfr_867be6f7-bbb4-46de-b65e-2de56e6995cb/kube-state-metrics/0.log" Apr 24 22:49:23.918625 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:23.918567 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-7jvfr_867be6f7-bbb4-46de-b65e-2de56e6995cb/kube-rbac-proxy-main/0.log" Apr 24 22:49:23.945316 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:23.945290 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-7jvfr_867be6f7-bbb4-46de-b65e-2de56e6995cb/kube-rbac-proxy-self/0.log" Apr 24 22:49:23.997323 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:23.997297 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-nw82c_c7a56dd6-0aa1-4189-9b4e-e176c8e9aece/monitoring-plugin/0.log" Apr 24 22:49:24.220459 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:24.220421 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tpkcp_e1635d2a-643f-4246-8704-77815f21915e/node-exporter/0.log" Apr 24 22:49:24.246416 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:24.246385 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tpkcp_e1635d2a-643f-4246-8704-77815f21915e/kube-rbac-proxy/0.log" Apr 24 22:49:24.282794 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:24.282760 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tpkcp_e1635d2a-643f-4246-8704-77815f21915e/init-textfile/0.log" Apr 24 22:49:24.540200 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:24.540165 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-c49zn_dc88cfaa-dbaf-4194-a0fe-e1ca69e89e4e/prometheus-operator/0.log" Apr 24 22:49:24.559658 ip-10-0-129-176 
kubenswrapper[2574]: I0424 22:49:24.559624 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-c49zn_dc88cfaa-dbaf-4194-a0fe-e1ca69e89e4e/kube-rbac-proxy/0.log" Apr 24 22:49:24.581514 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:24.581483 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-t4xmx_7c297fbf-2d2d-4390-9b2e-c7ef5d4a38bd/prometheus-operator-admission-webhook/0.log" Apr 24 22:49:24.612410 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:24.612379 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-65c9867bbc-b7tk9_e0147744-e2f5-44c3-87b7-4c69075afa43/telemeter-client/0.log" Apr 24 22:49:24.633352 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:24.633322 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-65c9867bbc-b7tk9_e0147744-e2f5-44c3-87b7-4c69075afa43/reload/0.log" Apr 24 22:49:24.657540 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:24.657512 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-65c9867bbc-b7tk9_e0147744-e2f5-44c3-87b7-4c69075afa43/kube-rbac-proxy/0.log" Apr 24 22:49:26.316780 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:26.316742 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6nnq9_6aaf0fb0-1f4a-46f7-a1db-82394fa8792a/console-operator/2.log" Apr 24 22:49:26.324429 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:26.324401 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6nnq9_6aaf0fb0-1f4a-46f7-a1db-82394fa8792a/console-operator/3.log" Apr 24 22:49:26.723680 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:26.723601 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-b4d6b6964-qfdj4_30ab53a3-5d80-4449-af21-7cb13656bbb7/console/0.log" Apr 24 22:49:27.064018 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:27.063980 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5lfh5/perf-node-gather-daemonset-qc6vn"] Apr 24 22:49:27.064506 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:27.064485 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42fb01e1-e838-4921-99ab-4917ff3b07c0" containerName="copy" Apr 24 22:49:27.064602 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:27.064511 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="42fb01e1-e838-4921-99ab-4917ff3b07c0" containerName="copy" Apr 24 22:49:27.064602 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:27.064530 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42fb01e1-e838-4921-99ab-4917ff3b07c0" containerName="gather" Apr 24 22:49:27.064602 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:27.064540 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="42fb01e1-e838-4921-99ab-4917ff3b07c0" containerName="gather" Apr 24 22:49:27.064770 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:27.064631 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="42fb01e1-e838-4921-99ab-4917ff3b07c0" containerName="copy" Apr 24 22:49:27.064770 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:27.064649 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="42fb01e1-e838-4921-99ab-4917ff3b07c0" containerName="gather" Apr 24 22:49:27.069618 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:27.069587 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-qc6vn" Apr 24 22:49:27.076035 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:27.076003 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5lfh5/perf-node-gather-daemonset-qc6vn"] Apr 24 22:49:27.131700 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:27.131667 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pjcb\" (UniqueName: \"kubernetes.io/projected/b759c3b6-e508-4178-b0ff-dd8fa2fe3db1-kube-api-access-4pjcb\") pod \"perf-node-gather-daemonset-qc6vn\" (UID: \"b759c3b6-e508-4178-b0ff-dd8fa2fe3db1\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-qc6vn" Apr 24 22:49:27.131893 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:27.131727 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b759c3b6-e508-4178-b0ff-dd8fa2fe3db1-podres\") pod \"perf-node-gather-daemonset-qc6vn\" (UID: \"b759c3b6-e508-4178-b0ff-dd8fa2fe3db1\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-qc6vn" Apr 24 22:49:27.131893 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:27.131772 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b759c3b6-e508-4178-b0ff-dd8fa2fe3db1-sys\") pod \"perf-node-gather-daemonset-qc6vn\" (UID: \"b759c3b6-e508-4178-b0ff-dd8fa2fe3db1\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-qc6vn" Apr 24 22:49:27.131893 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:27.131791 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b759c3b6-e508-4178-b0ff-dd8fa2fe3db1-proc\") pod \"perf-node-gather-daemonset-qc6vn\" (UID: \"b759c3b6-e508-4178-b0ff-dd8fa2fe3db1\") " 
pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-qc6vn" Apr 24 22:49:27.132050 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:27.131926 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b759c3b6-e508-4178-b0ff-dd8fa2fe3db1-lib-modules\") pod \"perf-node-gather-daemonset-qc6vn\" (UID: \"b759c3b6-e508-4178-b0ff-dd8fa2fe3db1\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-qc6vn" Apr 24 22:49:27.221490 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:27.221453 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-z5pwv_489b688c-0857-4d11-95f2-90cdf9578a51/volume-data-source-validator/0.log" Apr 24 22:49:27.232740 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:27.232715 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b759c3b6-e508-4178-b0ff-dd8fa2fe3db1-lib-modules\") pod \"perf-node-gather-daemonset-qc6vn\" (UID: \"b759c3b6-e508-4178-b0ff-dd8fa2fe3db1\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-qc6vn" Apr 24 22:49:27.232907 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:27.232754 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4pjcb\" (UniqueName: \"kubernetes.io/projected/b759c3b6-e508-4178-b0ff-dd8fa2fe3db1-kube-api-access-4pjcb\") pod \"perf-node-gather-daemonset-qc6vn\" (UID: \"b759c3b6-e508-4178-b0ff-dd8fa2fe3db1\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-qc6vn" Apr 24 22:49:27.232907 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:27.232788 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b759c3b6-e508-4178-b0ff-dd8fa2fe3db1-podres\") pod \"perf-node-gather-daemonset-qc6vn\" (UID: 
\"b759c3b6-e508-4178-b0ff-dd8fa2fe3db1\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-qc6vn" Apr 24 22:49:27.232907 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:27.232827 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b759c3b6-e508-4178-b0ff-dd8fa2fe3db1-sys\") pod \"perf-node-gather-daemonset-qc6vn\" (UID: \"b759c3b6-e508-4178-b0ff-dd8fa2fe3db1\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-qc6vn" Apr 24 22:49:27.232907 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:27.232854 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b759c3b6-e508-4178-b0ff-dd8fa2fe3db1-proc\") pod \"perf-node-gather-daemonset-qc6vn\" (UID: \"b759c3b6-e508-4178-b0ff-dd8fa2fe3db1\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-qc6vn" Apr 24 22:49:27.233071 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:27.232910 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b759c3b6-e508-4178-b0ff-dd8fa2fe3db1-lib-modules\") pod \"perf-node-gather-daemonset-qc6vn\" (UID: \"b759c3b6-e508-4178-b0ff-dd8fa2fe3db1\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-qc6vn" Apr 24 22:49:27.233071 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:27.232928 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b759c3b6-e508-4178-b0ff-dd8fa2fe3db1-sys\") pod \"perf-node-gather-daemonset-qc6vn\" (UID: \"b759c3b6-e508-4178-b0ff-dd8fa2fe3db1\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-qc6vn" Apr 24 22:49:27.233071 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:27.232953 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: 
\"kubernetes.io/host-path/b759c3b6-e508-4178-b0ff-dd8fa2fe3db1-proc\") pod \"perf-node-gather-daemonset-qc6vn\" (UID: \"b759c3b6-e508-4178-b0ff-dd8fa2fe3db1\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-qc6vn" Apr 24 22:49:27.233071 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:27.232993 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b759c3b6-e508-4178-b0ff-dd8fa2fe3db1-podres\") pod \"perf-node-gather-daemonset-qc6vn\" (UID: \"b759c3b6-e508-4178-b0ff-dd8fa2fe3db1\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-qc6vn" Apr 24 22:49:27.241902 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:27.241849 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pjcb\" (UniqueName: \"kubernetes.io/projected/b759c3b6-e508-4178-b0ff-dd8fa2fe3db1-kube-api-access-4pjcb\") pod \"perf-node-gather-daemonset-qc6vn\" (UID: \"b759c3b6-e508-4178-b0ff-dd8fa2fe3db1\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-qc6vn" Apr 24 22:49:27.387951 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:27.387860 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-qc6vn"
Apr 24 22:49:27.533384 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:27.532923 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5lfh5/perf-node-gather-daemonset-qc6vn"]
Apr 24 22:49:27.540341 ip-10-0-129-176 kubenswrapper[2574]: W0424 22:49:27.539640 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb759c3b6_e508_4178_b0ff_dd8fa2fe3db1.slice/crio-21d4a10c6930a425a14daad118eadc7b3327c746b73333fd543148b9561e457a WatchSource:0}: Error finding container 21d4a10c6930a425a14daad118eadc7b3327c746b73333fd543148b9561e457a: Status 404 returned error can't find the container with id 21d4a10c6930a425a14daad118eadc7b3327c746b73333fd543148b9561e457a
Apr 24 22:49:27.995758 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:27.995683 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-mfpb9_b3ad74e7-fb86-475b-88a4-5b2f7848cd68/dns/0.log"
Apr 24 22:49:28.013193 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:28.013163 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-mfpb9_b3ad74e7-fb86-475b-88a4-5b2f7848cd68/kube-rbac-proxy/0.log"
Apr 24 22:49:28.088331 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:28.088301 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-t25xz_31be179c-c441-48a4-8779-593458646c77/dns-node-resolver/0.log"
Apr 24 22:49:28.315899 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:28.315852 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-qc6vn" event={"ID":"b759c3b6-e508-4178-b0ff-dd8fa2fe3db1","Type":"ContainerStarted","Data":"32df91550ac2e2d0217678fe145c5e57398b3b0af85508eb63871bf223d2a6eb"}
Apr 24 22:49:28.315899 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:28.315904 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-qc6vn" event={"ID":"b759c3b6-e508-4178-b0ff-dd8fa2fe3db1","Type":"ContainerStarted","Data":"21d4a10c6930a425a14daad118eadc7b3327c746b73333fd543148b9561e457a"}
Apr 24 22:49:28.316126 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:28.316012 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-qc6vn"
Apr 24 22:49:28.499145 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:28.499118 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-846bc59fcf-t65jg_9bb13040-efc6-4698-8a0a-b1270f5d0998/registry/0.log"
Apr 24 22:49:28.559899 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:28.559843 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-tb9tr_ed31fcf6-85e1-43d1-86ff-16bb6763e3e6/node-ca/0.log"
Apr 24 22:49:29.296935 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:29.296904 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-6898f6cd9b-c68zq_b3432e60-bf25-4ee8-876a-a1ee3c4b5846/router/0.log"
Apr 24 22:49:29.743321 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:29.743256 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-q7rg6_6f622f7a-7d90-4dd1-af3f-3f27ebad181a/serve-healthcheck-canary/0.log"
Apr 24 22:49:30.297108 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:30.297078 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xs49w_c7c3582c-44dd-492c-b4ba-7bd36140280c/kube-rbac-proxy/0.log"
Apr 24 22:49:30.316323 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:30.316302 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xs49w_c7c3582c-44dd-492c-b4ba-7bd36140280c/exporter/0.log"
Apr 24 22:49:30.340500 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:30.340472 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xs49w_c7c3582c-44dd-492c-b4ba-7bd36140280c/extractor/0.log"
Apr 24 22:49:32.345031 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:32.344999 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-859c5c9fd7-2flfv_92c004d9-1387-4f01-a97b-ce4eb2153936/manager/0.log"
Apr 24 22:49:32.386661 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:32.386633 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-prjrt_535c70f5-87b5-4a78-8666-36f332fe14fc/openshift-lws-operator/0.log"
Apr 24 22:49:34.331028 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:34.330954 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-qc6vn"
Apr 24 22:49:34.347456 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:34.347414 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-qc6vn" podStartSLOduration=7.347401426 podStartE2EDuration="7.347401426s" podCreationTimestamp="2026-04-24 22:49:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:49:28.333896204 +0000 UTC m=+1154.383752080" watchObservedRunningTime="2026-04-24 22:49:34.347401426 +0000 UTC m=+1160.397257335"
Apr 24 22:49:36.168892 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:36.168847 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-v29tm_9f025f22-b0e0-48ff-8928-ed22d22ab622/kube-storage-version-migrator-operator/1.log"
Apr 24 22:49:36.170851 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:36.170804 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-v29tm_9f025f22-b0e0-48ff-8928-ed22d22ab622/kube-storage-version-migrator-operator/0.log"
Apr 24 22:49:37.259811 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:37.259782 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2mz4f_4fb2f48b-e735-4e72-896c-724bf453529c/kube-multus-additional-cni-plugins/0.log"
Apr 24 22:49:37.277669 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:37.277642 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2mz4f_4fb2f48b-e735-4e72-896c-724bf453529c/egress-router-binary-copy/0.log"
Apr 24 22:49:37.296105 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:37.296082 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2mz4f_4fb2f48b-e735-4e72-896c-724bf453529c/cni-plugins/0.log"
Apr 24 22:49:37.313392 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:37.313369 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2mz4f_4fb2f48b-e735-4e72-896c-724bf453529c/bond-cni-plugin/0.log"
Apr 24 22:49:37.331218 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:37.331196 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2mz4f_4fb2f48b-e735-4e72-896c-724bf453529c/routeoverride-cni/0.log"
Apr 24 22:49:37.349307 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:37.349279 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2mz4f_4fb2f48b-e735-4e72-896c-724bf453529c/whereabouts-cni-bincopy/0.log"
Apr 24 22:49:37.370559 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:37.370543 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2mz4f_4fb2f48b-e735-4e72-896c-724bf453529c/whereabouts-cni/0.log"
Apr 24 22:49:37.735414 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:37.735385 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jdw5l_ef462afb-90e9-45bc-9e65-2b9c01d7f73a/kube-multus/0.log"
Apr 24 22:49:37.819264 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:37.819231 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-n8ntw_fd77a426-e63b-4027-97b7-e9893fd72601/network-metrics-daemon/0.log"
Apr 24 22:49:37.835851 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:37.835817 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-n8ntw_fd77a426-e63b-4027-97b7-e9893fd72601/kube-rbac-proxy/0.log"
Apr 24 22:49:39.219381 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:39.219347 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsvks_b188c294-c06f-4cc9-ab29-5edd0333288d/ovn-controller/0.log"
Apr 24 22:49:39.246286 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:39.246254 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsvks_b188c294-c06f-4cc9-ab29-5edd0333288d/ovn-acl-logging/0.log"
Apr 24 22:49:39.265511 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:39.265481 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsvks_b188c294-c06f-4cc9-ab29-5edd0333288d/kube-rbac-proxy-node/0.log"
Apr 24 22:49:39.283321 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:39.283291 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsvks_b188c294-c06f-4cc9-ab29-5edd0333288d/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 22:49:39.298654 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:39.298634 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsvks_b188c294-c06f-4cc9-ab29-5edd0333288d/northd/0.log"
Apr 24 22:49:39.316845 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:39.316825 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsvks_b188c294-c06f-4cc9-ab29-5edd0333288d/nbdb/0.log"
Apr 24 22:49:39.337683 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:39.337657 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsvks_b188c294-c06f-4cc9-ab29-5edd0333288d/sbdb/0.log"
Apr 24 22:49:39.517614 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:39.517539 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsvks_b188c294-c06f-4cc9-ab29-5edd0333288d/ovnkube-controller/0.log"
Apr 24 22:49:40.554654 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:40.554626 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-pwmng_7679822c-7212-4ac5-9de4-a4486c413686/check-endpoints/0.log"
Apr 24 22:49:40.596136 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:40.596103 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-lh47p_bc853d31-7ceb-415d-a71b-f18ee833a50c/network-check-target-container/0.log"
Apr 24 22:49:41.547222 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:41.547187 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-nzrnj_bc479082-3849-4902-830a-4a785973b983/iptables-alerter/0.log"
Apr 24 22:49:42.157969 ip-10-0-129-176 kubenswrapper[2574]: I0424 22:49:42.157939 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-tlfts_04b95921-75f8-4261-8992-f655aedb0790/tuned/0.log"