Apr 24 14:21:52.410545 ip-10-0-137-95 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 14:21:52.410562 ip-10-0-137-95 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 14:21:52.410572 ip-10-0-137-95 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 14:21:52.410947 ip-10-0-137-95 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 14:22:02.598561 ip-10-0-137-95 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 14:22:02.598581 ip-10-0-137-95 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 6d6b9cb6561d472ab2ee3958868f8046 --
Apr 24 14:24:23.748275 ip-10-0-137-95 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 14:24:24.207115 ip-10-0-137-95 kubenswrapper[2571]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 14:24:24.207115 ip-10-0-137-95 kubenswrapper[2571]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 14:24:24.207115 ip-10-0-137-95 kubenswrapper[2571]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 14:24:24.207115 ip-10-0-137-95 kubenswrapper[2571]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 14:24:24.207115 ip-10-0-137-95 kubenswrapper[2571]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 14:24:24.208898 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.208815 2571 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 14:24:24.212052 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212036 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 14:24:24.212052 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212052 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 14:24:24.212113 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212057 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 14:24:24.212113 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212060 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 14:24:24.212113 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212062 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 14:24:24.212113 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212066 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 14:24:24.212113 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212068 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 14:24:24.212113 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212072 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 14:24:24.212113 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212075 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 14:24:24.212113 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212077 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 14:24:24.212113 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212080 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 14:24:24.212113 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212083 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 14:24:24.212113 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212085 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 14:24:24.212113 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212088 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 14:24:24.212113 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212090 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 14:24:24.212113 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212093 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 14:24:24.212113 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212096 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 14:24:24.212113 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212099 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 14:24:24.212113 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212101 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 14:24:24.212113 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212104 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 14:24:24.212113 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212113 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 14:24:24.212113 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212117 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 14:24:24.212582 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212120 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 14:24:24.212582 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212123 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 14:24:24.212582 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212126 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 14:24:24.212582 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212129 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 14:24:24.212582 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212131 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 14:24:24.212582 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212133 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 14:24:24.212582 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212136 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 14:24:24.212582 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212139 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 14:24:24.212582 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212141 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 14:24:24.212582 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212143 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 14:24:24.212582 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212146 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 14:24:24.212582 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212149 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 14:24:24.212582 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212152 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 14:24:24.212582 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212155 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 14:24:24.212582 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212157 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 14:24:24.212582 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212160 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 14:24:24.212582 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212162 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 14:24:24.212582 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212165 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 14:24:24.212582 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212167 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 14:24:24.212582 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212170 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 14:24:24.213089 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212173 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 14:24:24.213089 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212176 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 14:24:24.213089 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212178 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 14:24:24.213089 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212181 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 14:24:24.213089 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212184 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 14:24:24.213089 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212187 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 14:24:24.213089 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212190 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 14:24:24.213089 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212192 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 14:24:24.213089 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212194 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 14:24:24.213089 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212197 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 14:24:24.213089 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212199 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 14:24:24.213089 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212202 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 14:24:24.213089 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212205 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 14:24:24.213089 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212208 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 14:24:24.213089 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212211 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 14:24:24.213089 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212214 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 14:24:24.213089 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212219 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 14:24:24.213089 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212223 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 14:24:24.213089 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212226 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 14:24:24.213549 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212230 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 14:24:24.213549 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212233 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 14:24:24.213549 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212235 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 14:24:24.213549 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212238 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 14:24:24.213549 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212240 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 14:24:24.213549 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212242 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 14:24:24.213549 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212245 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 14:24:24.213549 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212247 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 14:24:24.213549 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212250 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 14:24:24.213549 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212252 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 14:24:24.213549 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212256 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 14:24:24.213549 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212260 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 14:24:24.213549 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212263 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 14:24:24.213549 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212265 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 14:24:24.213549 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212268 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 14:24:24.213549 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212270 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 14:24:24.213549 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212273 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 14:24:24.213549 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212275 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 14:24:24.213549 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212278 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 14:24:24.213549 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212281 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 14:24:24.214107 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212283 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 14:24:24.214107 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212286 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 14:24:24.214107 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212288 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 14:24:24.214107 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212291 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 14:24:24.214107 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212293 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 14:24:24.214107 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212677 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 14:24:24.214107 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212683 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 14:24:24.214107 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212686 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 14:24:24.214107 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212689 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 14:24:24.214107 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212692 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 14:24:24.214107 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212695 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 14:24:24.214107 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212698 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 14:24:24.214107 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212701 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 14:24:24.214107 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212703 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 14:24:24.214107 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212707 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 14:24:24.214107 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212711 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 14:24:24.214107 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212714 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 14:24:24.214107 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212716 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 14:24:24.214107 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212719 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 14:24:24.214107 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212722 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 14:24:24.214620 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212725 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 14:24:24.214620 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212727 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 14:24:24.214620 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212729 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 14:24:24.214620 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212732 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 14:24:24.214620 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212734 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 14:24:24.214620 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212737 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 14:24:24.214620 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212740 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 14:24:24.214620 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212742 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 14:24:24.214620 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212745 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 14:24:24.214620 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212747 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 14:24:24.214620 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212750 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 14:24:24.214620 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212753 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 14:24:24.214620 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212756 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 14:24:24.214620 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212758 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 14:24:24.214620 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212761 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 14:24:24.214620 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212763 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 14:24:24.214620 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212766 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 14:24:24.214620 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212769 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 14:24:24.214620 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212772 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 14:24:24.214620 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212774 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 14:24:24.215181 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212777 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 14:24:24.215181 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212780 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 14:24:24.215181 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212782 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 14:24:24.215181 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212785 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 14:24:24.215181 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212787 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 14:24:24.215181 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212789 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 14:24:24.215181 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212792 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 14:24:24.215181 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212794 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 14:24:24.215181 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212797 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 14:24:24.215181 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212799 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 14:24:24.215181 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212802 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 14:24:24.215181 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212804 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 14:24:24.215181 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212807 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 14:24:24.215181 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212809 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 14:24:24.215181 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212812 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 14:24:24.215181 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212814 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 14:24:24.215181 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212816 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 14:24:24.215181 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212819 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 14:24:24.215181 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212821 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 14:24:24.215181 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212824 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 14:24:24.215692 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212826 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 14:24:24.215692 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212830 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 14:24:24.215692 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212833 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 14:24:24.215692 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212836 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 14:24:24.215692 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212838 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 14:24:24.215692 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212841 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 14:24:24.215692 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212844 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 14:24:24.215692 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212847 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 14:24:24.215692 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212849 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 14:24:24.215692 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212852 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 14:24:24.215692 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212855 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 14:24:24.215692 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212857 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 14:24:24.215692 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212860 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 14:24:24.215692 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212862 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 14:24:24.215692 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212865 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 14:24:24.215692 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212868 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 14:24:24.215692 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212872 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 14:24:24.215692 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212874 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 14:24:24.215692 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212877 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 14:24:24.216175 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212880 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 14:24:24.216175 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212882 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 14:24:24.216175 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212886 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 14:24:24.216175 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212888 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 14:24:24.216175 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212891 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 14:24:24.216175 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212893 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 14:24:24.216175 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212896 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 14:24:24.216175 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212898 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 14:24:24.216175 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212901 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 14:24:24.216175 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212903 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 14:24:24.216175 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212906 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 14:24:24.216175 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.212908 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 14:24:24.216175 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.212977 2571 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 14:24:24.216175 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213002 2571 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 14:24:24.216175 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213009 2571 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 14:24:24.216175 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213014 2571 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 14:24:24.216175 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213018 2571 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 14:24:24.216175 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213022 2571 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 14:24:24.216175 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213026 2571 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 14:24:24.216175 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213031 2571 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 14:24:24.216175 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213035 2571 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 14:24:24.216698 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213038 2571 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 14:24:24.216698 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213041 2571 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 14:24:24.216698 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213045 2571 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 14:24:24.216698 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213048 2571 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 14:24:24.216698 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213051 2571 flags.go:64] FLAG: --cgroup-root=""
Apr 24 14:24:24.216698 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213054 2571 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 14:24:24.216698 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213057 2571 flags.go:64] FLAG: --client-ca-file=""
Apr 24 14:24:24.216698 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213060 2571 flags.go:64] FLAG: --cloud-config=""
Apr 24 14:24:24.216698 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213063 2571 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 14:24:24.216698 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213066 2571 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 14:24:24.216698 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213070 2571 flags.go:64] FLAG: --cluster-domain=""
Apr 24 14:24:24.216698 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213073 2571 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 14:24:24.216698 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213076 2571 flags.go:64] FLAG: --config-dir=""
Apr 24 14:24:24.216698 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213079 2571 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 14:24:24.216698 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213082 2571 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 14:24:24.216698 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213086 2571 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 14:24:24.216698 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213090 2571 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 14:24:24.216698 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213093 2571 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 14:24:24.216698 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213097 2571 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 14:24:24.216698 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213100 2571 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 14:24:24.216698 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213103 2571 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 14:24:24.216698 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213106 2571 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 14:24:24.216698 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213110 2571 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 14:24:24.216698 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213113 2571 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 14:24:24.216698 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213117 2571 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 14:24:24.217322 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213121 2571 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 14:24:24.217322 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213125 2571 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 14:24:24.217322 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213127 2571 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 14:24:24.217322 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213131 2571 flags.go:64] FLAG: --enable-server="true"
Apr 24 14:24:24.217322 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213134 2571 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 14:24:24.217322 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213138 2571 flags.go:64] FLAG: --event-burst="100"
Apr 24 14:24:24.217322 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213142 2571 flags.go:64] FLAG: --event-qps="50"
Apr 24 14:24:24.217322 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213145 2571 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 14:24:24.217322 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213148 2571 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 14:24:24.217322 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213151 2571 flags.go:64] FLAG: --eviction-hard=""
Apr 24 14:24:24.217322 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213155 2571 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 14:24:24.217322 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213157 2571 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 14:24:24.217322 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213161 2571 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 14:24:24.217322 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213164 2571 flags.go:64] FLAG: --eviction-soft=""
Apr 24 14:24:24.217322 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213167 2571 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 14:24:24.217322 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213169 2571 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 14:24:24.217322 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213172 2571 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 14:24:24.217322 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213175 2571 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 14:24:24.217322 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213178 2571 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 14:24:24.217322 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213181 2571 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 14:24:24.217322 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213184 2571 flags.go:64] FLAG: --feature-gates=""
Apr 24 14:24:24.217322 ip-10-0-137-95 kubenswrapper[2571]: I0424
14:24:24.213188 2571 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 14:24:24.217322 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213191 2571 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 14:24:24.217322 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213194 2571 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 14:24:24.217322 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213197 2571 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 14:24:24.217931 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213200 2571 flags.go:64] FLAG: --healthz-port="10248" Apr 24 14:24:24.217931 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213203 2571 flags.go:64] FLAG: --help="false" Apr 24 14:24:24.217931 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213208 2571 flags.go:64] FLAG: --hostname-override="ip-10-0-137-95.ec2.internal" Apr 24 14:24:24.217931 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213211 2571 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 14:24:24.217931 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213214 2571 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 14:24:24.217931 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213217 2571 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 14:24:24.217931 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213220 2571 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 14:24:24.217931 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213223 2571 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 14:24:24.217931 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213227 2571 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 14:24:24.217931 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213230 2571 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 14:24:24.217931 
ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213233 2571 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 14:24:24.217931 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213236 2571 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 14:24:24.217931 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213239 2571 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 14:24:24.217931 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213242 2571 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 14:24:24.217931 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213245 2571 flags.go:64] FLAG: --kube-reserved="" Apr 24 14:24:24.217931 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213248 2571 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 14:24:24.217931 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213251 2571 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 14:24:24.217931 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213254 2571 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 14:24:24.217931 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213257 2571 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 14:24:24.217931 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213259 2571 flags.go:64] FLAG: --lock-file="" Apr 24 14:24:24.217931 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213262 2571 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 14:24:24.217931 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213265 2571 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 14:24:24.217931 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213268 2571 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 14:24:24.217931 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213273 2571 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 14:24:24.218523 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213276 2571 flags.go:64] FLAG: 
--log-text-info-buffer-size="0" Apr 24 14:24:24.218523 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213279 2571 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 14:24:24.218523 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213282 2571 flags.go:64] FLAG: --logging-format="text" Apr 24 14:24:24.218523 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213285 2571 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 14:24:24.218523 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213288 2571 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 14:24:24.218523 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213291 2571 flags.go:64] FLAG: --manifest-url="" Apr 24 14:24:24.218523 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213294 2571 flags.go:64] FLAG: --manifest-url-header="" Apr 24 14:24:24.218523 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213298 2571 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 14:24:24.218523 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213301 2571 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 14:24:24.218523 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213305 2571 flags.go:64] FLAG: --max-pods="110" Apr 24 14:24:24.218523 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213310 2571 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 14:24:24.218523 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213313 2571 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 14:24:24.218523 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213316 2571 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 14:24:24.218523 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213319 2571 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 14:24:24.218523 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213322 2571 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 14:24:24.218523 ip-10-0-137-95 
kubenswrapper[2571]: I0424 14:24:24.213325 2571 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 14:24:24.218523 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213328 2571 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 14:24:24.218523 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213336 2571 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 14:24:24.218523 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213339 2571 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 14:24:24.218523 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213342 2571 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 14:24:24.218523 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213345 2571 flags.go:64] FLAG: --pod-cidr="" Apr 24 14:24:24.218523 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213348 2571 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 14:24:24.218523 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213353 2571 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 14:24:24.219094 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213357 2571 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 14:24:24.219094 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213360 2571 flags.go:64] FLAG: --pods-per-core="0" Apr 24 14:24:24.219094 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213362 2571 flags.go:64] FLAG: --port="10250" Apr 24 14:24:24.219094 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213365 2571 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 14:24:24.219094 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213368 2571 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0fe44c314758898c4" Apr 24 14:24:24.219094 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213372 2571 flags.go:64] FLAG: --qos-reserved="" Apr 24 14:24:24.219094 
ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213374 2571 flags.go:64] FLAG: --read-only-port="10255" Apr 24 14:24:24.219094 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213377 2571 flags.go:64] FLAG: --register-node="true" Apr 24 14:24:24.219094 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213380 2571 flags.go:64] FLAG: --register-schedulable="true" Apr 24 14:24:24.219094 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213383 2571 flags.go:64] FLAG: --register-with-taints="" Apr 24 14:24:24.219094 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213387 2571 flags.go:64] FLAG: --registry-burst="10" Apr 24 14:24:24.219094 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213390 2571 flags.go:64] FLAG: --registry-qps="5" Apr 24 14:24:24.219094 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213393 2571 flags.go:64] FLAG: --reserved-cpus="" Apr 24 14:24:24.219094 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213395 2571 flags.go:64] FLAG: --reserved-memory="" Apr 24 14:24:24.219094 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213399 2571 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 14:24:24.219094 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213402 2571 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 14:24:24.219094 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213405 2571 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 14:24:24.219094 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213408 2571 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 14:24:24.219094 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213411 2571 flags.go:64] FLAG: --runonce="false" Apr 24 14:24:24.219094 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213416 2571 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 14:24:24.219094 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213419 2571 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 14:24:24.219094 
ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213422 2571 flags.go:64] FLAG: --seccomp-default="false" Apr 24 14:24:24.219094 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213425 2571 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 14:24:24.219094 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213427 2571 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 14:24:24.219094 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213431 2571 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 14:24:24.219094 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213435 2571 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 14:24:24.219783 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213438 2571 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 14:24:24.219783 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213441 2571 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 14:24:24.219783 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213444 2571 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 14:24:24.219783 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213447 2571 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 14:24:24.219783 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213450 2571 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 14:24:24.219783 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213453 2571 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 14:24:24.219783 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213456 2571 flags.go:64] FLAG: --system-cgroups="" Apr 24 14:24:24.219783 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213459 2571 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 14:24:24.219783 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213464 2571 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 14:24:24.219783 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213467 
2571 flags.go:64] FLAG: --tls-cert-file="" Apr 24 14:24:24.219783 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213470 2571 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 14:24:24.219783 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213477 2571 flags.go:64] FLAG: --tls-min-version="" Apr 24 14:24:24.219783 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213480 2571 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 14:24:24.219783 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213483 2571 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 14:24:24.219783 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213486 2571 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 14:24:24.219783 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213489 2571 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 14:24:24.219783 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213492 2571 flags.go:64] FLAG: --v="2" Apr 24 14:24:24.219783 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213496 2571 flags.go:64] FLAG: --version="false" Apr 24 14:24:24.219783 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213500 2571 flags.go:64] FLAG: --vmodule="" Apr 24 14:24:24.219783 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213504 2571 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 14:24:24.219783 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.213507 2571 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 14:24:24.219783 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213599 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 14:24:24.219783 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213607 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 14:24:24.219783 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213610 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 14:24:24.219783 
ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213613 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 14:24:24.220400 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213620 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 14:24:24.220400 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213623 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 14:24:24.220400 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213626 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 14:24:24.220400 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213629 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 14:24:24.220400 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213631 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 14:24:24.220400 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213634 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 14:24:24.220400 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213637 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 14:24:24.220400 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213639 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 14:24:24.220400 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213642 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 14:24:24.220400 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213645 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 14:24:24.220400 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213648 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 14:24:24.220400 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213651 2571 feature_gate.go:328] unrecognized feature gate: 
OVNObservability Apr 24 14:24:24.220400 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213653 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 14:24:24.220400 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213656 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 14:24:24.220400 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213658 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 14:24:24.220400 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213661 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 14:24:24.220400 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213664 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 14:24:24.220400 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213667 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 14:24:24.220400 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213669 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 14:24:24.220870 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213672 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 14:24:24.220870 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213674 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 14:24:24.220870 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213677 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 14:24:24.220870 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213679 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 14:24:24.220870 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213682 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 14:24:24.220870 ip-10-0-137-95 
kubenswrapper[2571]: W0424 14:24:24.213684 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 24 14:24:24.220870 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213688 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 14:24:24.220870 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213691 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 14:24:24.220870 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213694 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 14:24:24.220870 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213696 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 14:24:24.220870 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213700 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 14:24:24.220870 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213703 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 14:24:24.220870 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213707 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 14:24:24.220870 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213711 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 14:24:24.220870 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213714 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 14:24:24.220870 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213717 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 14:24:24.220870 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213720 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 14:24:24.220870 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213723 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 14:24:24.220870 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213733 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 14:24:24.221368 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213736 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 14:24:24.221368 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213739 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 14:24:24.221368 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213741 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 14:24:24.221368 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213744 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 14:24:24.221368 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213748 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 14:24:24.221368 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213751 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 14:24:24.221368 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213753 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 14:24:24.221368 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213756 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 14:24:24.221368 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213759 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 14:24:24.221368 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213761 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 14:24:24.221368 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213764 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 14:24:24.221368 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213766 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 14:24:24.221368 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213769 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 14:24:24.221368 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213771 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 14:24:24.221368 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213774 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 14:24:24.221368 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213776 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 14:24:24.221368 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213779 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 14:24:24.221368 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213781 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 14:24:24.221368 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213784 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 14:24:24.221368 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213787 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 14:24:24.221868 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213789 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 14:24:24.221868 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213792 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 14:24:24.221868 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213794 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 14:24:24.221868 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213798 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 14:24:24.221868 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213801 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 14:24:24.221868 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213803 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 14:24:24.221868 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213807 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 14:24:24.221868 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213809 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 14:24:24.221868 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213812 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 14:24:24.221868 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213815 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 14:24:24.221868 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213817 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 14:24:24.221868 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213820 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 14:24:24.221868 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213822 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 14:24:24.221868 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213825 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 14:24:24.221868 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213827 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 14:24:24.221868 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213830 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 14:24:24.221868 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213833 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 14:24:24.221868 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213836 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 14:24:24.221868 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213838 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 14:24:24.221868 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213841 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 14:24:24.222376 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213843 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 14:24:24.222376 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213846 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 14:24:24.222376 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213849 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 14:24:24.222376 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.213851 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 14:24:24.222376 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.214536 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 14:24:24.222376 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.220964 2571 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 14:24:24.222376 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.220997 2571 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 14:24:24.222376 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221045 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 14:24:24.222376 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221050 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 14:24:24.222376 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221053 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 14:24:24.222376 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221056 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 14:24:24.222376 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221060 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 14:24:24.222376 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221063 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 14:24:24.222376 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221065 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 14:24:24.222376 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221068 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 14:24:24.222376 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221071 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 14:24:24.222805 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221074 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 14:24:24.222805 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221078 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 14:24:24.222805 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221081 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 14:24:24.222805 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221084 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 14:24:24.222805 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221087 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 14:24:24.222805 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221089 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 14:24:24.222805 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221092 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 14:24:24.222805 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221094 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 14:24:24.222805 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221098 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 14:24:24.222805 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221100 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 14:24:24.222805 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221103 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 14:24:24.222805 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221105 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 14:24:24.222805 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221108 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 14:24:24.222805 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221112 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 14:24:24.222805 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221116 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 14:24:24.222805 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221119 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 14:24:24.222805 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221122 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 14:24:24.222805 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221125 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 14:24:24.222805 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221128 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 14:24:24.223365 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221130 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 14:24:24.223365 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221132 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 14:24:24.223365 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221135 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 14:24:24.223365 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221138 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 14:24:24.223365 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221141 2571
feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 14:24:24.223365 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221143 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 14:24:24.223365 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221146 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 14:24:24.223365 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221149 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 14:24:24.223365 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221151 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 14:24:24.223365 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221154 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 14:24:24.223365 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221157 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 14:24:24.223365 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221159 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 14:24:24.223365 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221162 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 14:24:24.223365 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221164 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 14:24:24.223365 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221167 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 14:24:24.223365 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221170 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 14:24:24.223365 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221172 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 14:24:24.223365 
ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221174 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 14:24:24.223365 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221177 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 14:24:24.223843 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221179 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 14:24:24.223843 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221182 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 14:24:24.223843 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221184 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 14:24:24.223843 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221187 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 14:24:24.223843 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221190 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 14:24:24.223843 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221193 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 14:24:24.223843 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221195 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 14:24:24.223843 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221197 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 14:24:24.223843 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221200 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 14:24:24.223843 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221202 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 14:24:24.223843 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221205 2571 feature_gate.go:328] unrecognized feature gate: 
MixedCPUsAllocation Apr 24 14:24:24.223843 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221208 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 14:24:24.223843 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221210 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 14:24:24.223843 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221213 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 14:24:24.223843 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221216 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 14:24:24.223843 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221220 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 14:24:24.223843 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221223 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 14:24:24.223843 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221226 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 14:24:24.223843 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221228 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 14:24:24.224332 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221231 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 14:24:24.224332 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221233 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 14:24:24.224332 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221236 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 14:24:24.224332 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221238 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 14:24:24.224332 
ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221241 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 14:24:24.224332 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221243 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 14:24:24.224332 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221245 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 14:24:24.224332 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221248 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 14:24:24.224332 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221251 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 14:24:24.224332 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221253 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 14:24:24.224332 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221256 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 14:24:24.224332 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221259 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 14:24:24.224332 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221261 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 14:24:24.224332 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221264 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 14:24:24.224332 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221266 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 14:24:24.224332 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221269 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 14:24:24.224332 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221271 2571 feature_gate.go:328] unrecognized feature gate: 
NutanixMultiSubnets Apr 24 14:24:24.224332 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221274 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 14:24:24.224332 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221276 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 24 14:24:24.224332 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221279 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 14:24:24.224820 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.221283 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 14:24:24.224820 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221389 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 14:24:24.224820 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221393 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 14:24:24.224820 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221396 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 14:24:24.224820 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221400 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 14:24:24.224820 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221403 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 14:24:24.224820 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221405 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 
14:24:24.224820 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221408 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 14:24:24.224820 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221411 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 14:24:24.224820 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221413 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 14:24:24.224820 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221416 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 14:24:24.224820 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221419 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 14:24:24.224820 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221421 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 14:24:24.224820 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221424 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 14:24:24.224820 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221426 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 14:24:24.224820 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221429 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 14:24:24.225235 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221431 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 14:24:24.225235 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221434 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 14:24:24.225235 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221436 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 14:24:24.225235 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221439 2571 feature_gate.go:328] unrecognized feature 
gate: AzureWorkloadIdentity Apr 24 14:24:24.225235 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221442 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 14:24:24.225235 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221445 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 14:24:24.225235 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221448 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 14:24:24.225235 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221450 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 14:24:24.225235 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221453 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 14:24:24.225235 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221455 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 14:24:24.225235 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221457 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 14:24:24.225235 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221460 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 14:24:24.225235 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221462 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 14:24:24.225235 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221465 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 14:24:24.225235 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221468 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 14:24:24.225235 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221471 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 14:24:24.225235 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221473 2571 
feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 14:24:24.225235 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221476 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 14:24:24.225235 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221479 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 14:24:24.225235 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221481 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 14:24:24.225740 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221484 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 14:24:24.225740 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221486 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 14:24:24.225740 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221490 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 14:24:24.225740 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221494 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 14:24:24.225740 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221498 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 14:24:24.225740 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221500 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 14:24:24.225740 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221503 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 14:24:24.225740 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221505 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 14:24:24.225740 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221508 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 14:24:24.225740 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221510 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 14:24:24.225740 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221513 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 14:24:24.225740 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221515 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 14:24:24.225740 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221518 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 14:24:24.225740 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221520 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 14:24:24.225740 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221523 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 14:24:24.225740 ip-10-0-137-95 
kubenswrapper[2571]: W0424 14:24:24.221526 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 14:24:24.225740 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221528 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 14:24:24.225740 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221531 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 14:24:24.225740 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221534 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 14:24:24.226211 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221536 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 14:24:24.226211 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221540 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 14:24:24.226211 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221544 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 14:24:24.226211 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221547 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 14:24:24.226211 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221550 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 14:24:24.226211 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221553 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 14:24:24.226211 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221555 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 14:24:24.226211 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221558 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 14:24:24.226211 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221560 2571 feature_gate.go:328] unrecognized feature gate: 
ClusterVersionOperatorConfiguration Apr 24 14:24:24.226211 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221563 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 14:24:24.226211 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221566 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 14:24:24.226211 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221568 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 14:24:24.226211 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221570 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 14:24:24.226211 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221573 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 14:24:24.226211 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221576 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 14:24:24.226211 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221578 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 14:24:24.226211 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221581 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 14:24:24.226211 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221583 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 14:24:24.226211 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221586 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 14:24:24.226671 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221588 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 14:24:24.226671 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221591 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 14:24:24.226671 ip-10-0-137-95 kubenswrapper[2571]: W0424 
14:24:24.221593 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 14:24:24.226671 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221596 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 14:24:24.226671 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221598 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 14:24:24.226671 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221601 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 14:24:24.226671 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221603 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 14:24:24.226671 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221606 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 24 14:24:24.226671 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221608 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 14:24:24.226671 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221611 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 14:24:24.226671 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221613 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 14:24:24.226671 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221615 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 14:24:24.226671 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:24.221618 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 14:24:24.226671 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.221623 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true 
SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 14:24:24.226671 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.222638 2571 server.go:962] "Client rotation is on, will bootstrap in background" Apr 24 14:24:24.227064 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.226836 2571 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 24 14:24:24.227864 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.227852 2571 server.go:1019] "Starting client certificate rotation" Apr 24 14:24:24.227967 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.227949 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 14:24:24.228013 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.228001 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 14:24:24.254125 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.254105 2571 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 14:24:24.260574 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.260553 2571 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 14:24:24.274836 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.274820 2571 log.go:25] "Validated CRI v1 runtime API" Apr 24 14:24:24.281034 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.281016 2571 log.go:25] "Validated CRI v1 image API" Apr 24 14:24:24.282249 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.282236 2571 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 24 14:24:24.288405 
ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.288380 2571 fs.go:135] Filesystem UUIDs: map[447807b0-422a-4e78-bf02-43ed033a265a:/dev/nvme0n1p3 5369b697-0594-4db7-9767-842a79d3b744:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2] Apr 24 14:24:24.288489 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.288403 2571 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 24 14:24:24.293645 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.293628 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 14:24:24.294338 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.294233 2571 manager.go:217] Machine: {Timestamp:2026-04-24 14:24:24.292069786 +0000 UTC m=+0.422047336 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3189697 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec219254c88426d55b003c99d8254fb9 SystemUUID:ec219254-c884-26d5-5b00-3c99d8254fb9 BootID:6d6b9cb6-561d-472a-b2ee-3958868f8046 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 
HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:29:ec:75:b5:3b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:29:ec:75:b5:3b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:02:bf:41:68:36:46 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 24 14:24:24.294338 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.294335 2571 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 24 14:24:24.294430 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.294414 2571 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 24 14:24:24.294776 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.294751 2571 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 24 14:24:24.294908 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.294778 2571 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-95.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 14:24:24.294949 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.294916 2571 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 14:24:24.294949 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.294925 2571 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 14:24:24.294949 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.294938 2571 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 14:24:24.295711 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.295701 2571 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 14:24:24.297065 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.297055 2571 state_mem.go:36] "Initialized new in-memory state store" Apr 24 14:24:24.297168 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.297160 2571 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 14:24:24.300597 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.300586 2571 kubelet.go:491] "Attempting to sync node with API server" Apr 24 14:24:24.300646 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.300606 2571 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 14:24:24.300646 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.300623 2571 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 14:24:24.300646 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.300634 2571 kubelet.go:397] "Adding apiserver pod source" Apr 24 14:24:24.300646 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.300644 2571 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 24 14:24:24.301604 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.301593 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 14:24:24.301654 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.301610 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 14:24:24.304785 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.304770 2571 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 14:24:24.306715 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.306702 2571 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 14:24:24.308028 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.308015 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 14:24:24.308064 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.308037 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 14:24:24.308064 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.308048 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 14:24:24.308064 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.308057 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 14:24:24.308152 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.308067 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 14:24:24.308152 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.308076 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 14:24:24.308152 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.308085 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 
14:24:24.308152 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.308094 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 14:24:24.308152 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.308103 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 14:24:24.308152 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.308115 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 14:24:24.308152 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.308137 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 14:24:24.308330 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.308158 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 14:24:24.309091 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.309078 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 14:24:24.309129 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.309093 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 14:24:24.312907 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.312885 2571 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 14:24:24.313033 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.312934 2571 server.go:1295] "Started kubelet" Apr 24 14:24:24.313855 ip-10-0-137-95 systemd[1]: Started Kubernetes Kubelet. 
Apr 24 14:24:24.314509 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.314322 2571 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 14:24:24.314509 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.314428 2571 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 14:24:24.314611 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.314530 2571 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 14:24:24.315310 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.315183 2571 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-137-95.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 14:24:24.315310 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:24.315298 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-137-95.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 14:24:24.315415 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:24.315396 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 14:24:24.316502 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.316365 2571 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 14:24:24.317837 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.317822 2571 server.go:317] "Adding debug handlers to kubelet server" Apr 24 14:24:24.324412 ip-10-0-137-95 
kubenswrapper[2571]: I0424 14:24:24.324392 2571 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 14:24:24.324412 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.324395 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 14:24:24.325183 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:24.324032 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-95.ec2.internal.18a95111abcb8580 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-95.ec2.internal,UID:ip-10-0-137-95.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-137-95.ec2.internal,},FirstTimestamp:2026-04-24 14:24:24.312907136 +0000 UTC m=+0.442884689,LastTimestamp:2026-04-24 14:24:24.312907136 +0000 UTC m=+0.442884689,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-95.ec2.internal,}" Apr 24 14:24:24.325266 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.325255 2571 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 14:24:24.325323 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.325270 2571 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 14:24:24.325323 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:24.325272 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-95.ec2.internal\" not found" Apr 24 14:24:24.325323 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.325302 2571 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 14:24:24.325457 ip-10-0-137-95 
kubenswrapper[2571]: I0424 14:24:24.325356 2571 reconstruct.go:97] "Volume reconstruction finished" Apr 24 14:24:24.325457 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.325365 2571 reconciler.go:26] "Reconciler: start to sync state" Apr 24 14:24:24.325627 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.325613 2571 factory.go:153] Registering CRI-O factory Apr 24 14:24:24.325665 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.325633 2571 factory.go:223] Registration of the crio container factory successfully Apr 24 14:24:24.325699 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.325677 2571 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 14:24:24.325699 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.325687 2571 factory.go:55] Registering systemd factory Apr 24 14:24:24.325699 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.325695 2571 factory.go:223] Registration of the systemd container factory successfully Apr 24 14:24:24.325813 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.325715 2571 factory.go:103] Registering Raw factory Apr 24 14:24:24.325813 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.325740 2571 manager.go:1196] Started watching for new ooms in manager Apr 24 14:24:24.326086 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.326071 2571 manager.go:319] Starting recovery of all containers Apr 24 14:24:24.326953 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:24.326918 2571 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 14:24:24.334312 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:24.334286 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 14:24:24.334396 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:24.334307 2571 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-137-95.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 24 14:24:24.336059 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.336045 2571 manager.go:324] Recovery completed Apr 24 14:24:24.340218 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.340170 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 14:24:24.343418 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.343403 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-95.ec2.internal" event="NodeHasSufficientMemory" Apr 24 14:24:24.343487 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.343431 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-95.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:24:24.343487 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.343444 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-95.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:24:24.343926 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.343913 2571 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 
24 14:24:24.343926 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.343924 2571 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 14:24:24.344026 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.343939 2571 state_mem.go:36] "Initialized new in-memory state store" Apr 24 14:24:24.344892 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:24.344825 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-95.ec2.internal.18a95111ad9d102d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-95.ec2.internal,UID:ip-10-0-137-95.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-137-95.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-137-95.ec2.internal,},FirstTimestamp:2026-04-24 14:24:24.343416877 +0000 UTC m=+0.473394431,LastTimestamp:2026-04-24 14:24:24.343416877 +0000 UTC m=+0.473394431,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-95.ec2.internal,}" Apr 24 14:24:24.347543 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.347529 2571 policy_none.go:49] "None policy: Start" Apr 24 14:24:24.347609 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.347547 2571 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 14:24:24.347609 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.347570 2571 state_mem.go:35] "Initializing new in-memory state store" Apr 24 14:24:24.350251 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.350233 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-28s7c" Apr 24 
14:24:24.355393 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:24.355329 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-95.ec2.internal.18a95111ad9d64f4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-95.ec2.internal,UID:ip-10-0-137-95.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-137-95.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-137-95.ec2.internal,},FirstTimestamp:2026-04-24 14:24:24.34343858 +0000 UTC m=+0.473416135,LastTimestamp:2026-04-24 14:24:24.34343858 +0000 UTC m=+0.473416135,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-95.ec2.internal,}" Apr 24 14:24:24.357149 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.357132 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-28s7c" Apr 24 14:24:24.383520 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.383505 2571 manager.go:341] "Starting Device Plugin manager" Apr 24 14:24:24.406396 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:24.383540 2571 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 14:24:24.406396 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.383550 2571 server.go:85] "Starting device plugin registration server" Apr 24 14:24:24.406396 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.383832 2571 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 14:24:24.406396 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.383841 2571 
container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 14:24:24.406396 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.383940 2571 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 14:24:24.406396 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.384040 2571 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 14:24:24.406396 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.384050 2571 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 14:24:24.406396 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:24.384431 2571 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 14:24:24.406396 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:24.384464 2571 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-95.ec2.internal\" not found" Apr 24 14:24:24.454548 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.454509 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 14:24:24.455672 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.455655 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 14:24:24.455754 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.455684 2571 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 14:24:24.455754 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.455706 2571 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 24 14:24:24.455754 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.455716 2571 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 14:24:24.455889 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:24.455754 2571 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 14:24:24.459603 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.459529 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 14:24:24.484809 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.484793 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 14:24:24.485603 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.485590 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-95.ec2.internal" event="NodeHasSufficientMemory" Apr 24 14:24:24.485658 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.485622 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-95.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:24:24.485658 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.485633 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-95.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:24:24.485658 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.485654 2571 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-95.ec2.internal" Apr 24 14:24:24.496818 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.496800 2571 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-95.ec2.internal" Apr 24 14:24:24.496871 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:24.496820 2571 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-95.ec2.internal\": node \"ip-10-0-137-95.ec2.internal\" not found" Apr 24 14:24:24.543176 
ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:24.543148 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-95.ec2.internal\" not found" Apr 24 14:24:24.556369 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.556344 2571 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-137-95.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-95.ec2.internal"] Apr 24 14:24:24.556476 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.556416 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 14:24:24.557283 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.557267 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-95.ec2.internal" event="NodeHasSufficientMemory" Apr 24 14:24:24.557380 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.557299 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-95.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:24:24.557380 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.557315 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-95.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:24:24.559562 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.559546 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 14:24:24.559700 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.559686 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-95.ec2.internal" Apr 24 14:24:24.559751 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.559717 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 14:24:24.560192 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.560176 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-95.ec2.internal" event="NodeHasSufficientMemory" Apr 24 14:24:24.560263 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.560192 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-95.ec2.internal" event="NodeHasSufficientMemory" Apr 24 14:24:24.560263 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.560203 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-95.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:24:24.560263 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.560214 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-95.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:24:24.560263 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.560229 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-95.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:24:24.560405 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.560217 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-95.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:24:24.562466 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.562452 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-95.ec2.internal" Apr 24 14:24:24.562514 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.562476 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 14:24:24.563092 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.563076 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-95.ec2.internal" event="NodeHasSufficientMemory" Apr 24 14:24:24.563176 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.563100 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-95.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:24:24.563176 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.563109 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-95.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:24:24.585538 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:24.585518 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-95.ec2.internal\" not found" node="ip-10-0-137-95.ec2.internal" Apr 24 14:24:24.589765 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:24.589748 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-95.ec2.internal\" not found" node="ip-10-0-137-95.ec2.internal" Apr 24 14:24:24.626780 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.626762 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c07dfa5590a30bfb9dbec92d1fcd686b-config\") pod \"kube-apiserver-proxy-ip-10-0-137-95.ec2.internal\" (UID: \"c07dfa5590a30bfb9dbec92d1fcd686b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-95.ec2.internal" Apr 24 14:24:24.626870 ip-10-0-137-95 kubenswrapper[2571]: I0424 
14:24:24.626785 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/46b053d51bc500e88f49a6371b8d013f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-95.ec2.internal\" (UID: \"46b053d51bc500e88f49a6371b8d013f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-95.ec2.internal"
Apr 24 14:24:24.626870 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.626811 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/46b053d51bc500e88f49a6371b8d013f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-95.ec2.internal\" (UID: \"46b053d51bc500e88f49a6371b8d013f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-95.ec2.internal"
Apr 24 14:24:24.644227 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:24.644208 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-95.ec2.internal\" not found"
Apr 24 14:24:24.727595 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.727531 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c07dfa5590a30bfb9dbec92d1fcd686b-config\") pod \"kube-apiserver-proxy-ip-10-0-137-95.ec2.internal\" (UID: \"c07dfa5590a30bfb9dbec92d1fcd686b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-95.ec2.internal"
Apr 24 14:24:24.727595 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.727561 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/46b053d51bc500e88f49a6371b8d013f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-95.ec2.internal\" (UID: \"46b053d51bc500e88f49a6371b8d013f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-95.ec2.internal"
Apr 24 14:24:24.727595 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.727580 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/46b053d51bc500e88f49a6371b8d013f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-95.ec2.internal\" (UID: \"46b053d51bc500e88f49a6371b8d013f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-95.ec2.internal"
Apr 24 14:24:24.727750 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.727632 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/46b053d51bc500e88f49a6371b8d013f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-95.ec2.internal\" (UID: \"46b053d51bc500e88f49a6371b8d013f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-95.ec2.internal"
Apr 24 14:24:24.727750 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.727639 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/46b053d51bc500e88f49a6371b8d013f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-95.ec2.internal\" (UID: \"46b053d51bc500e88f49a6371b8d013f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-95.ec2.internal"
Apr 24 14:24:24.727750 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.727684 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c07dfa5590a30bfb9dbec92d1fcd686b-config\") pod \"kube-apiserver-proxy-ip-10-0-137-95.ec2.internal\" (UID: \"c07dfa5590a30bfb9dbec92d1fcd686b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-95.ec2.internal"
Apr 24 14:24:24.744627 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:24.744611 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-95.ec2.internal\" not found"
Apr 24 14:24:24.845464 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:24.845414 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-95.ec2.internal\" not found"
Apr 24 14:24:24.887627 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.887596 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-95.ec2.internal"
Apr 24 14:24:24.892167 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:24.892140 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-95.ec2.internal"
Apr 24 14:24:24.946102 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:24.946069 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-95.ec2.internal\" not found"
Apr 24 14:24:25.046660 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:25.046597 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-95.ec2.internal\" not found"
Apr 24 14:24:25.147170 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:25.147144 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-95.ec2.internal\" not found"
Apr 24 14:24:25.227627 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:25.227597 2571 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 14:24:25.228112 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:25.227768 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 14:24:25.247877 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:25.247849 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-95.ec2.internal\" not found"
Apr 24 14:24:25.324785 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:25.324757 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 14:24:25.333844 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:25.333823 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 14:24:25.348217 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:25.348193 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-95.ec2.internal\" not found"
Apr 24 14:24:25.354013 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:25.353964 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-wcnpw"
Apr 24 14:24:25.359047 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:25.359021 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 14:19:24 +0000 UTC" deadline="2028-01-24 05:10:27.943677136 +0000 UTC"
Apr 24 14:24:25.359047 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:25.359044 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15350h46m2.584635667s"
Apr 24 14:24:25.362466 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:25.362452 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-wcnpw"
Apr 24 14:24:25.430493 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:25.430463 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc07dfa5590a30bfb9dbec92d1fcd686b.slice/crio-9230ebe39be8ead311dc49964222694b2eccce21f5f70706c8976497d385ec39 WatchSource:0}: Error finding container 9230ebe39be8ead311dc49964222694b2eccce21f5f70706c8976497d385ec39: Status 404 returned error can't find the container with id 9230ebe39be8ead311dc49964222694b2eccce21f5f70706c8976497d385ec39
Apr 24 14:24:25.431140 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:25.431111 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46b053d51bc500e88f49a6371b8d013f.slice/crio-85cc6eb75ffa33fa737eaa97a6c45abc17952c0db1470d1872455c93d7f4b5a9 WatchSource:0}: Error finding container 85cc6eb75ffa33fa737eaa97a6c45abc17952c0db1470d1872455c93d7f4b5a9: Status 404 returned error can't find the container with id 85cc6eb75ffa33fa737eaa97a6c45abc17952c0db1470d1872455c93d7f4b5a9
Apr 24 14:24:25.435054 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:25.435039 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 14:24:25.448916 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:25.448895 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-95.ec2.internal\" not found"
Apr 24 14:24:25.458697 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:25.458659 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-95.ec2.internal" event={"ID":"46b053d51bc500e88f49a6371b8d013f","Type":"ContainerStarted","Data":"85cc6eb75ffa33fa737eaa97a6c45abc17952c0db1470d1872455c93d7f4b5a9"}
Apr 24 14:24:25.459545 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:25.459527 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-95.ec2.internal" event={"ID":"c07dfa5590a30bfb9dbec92d1fcd686b","Type":"ContainerStarted","Data":"9230ebe39be8ead311dc49964222694b2eccce21f5f70706c8976497d385ec39"}
Apr 24 14:24:25.549170 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:25.549144 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-95.ec2.internal\" not found"
Apr 24 14:24:25.649675 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:25.649648 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-95.ec2.internal\" not found"
Apr 24 14:24:25.682884 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:25.682861 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 14:24:25.701057 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:25.701032 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 14:24:25.703291 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:25.702114 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 14:24:25.724608 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:25.724592 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-95.ec2.internal"
Apr 24 14:24:25.735715 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:25.735699 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 14:24:25.736630 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:25.736619 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-95.ec2.internal"
Apr 24 14:24:25.749622 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:25.749603 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 14:24:26.302243 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.302206 2571 apiserver.go:52] "Watching apiserver"
Apr 24 14:24:26.307395 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.307372 2571 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 24 14:24:26.309538 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.309512 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-137-95.ec2.internal","openshift-cluster-node-tuning-operator/tuned-fhn6c","openshift-image-registry/node-ca-hgntn","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-95.ec2.internal","openshift-multus/multus-additional-cni-plugins-42l4z","openshift-multus/multus-gwdlv","openshift-network-diagnostics/network-check-target-dgzq2","kube-system/konnectivity-agent-v9wvt","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4bhjw","openshift-dns/node-resolver-bmpbb","openshift-multus/network-metrics-daemon-fsnj5","openshift-network-operator/iptables-alerter-j9xv9","openshift-ovn-kubernetes/ovnkube-node-7m9zt"]
Apr 24 14:24:26.312680 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.312656 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fsnj5"
Apr 24 14:24:26.312772 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:26.312746 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fsnj5" podUID="a022a0ca-5e80-43a6-8ee0-69dcf197d1a8"
Apr 24 14:24:26.314719 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.314699 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-fhn6c"
Apr 24 14:24:26.316569 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.316547 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 24 14:24:26.316685 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.316580 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 24 14:24:26.316685 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.316553 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-6lvp6\""
Apr 24 14:24:26.316936 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.316915 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-hgntn"
Apr 24 14:24:26.318578 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.318558 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 24 14:24:26.318735 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.318717 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 24 14:24:26.318831 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.318721 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-6brmq\""
Apr 24 14:24:26.319115 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.319096 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 24 14:24:26.319485 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.319462 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt"
Apr 24 14:24:26.321068 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.321051 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 24 14:24:26.321375 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.321356 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 24 14:24:26.321442 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.321386 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 24 14:24:26.321442 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.321356 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-l8zq2\""
Apr 24 14:24:26.321551 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.321394 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 24 14:24:26.321551 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.321484 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 24 14:24:26.321807 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.321789 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-42l4z"
Apr 24 14:24:26.322160 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.321940 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 24 14:24:26.323703 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.323686 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 24 14:24:26.324006 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.323965 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 24 14:24:26.324589 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.324007 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 24 14:24:26.324589 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.324051 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 24 14:24:26.324589 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.324234 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-gwdlv"
Apr 24 14:24:26.324589 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.324056 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-9fsgn\""
Apr 24 14:24:26.324589 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.324061 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 24 14:24:26.326440 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.326252 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 24 14:24:26.326440 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.326381 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-hzwhh\""
Apr 24 14:24:26.326765 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.326743 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dgzq2"
Apr 24 14:24:26.326854 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:26.326834 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dgzq2" podUID="4ebd1686-821a-4fb0-b091-8a636b80f78e"
Apr 24 14:24:26.330029 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.330010 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-v9wvt"
Apr 24 14:24:26.331998 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.331971 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 24 14:24:26.332092 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.332031 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 24 14:24:26.332433 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.332414 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4bhjw"
Apr 24 14:24:26.332956 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.332940 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-pkpvs\""
Apr 24 14:24:26.334215 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.334193 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 24 14:24:26.334364 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.334204 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-systemd-units\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt"
Apr 24 14:24:26.334364 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.334226 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 24 14:24:26.334364 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.334261 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6c91c545-ee17-43ad-8a08-42be9b2cda48-tuning-conf-dir\") pod \"multus-additional-cni-plugins-42l4z\" (UID: \"6c91c545-ee17-43ad-8a08-42be9b2cda48\") " pod="openshift-multus/multus-additional-cni-plugins-42l4z"
Apr 24 14:24:26.334364 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.334294 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9ce961e4-1141-4ebb-83ff-6ef501b710dd-etc-sysconfig\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c"
Apr 24 14:24:26.334364 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.334330 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 24 14:24:26.334364 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.334334 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9ce961e4-1141-4ebb-83ff-6ef501b710dd-etc-sysctl-d\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c"
Apr 24 14:24:26.334720 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.334376 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/87bcc27f-4b5f-48a3-9ae3-f93cb520eea0-serviceca\") pod \"node-ca-hgntn\" (UID: \"87bcc27f-4b5f-48a3-9ae3-f93cb520eea0\") " pod="openshift-image-registry/node-ca-hgntn"
Apr 24 14:24:26.334720 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.334407 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-host-kubelet\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt"
Apr 24 14:24:26.334720 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.334432 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a7b9926d-4f53-4532-8669-16af4fc30cfd-ovnkube-config\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt"
Apr 24 14:24:26.334720 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.334455 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a7b9926d-4f53-4532-8669-16af4fc30cfd-ovnkube-script-lib\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt"
Apr 24 14:24:26.334720 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.334481 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf78d\" (UniqueName: \"kubernetes.io/projected/4ebd1686-821a-4fb0-b091-8a636b80f78e-kube-api-access-mf78d\") pod \"network-check-target-dgzq2\" (UID: \"4ebd1686-821a-4fb0-b091-8a636b80f78e\") " pod="openshift-network-diagnostics/network-check-target-dgzq2"
Apr 24 14:24:26.334720 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.334505 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9ce961e4-1141-4ebb-83ff-6ef501b710dd-etc-sysctl-conf\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c"
Apr 24 14:24:26.334720 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.334527 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9ce961e4-1141-4ebb-83ff-6ef501b710dd-etc-systemd\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c"
Apr 24 14:24:26.334720 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.334528 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-rxp5q\""
Apr 24 14:24:26.334720 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.334566 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9ce961e4-1141-4ebb-83ff-6ef501b710dd-tmp\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c"
Apr 24 14:24:26.334720 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.334598 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9ce961e4-1141-4ebb-83ff-6ef501b710dd-etc-modprobe-d\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c"
Apr 24 14:24:26.334720 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.334621 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/87bcc27f-4b5f-48a3-9ae3-f93cb520eea0-host\") pod \"node-ca-hgntn\" (UID: \"87bcc27f-4b5f-48a3-9ae3-f93cb520eea0\") " pod="openshift-image-registry/node-ca-hgntn"
Apr 24 14:24:26.334720 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.334641 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-host-slash\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt"
Apr 24 14:24:26.334720 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.334671 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-run-systemd\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt"
Apr 24 14:24:26.334720 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.334693 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-run-ovn\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt"
Apr 24 14:24:26.334720 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.334715 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a7b9926d-4f53-4532-8669-16af4fc30cfd-env-overrides\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt"
Apr 24 14:24:26.335379 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.334739 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q97xb\" (UniqueName: \"kubernetes.io/projected/a7b9926d-4f53-4532-8669-16af4fc30cfd-kube-api-access-q97xb\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt"
Apr 24 14:24:26.335379 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.334766 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k24kx\" (UniqueName: \"kubernetes.io/projected/6c91c545-ee17-43ad-8a08-42be9b2cda48-kube-api-access-k24kx\") pod \"multus-additional-cni-plugins-42l4z\" (UID: \"6c91c545-ee17-43ad-8a08-42be9b2cda48\") " pod="openshift-multus/multus-additional-cni-plugins-42l4z"
Apr 24 14:24:26.335379 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.334790 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9ce961e4-1141-4ebb-83ff-6ef501b710dd-sys\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c"
Apr 24 14:24:26.335379 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.334813 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-j9xv9"
Apr 24 14:24:26.335379 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.334813 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9ce961e4-1141-4ebb-83ff-6ef501b710dd-etc-tuned\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c"
Apr 24 14:24:26.335379 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.335146 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-host-run-multus-certs\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv"
Apr 24 14:24:26.335379 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.335173 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-var-lib-openvswitch\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt"
Apr 24 14:24:26.335379 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.335216 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a7b9926d-4f53-4532-8669-16af4fc30cfd-ovn-node-metrics-cert\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt"
Apr 24 14:24:26.335379 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.335240 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-system-cni-dir\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv"
Apr 24 14:24:26.335379 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.335262 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9ce961e4-1141-4ebb-83ff-6ef501b710dd-host\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c"
Apr 24 14:24:26.335379 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.335286 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w7wq\" (UniqueName: \"kubernetes.io/projected/87bcc27f-4b5f-48a3-9ae3-f93cb520eea0-kube-api-access-8w7wq\") pod \"node-ca-hgntn\" (UID: \"87bcc27f-4b5f-48a3-9ae3-f93cb520eea0\") " pod="openshift-image-registry/node-ca-hgntn"
Apr 24 14:24:26.335379 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.335311 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-run-openvswitch\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt"
Apr 24 14:24:26.335379 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.335335 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6c91c545-ee17-43ad-8a08-42be9b2cda48-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-42l4z\" (UID: \"6c91c545-ee17-43ad-8a08-42be9b2cda48\") " pod="openshift-multus/multus-additional-cni-plugins-42l4z"
Apr 24 14:24:26.335379 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.335379 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-cni-binary-copy\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv"
Apr 24 14:24:26.335948 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.335408 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-log-socket\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt"
Apr 24 14:24:26.335948 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.335456 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-multus-socket-dir-parent\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv"
Apr 24 14:24:26.335948 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.335482 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrh5q\" (UniqueName: \"kubernetes.io/projected/9ce961e4-1141-4ebb-83ff-6ef501b710dd-kube-api-access-hrh5q\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c"
Apr 24 14:24:26.335948 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.335506 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-host-run-netns\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt"
Apr 24 14:24:26.335948 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.335530 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-cnibin\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv"
Apr 24 14:24:26.335948 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.335551 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-os-release\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv"
Apr 24 14:24:26.335948 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.335575 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-multus-conf-dir\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv"
Apr 24 14:24:26.335948 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.335597 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-etc-openvswitch\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt"
Apr 24 14:24:26.335948 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.335620 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6c91c545-ee17-43ad-8a08-42be9b2cda48-cnibin\") pod \"multus-additional-cni-plugins-42l4z\" (UID: \"6c91c545-ee17-43ad-8a08-42be9b2cda48\") " pod="openshift-multus/multus-additional-cni-plugins-42l4z"
Apr 24 14:24:26.335948 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.335637 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6c91c545-ee17-43ad-8a08-42be9b2cda48-os-release\") pod \"multus-additional-cni-plugins-42l4z\" (UID: \"6c91c545-ee17-43ad-8a08-42be9b2cda48\") " pod="openshift-multus/multus-additional-cni-plugins-42l4z"
Apr 24 14:24:26.335948 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.335660 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-hostroot\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv"
Apr 24 14:24:26.335948 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.335690 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName:
\"kubernetes.io/host-path/9ce961e4-1141-4ebb-83ff-6ef501b710dd-etc-kubernetes\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c" Apr 24 14:24:26.335948 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.335731 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a022a0ca-5e80-43a6-8ee0-69dcf197d1a8-metrics-certs\") pod \"network-metrics-daemon-fsnj5\" (UID: \"a022a0ca-5e80-43a6-8ee0-69dcf197d1a8\") " pod="openshift-multus/network-metrics-daemon-fsnj5" Apr 24 14:24:26.335948 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.335760 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-host-cni-bin\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.335948 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.335794 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-host-cni-netd\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.335948 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.335858 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6c91c545-ee17-43ad-8a08-42be9b2cda48-cni-binary-copy\") pod \"multus-additional-cni-plugins-42l4z\" (UID: \"6c91c545-ee17-43ad-8a08-42be9b2cda48\") " pod="openshift-multus/multus-additional-cni-plugins-42l4z" Apr 24 14:24:26.335948 ip-10-0-137-95 kubenswrapper[2571]: I0424 
14:24:26.335890 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-host-run-k8s-cni-cncf-io\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.336736 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.335918 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-host-run-netns\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.336736 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.335943 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-host-var-lib-kubelet\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.336736 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.336022 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-multus-daemon-config\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.336736 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.336054 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-etc-kubernetes\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 
14:24:26.336736 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.336099 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6c91c545-ee17-43ad-8a08-42be9b2cda48-system-cni-dir\") pod \"multus-additional-cni-plugins-42l4z\" (UID: \"6c91c545-ee17-43ad-8a08-42be9b2cda48\") " pod="openshift-multus/multus-additional-cni-plugins-42l4z" Apr 24 14:24:26.336736 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.336130 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-node-log\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.336736 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.336157 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6c91c545-ee17-43ad-8a08-42be9b2cda48-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-42l4z\" (UID: \"6c91c545-ee17-43ad-8a08-42be9b2cda48\") " pod="openshift-multus/multus-additional-cni-plugins-42l4z" Apr 24 14:24:26.336736 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.336183 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-host-var-lib-cni-multus\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.336736 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.336204 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/9ce961e4-1141-4ebb-83ff-6ef501b710dd-lib-modules\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c" Apr 24 14:24:26.336736 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.336244 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.336736 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.336318 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-host-var-lib-cni-bin\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.336736 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.336354 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9ce961e4-1141-4ebb-83ff-6ef501b710dd-run\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c" Apr 24 14:24:26.336736 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.336384 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ce961e4-1141-4ebb-83ff-6ef501b710dd-var-lib-kubelet\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c" Apr 24 14:24:26.336736 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.336416 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nsqj\" (UniqueName: \"kubernetes.io/projected/a022a0ca-5e80-43a6-8ee0-69dcf197d1a8-kube-api-access-8nsqj\") pod \"network-metrics-daemon-fsnj5\" (UID: \"a022a0ca-5e80-43a6-8ee0-69dcf197d1a8\") " pod="openshift-multus/network-metrics-daemon-fsnj5" Apr 24 14:24:26.336736 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.336443 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-host-run-ovn-kubernetes\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.336736 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.336466 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-multus-cni-dir\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.338017 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.336489 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqlsf\" (UniqueName: \"kubernetes.io/projected/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-kube-api-access-lqlsf\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.338017 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.336541 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 14:24:26.338017 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.336823 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 14:24:26.338017 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.337141 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 14:24:26.338017 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.337158 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bmpbb" Apr 24 14:24:26.338017 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.337290 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-dtb4c\"" Apr 24 14:24:26.339009 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.338953 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 14:24:26.339104 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.339030 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-x9zq7\"" Apr 24 14:24:26.339104 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.339077 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 14:24:26.363250 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.363209 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 14:19:25 +0000 UTC" deadline="2027-12-13 03:35:42.515502777 +0000 UTC" Apr 24 14:24:26.363250 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.363249 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14341h11m16.152257371s" Apr 24 14:24:26.427019 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.426977 2571 desired_state_of_world_populator.go:158] 
"Finished populating initial desired state of world" Apr 24 14:24:26.437435 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.437403 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lqlsf\" (UniqueName: \"kubernetes.io/projected/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-kube-api-access-lqlsf\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.437582 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.437449 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/15cb0256-87b8-40a0-812d-7931f453c264-agent-certs\") pod \"konnectivity-agent-v9wvt\" (UID: \"15cb0256-87b8-40a0-812d-7931f453c264\") " pod="kube-system/konnectivity-agent-v9wvt" Apr 24 14:24:26.437582 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.437481 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0b999a46-88fd-4d1a-a6e6-11c90708c270-registration-dir\") pod \"aws-ebs-csi-driver-node-4bhjw\" (UID: \"0b999a46-88fd-4d1a-a6e6-11c90708c270\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4bhjw" Apr 24 14:24:26.437582 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.437511 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-systemd-units\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.437582 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.437556 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6c91c545-ee17-43ad-8a08-42be9b2cda48-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-42l4z\" (UID: \"6c91c545-ee17-43ad-8a08-42be9b2cda48\") " pod="openshift-multus/multus-additional-cni-plugins-42l4z" Apr 24 14:24:26.437582 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.437569 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-systemd-units\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.437838 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.437591 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9ce961e4-1141-4ebb-83ff-6ef501b710dd-etc-sysconfig\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c" Apr 24 14:24:26.437838 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.437609 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9ce961e4-1141-4ebb-83ff-6ef501b710dd-etc-sysctl-d\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c" Apr 24 14:24:26.437838 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.437627 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/87bcc27f-4b5f-48a3-9ae3-f93cb520eea0-serviceca\") pod \"node-ca-hgntn\" (UID: \"87bcc27f-4b5f-48a3-9ae3-f93cb520eea0\") " pod="openshift-image-registry/node-ca-hgntn" Apr 24 14:24:26.437838 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.437679 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9ce961e4-1141-4ebb-83ff-6ef501b710dd-etc-sysconfig\") pod 
\"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c" Apr 24 14:24:26.437838 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.437732 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cq4c\" (UniqueName: \"kubernetes.io/projected/b97fcfbf-fa5a-4b45-8446-f25172d545bb-kube-api-access-2cq4c\") pod \"node-resolver-bmpbb\" (UID: \"b97fcfbf-fa5a-4b45-8446-f25172d545bb\") " pod="openshift-dns/node-resolver-bmpbb" Apr 24 14:24:26.437838 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.437756 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6c91c545-ee17-43ad-8a08-42be9b2cda48-tuning-conf-dir\") pod \"multus-additional-cni-plugins-42l4z\" (UID: \"6c91c545-ee17-43ad-8a08-42be9b2cda48\") " pod="openshift-multus/multus-additional-cni-plugins-42l4z" Apr 24 14:24:26.437838 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.437768 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-host-kubelet\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.437838 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.437780 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9ce961e4-1141-4ebb-83ff-6ef501b710dd-etc-sysctl-d\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c" Apr 24 14:24:26.437838 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.437795 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/a7b9926d-4f53-4532-8669-16af4fc30cfd-ovnkube-config\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.437838 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.437816 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-host-kubelet\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.437838 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.437821 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a7b9926d-4f53-4532-8669-16af4fc30cfd-ovnkube-script-lib\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.438395 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.437862 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mf78d\" (UniqueName: \"kubernetes.io/projected/4ebd1686-821a-4fb0-b091-8a636b80f78e-kube-api-access-mf78d\") pod \"network-check-target-dgzq2\" (UID: \"4ebd1686-821a-4fb0-b091-8a636b80f78e\") " pod="openshift-network-diagnostics/network-check-target-dgzq2" Apr 24 14:24:26.438395 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.437886 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9ce961e4-1141-4ebb-83ff-6ef501b710dd-etc-sysctl-conf\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c" Apr 24 14:24:26.438395 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.437913 2571 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9ce961e4-1141-4ebb-83ff-6ef501b710dd-etc-systemd\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c" Apr 24 14:24:26.438395 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.437936 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9ce961e4-1141-4ebb-83ff-6ef501b710dd-tmp\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c" Apr 24 14:24:26.438395 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.437962 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9ce961e4-1141-4ebb-83ff-6ef501b710dd-etc-modprobe-d\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c" Apr 24 14:24:26.438395 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438004 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/87bcc27f-4b5f-48a3-9ae3-f93cb520eea0-host\") pod \"node-ca-hgntn\" (UID: \"87bcc27f-4b5f-48a3-9ae3-f93cb520eea0\") " pod="openshift-image-registry/node-ca-hgntn" Apr 24 14:24:26.438395 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438028 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-host-slash\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.438395 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438051 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-run-systemd\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.438395 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438077 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-run-ovn\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.438395 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438089 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/87bcc27f-4b5f-48a3-9ae3-f93cb520eea0-serviceca\") pod \"node-ca-hgntn\" (UID: \"87bcc27f-4b5f-48a3-9ae3-f93cb520eea0\") " pod="openshift-image-registry/node-ca-hgntn" Apr 24 14:24:26.438395 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438101 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a7b9926d-4f53-4532-8669-16af4fc30cfd-env-overrides\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.438395 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438127 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q97xb\" (UniqueName: \"kubernetes.io/projected/a7b9926d-4f53-4532-8669-16af4fc30cfd-kube-api-access-q97xb\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.438395 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438150 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k24kx\" (UniqueName: 
\"kubernetes.io/projected/6c91c545-ee17-43ad-8a08-42be9b2cda48-kube-api-access-k24kx\") pod \"multus-additional-cni-plugins-42l4z\" (UID: \"6c91c545-ee17-43ad-8a08-42be9b2cda48\") " pod="openshift-multus/multus-additional-cni-plugins-42l4z" Apr 24 14:24:26.438395 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438170 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9ce961e4-1141-4ebb-83ff-6ef501b710dd-etc-systemd\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c" Apr 24 14:24:26.438395 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438176 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9ce961e4-1141-4ebb-83ff-6ef501b710dd-sys\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c" Apr 24 14:24:26.438395 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438206 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9ce961e4-1141-4ebb-83ff-6ef501b710dd-etc-tuned\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c" Apr 24 14:24:26.438395 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438209 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9ce961e4-1141-4ebb-83ff-6ef501b710dd-etc-modprobe-d\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c" Apr 24 14:24:26.438395 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438222 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-host-run-multus-certs\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.439268 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438222 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9ce961e4-1141-4ebb-83ff-6ef501b710dd-sys\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c" Apr 24 14:24:26.439268 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438249 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-host-run-multus-certs\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.439268 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438288 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-var-lib-openvswitch\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.439268 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438316 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a7b9926d-4f53-4532-8669-16af4fc30cfd-ovn-node-metrics-cert\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.439268 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438339 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-system-cni-dir\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.439268 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438362 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9ce961e4-1141-4ebb-83ff-6ef501b710dd-etc-sysctl-conf\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c" Apr 24 14:24:26.439268 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438365 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9ce961e4-1141-4ebb-83ff-6ef501b710dd-host\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c" Apr 24 14:24:26.439268 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438400 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8w7wq\" (UniqueName: \"kubernetes.io/projected/87bcc27f-4b5f-48a3-9ae3-f93cb520eea0-kube-api-access-8w7wq\") pod \"node-ca-hgntn\" (UID: \"87bcc27f-4b5f-48a3-9ae3-f93cb520eea0\") " pod="openshift-image-registry/node-ca-hgntn" Apr 24 14:24:26.439268 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438408 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9ce961e4-1141-4ebb-83ff-6ef501b710dd-host\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c" Apr 24 14:24:26.439268 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438412 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/a7b9926d-4f53-4532-8669-16af4fc30cfd-ovnkube-script-lib\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.439268 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438427 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-run-openvswitch\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.439268 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438450 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b999a46-88fd-4d1a-a6e6-11c90708c270-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4bhjw\" (UID: \"0b999a46-88fd-4d1a-a6e6-11c90708c270\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4bhjw" Apr 24 14:24:26.439268 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438452 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/87bcc27f-4b5f-48a3-9ae3-f93cb520eea0-host\") pod \"node-ca-hgntn\" (UID: \"87bcc27f-4b5f-48a3-9ae3-f93cb520eea0\") " pod="openshift-image-registry/node-ca-hgntn" Apr 24 14:24:26.439268 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438472 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a7b9926d-4f53-4532-8669-16af4fc30cfd-ovnkube-config\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.439268 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438484 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-run-openvswitch\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.439268 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438533 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6c91c545-ee17-43ad-8a08-42be9b2cda48-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-42l4z\" (UID: \"6c91c545-ee17-43ad-8a08-42be9b2cda48\") " pod="openshift-multus/multus-additional-cni-plugins-42l4z" Apr 24 14:24:26.439268 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438610 2571 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 14:24:26.439268 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438643 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-cni-binary-copy\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.440064 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438681 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-run-systemd\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.440064 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438688 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/15cb0256-87b8-40a0-812d-7931f453c264-konnectivity-ca\") pod \"konnectivity-agent-v9wvt\" (UID: \"15cb0256-87b8-40a0-812d-7931f453c264\") " pod="kube-system/konnectivity-agent-v9wvt" Apr 24 14:24:26.440064 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438716 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-host-slash\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.440064 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438640 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-var-lib-openvswitch\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.440064 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438717 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0b999a46-88fd-4d1a-a6e6-11c90708c270-socket-dir\") pod \"aws-ebs-csi-driver-node-4bhjw\" (UID: \"0b999a46-88fd-4d1a-a6e6-11c90708c270\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4bhjw" Apr 24 14:24:26.440064 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438861 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-log-socket\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.440064 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438890 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-multus-socket-dir-parent\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.440064 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438917 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hrh5q\" (UniqueName: \"kubernetes.io/projected/9ce961e4-1141-4ebb-83ff-6ef501b710dd-kube-api-access-hrh5q\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c" Apr 24 14:24:26.440064 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438937 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-log-socket\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.440064 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.438962 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-run-ovn\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.440064 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.439022 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-multus-socket-dir-parent\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.440064 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.439153 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b97fcfbf-fa5a-4b45-8446-f25172d545bb-tmp-dir\") pod \"node-resolver-bmpbb\" (UID: \"b97fcfbf-fa5a-4b45-8446-f25172d545bb\") " pod="openshift-dns/node-resolver-bmpbb" Apr 24 14:24:26.440064 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.439166 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-system-cni-dir\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.440064 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.439191 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0b999a46-88fd-4d1a-a6e6-11c90708c270-device-dir\") pod \"aws-ebs-csi-driver-node-4bhjw\" (UID: \"0b999a46-88fd-4d1a-a6e6-11c90708c270\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4bhjw" Apr 24 14:24:26.440064 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.439210 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6c91c545-ee17-43ad-8a08-42be9b2cda48-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-42l4z\" (UID: \"6c91c545-ee17-43ad-8a08-42be9b2cda48\") " pod="openshift-multus/multus-additional-cni-plugins-42l4z" Apr 24 14:24:26.440064 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.439220 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-cni-binary-copy\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.440064 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.439228 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-host-run-netns\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.441386 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.439259 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-cnibin\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.441386 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.439273 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-host-run-netns\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.441386 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.439291 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a7b9926d-4f53-4532-8669-16af4fc30cfd-env-overrides\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.441386 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.439305 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-cnibin\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.441386 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.439380 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-os-release\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.441386 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.439422 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-multus-conf-dir\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.441386 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.439446 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-etc-openvswitch\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.441386 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.439472 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6c91c545-ee17-43ad-8a08-42be9b2cda48-cnibin\") pod \"multus-additional-cni-plugins-42l4z\" (UID: \"6c91c545-ee17-43ad-8a08-42be9b2cda48\") " pod="openshift-multus/multus-additional-cni-plugins-42l4z" Apr 24 14:24:26.441386 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.439499 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6c91c545-ee17-43ad-8a08-42be9b2cda48-os-release\") pod \"multus-additional-cni-plugins-42l4z\" (UID: \"6c91c545-ee17-43ad-8a08-42be9b2cda48\") " pod="openshift-multus/multus-additional-cni-plugins-42l4z" Apr 24 14:24:26.441386 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.439477 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-os-release\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.441386 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.439513 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-multus-conf-dir\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.441386 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.439528 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-hostroot\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.441386 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.439553 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ce961e4-1141-4ebb-83ff-6ef501b710dd-etc-kubernetes\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c" Apr 24 14:24:26.441386 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.439553 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-etc-openvswitch\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.441386 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.439582 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/865ca2f7-3380-490c-b40d-e9c4fb7c799a-iptables-alerter-script\") pod \"iptables-alerter-j9xv9\" (UID: \"865ca2f7-3380-490c-b40d-e9c4fb7c799a\") " pod="openshift-network-operator/iptables-alerter-j9xv9" Apr 24 14:24:26.441386 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.439584 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6c91c545-ee17-43ad-8a08-42be9b2cda48-os-release\") pod \"multus-additional-cni-plugins-42l4z\" (UID: \"6c91c545-ee17-43ad-8a08-42be9b2cda48\") " pod="openshift-multus/multus-additional-cni-plugins-42l4z" Apr 24 14:24:26.441386 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.439596 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6c91c545-ee17-43ad-8a08-42be9b2cda48-cnibin\") pod \"multus-additional-cni-plugins-42l4z\" (UID: \"6c91c545-ee17-43ad-8a08-42be9b2cda48\") " pod="openshift-multus/multus-additional-cni-plugins-42l4z" Apr 24 14:24:26.441386 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.439620 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-hostroot\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.442267 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.439637 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ce961e4-1141-4ebb-83ff-6ef501b710dd-etc-kubernetes\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c" Apr 24 14:24:26.442267 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.439653 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-vrd4x\" (UniqueName: \"kubernetes.io/projected/865ca2f7-3380-490c-b40d-e9c4fb7c799a-kube-api-access-vrd4x\") pod \"iptables-alerter-j9xv9\" (UID: \"865ca2f7-3380-490c-b40d-e9c4fb7c799a\") " pod="openshift-network-operator/iptables-alerter-j9xv9" Apr 24 14:24:26.442267 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.439679 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0b999a46-88fd-4d1a-a6e6-11c90708c270-etc-selinux\") pod \"aws-ebs-csi-driver-node-4bhjw\" (UID: \"0b999a46-88fd-4d1a-a6e6-11c90708c270\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4bhjw" Apr 24 14:24:26.442267 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.439711 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a022a0ca-5e80-43a6-8ee0-69dcf197d1a8-metrics-certs\") pod \"network-metrics-daemon-fsnj5\" (UID: \"a022a0ca-5e80-43a6-8ee0-69dcf197d1a8\") " pod="openshift-multus/network-metrics-daemon-fsnj5" Apr 24 14:24:26.442267 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.439746 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-host-cni-bin\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.442267 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.439780 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-host-cni-netd\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.442267 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:26.439809 2571 
secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:26.442267 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.439821 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-host-cni-bin\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.442267 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.439830 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-host-cni-netd\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.442267 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.439855 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6c91c545-ee17-43ad-8a08-42be9b2cda48-cni-binary-copy\") pod \"multus-additional-cni-plugins-42l4z\" (UID: \"6c91c545-ee17-43ad-8a08-42be9b2cda48\") " pod="openshift-multus/multus-additional-cni-plugins-42l4z" Apr 24 14:24:26.442267 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:26.439882 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a022a0ca-5e80-43a6-8ee0-69dcf197d1a8-metrics-certs podName:a022a0ca-5e80-43a6-8ee0-69dcf197d1a8 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:26.939851956 +0000 UTC m=+3.069829502 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a022a0ca-5e80-43a6-8ee0-69dcf197d1a8-metrics-certs") pod "network-metrics-daemon-fsnj5" (UID: "a022a0ca-5e80-43a6-8ee0-69dcf197d1a8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:26.442267 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.439911 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-host-run-k8s-cni-cncf-io\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.442267 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.439938 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-host-run-netns\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.442267 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.439962 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-host-var-lib-kubelet\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.442267 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.440007 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0b999a46-88fd-4d1a-a6e6-11c90708c270-sys-fs\") pod \"aws-ebs-csi-driver-node-4bhjw\" (UID: \"0b999a46-88fd-4d1a-a6e6-11c90708c270\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4bhjw" Apr 24 14:24:26.442267 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.440025 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-host-run-netns\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.442267 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.440036 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-multus-daemon-config\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.442893 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.440067 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-etc-kubernetes\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.442893 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.440081 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-host-var-lib-kubelet\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.442893 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.440094 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6c91c545-ee17-43ad-8a08-42be9b2cda48-system-cni-dir\") pod \"multus-additional-cni-plugins-42l4z\" (UID: \"6c91c545-ee17-43ad-8a08-42be9b2cda48\") " pod="openshift-multus/multus-additional-cni-plugins-42l4z" Apr 24 14:24:26.442893 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.440123 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-node-log\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.442893 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.440123 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-host-run-k8s-cni-cncf-io\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.442893 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.440151 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6c91c545-ee17-43ad-8a08-42be9b2cda48-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-42l4z\" (UID: \"6c91c545-ee17-43ad-8a08-42be9b2cda48\") " pod="openshift-multus/multus-additional-cni-plugins-42l4z" Apr 24 14:24:26.442893 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.440171 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-etc-kubernetes\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.442893 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.440180 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-host-var-lib-cni-multus\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.442893 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.440312 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-host-var-lib-cni-multus\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.442893 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.440345 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9ce961e4-1141-4ebb-83ff-6ef501b710dd-lib-modules\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c" Apr 24 14:24:26.442893 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.440373 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.442893 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.440398 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-host-var-lib-cni-bin\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.442893 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.440457 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6c91c545-ee17-43ad-8a08-42be9b2cda48-system-cni-dir\") pod \"multus-additional-cni-plugins-42l4z\" (UID: \"6c91c545-ee17-43ad-8a08-42be9b2cda48\") " pod="openshift-multus/multus-additional-cni-plugins-42l4z" Apr 24 14:24:26.442893 
ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.440458 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.442893 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.440486 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-host-var-lib-cni-bin\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.442893 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.440495 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-node-log\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.442893 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.440524 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9ce961e4-1141-4ebb-83ff-6ef501b710dd-run\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c" Apr 24 14:24:26.443416 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.440556 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ce961e4-1141-4ebb-83ff-6ef501b710dd-var-lib-kubelet\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c" Apr 24 14:24:26.443416 
ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.440562 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9ce961e4-1141-4ebb-83ff-6ef501b710dd-lib-modules\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c" Apr 24 14:24:26.443416 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.440578 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-multus-daemon-config\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.443416 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.440590 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b97fcfbf-fa5a-4b45-8446-f25172d545bb-hosts-file\") pod \"node-resolver-bmpbb\" (UID: \"b97fcfbf-fa5a-4b45-8446-f25172d545bb\") " pod="openshift-dns/node-resolver-bmpbb" Apr 24 14:24:26.443416 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.440604 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9ce961e4-1141-4ebb-83ff-6ef501b710dd-run\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c" Apr 24 14:24:26.443416 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.440638 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/865ca2f7-3380-490c-b40d-e9c4fb7c799a-host-slash\") pod \"iptables-alerter-j9xv9\" (UID: \"865ca2f7-3380-490c-b40d-e9c4fb7c799a\") " pod="openshift-network-operator/iptables-alerter-j9xv9" Apr 24 14:24:26.443416 ip-10-0-137-95 
kubenswrapper[2571]: I0424 14:24:26.440666 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57xb9\" (UniqueName: \"kubernetes.io/projected/0b999a46-88fd-4d1a-a6e6-11c90708c270-kube-api-access-57xb9\") pod \"aws-ebs-csi-driver-node-4bhjw\" (UID: \"0b999a46-88fd-4d1a-a6e6-11c90708c270\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4bhjw" Apr 24 14:24:26.443416 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.440718 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ce961e4-1141-4ebb-83ff-6ef501b710dd-var-lib-kubelet\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c" Apr 24 14:24:26.443416 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.440745 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nsqj\" (UniqueName: \"kubernetes.io/projected/a022a0ca-5e80-43a6-8ee0-69dcf197d1a8-kube-api-access-8nsqj\") pod \"network-metrics-daemon-fsnj5\" (UID: \"a022a0ca-5e80-43a6-8ee0-69dcf197d1a8\") " pod="openshift-multus/network-metrics-daemon-fsnj5" Apr 24 14:24:26.443416 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.440771 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-host-run-ovn-kubernetes\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.443416 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.440797 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-multus-cni-dir\") pod \"multus-gwdlv\" (UID: 
\"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.443416 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.440920 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a7b9926d-4f53-4532-8669-16af4fc30cfd-host-run-ovn-kubernetes\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.443416 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.440974 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6c91c545-ee17-43ad-8a08-42be9b2cda48-cni-binary-copy\") pod \"multus-additional-cni-plugins-42l4z\" (UID: \"6c91c545-ee17-43ad-8a08-42be9b2cda48\") " pod="openshift-multus/multus-additional-cni-plugins-42l4z" Apr 24 14:24:26.443416 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.440975 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6c91c545-ee17-43ad-8a08-42be9b2cda48-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-42l4z\" (UID: \"6c91c545-ee17-43ad-8a08-42be9b2cda48\") " pod="openshift-multus/multus-additional-cni-plugins-42l4z" Apr 24 14:24:26.443416 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.441042 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-multus-cni-dir\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.443416 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.442491 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9ce961e4-1141-4ebb-83ff-6ef501b710dd-tmp\") pod 
\"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c" Apr 24 14:24:26.443416 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.442521 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9ce961e4-1141-4ebb-83ff-6ef501b710dd-etc-tuned\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c" Apr 24 14:24:26.444059 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.442587 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a7b9926d-4f53-4532-8669-16af4fc30cfd-ovn-node-metrics-cert\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.448556 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:26.448536 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:24:26.448644 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:26.448563 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:24:26.448644 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:26.448578 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mf78d for pod openshift-network-diagnostics/network-check-target-dgzq2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:26.448644 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:26.448635 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/4ebd1686-821a-4fb0-b091-8a636b80f78e-kube-api-access-mf78d podName:4ebd1686-821a-4fb0-b091-8a636b80f78e nodeName:}" failed. No retries permitted until 2026-04-24 14:24:26.948618256 +0000 UTC m=+3.078595810 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-mf78d" (UniqueName: "kubernetes.io/projected/4ebd1686-821a-4fb0-b091-8a636b80f78e-kube-api-access-mf78d") pod "network-check-target-dgzq2" (UID: "4ebd1686-821a-4fb0-b091-8a636b80f78e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:26.448895 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.448842 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqlsf\" (UniqueName: \"kubernetes.io/projected/5b200c2b-497d-4b24-9fcb-1ed9a2b007c1-kube-api-access-lqlsf\") pod \"multus-gwdlv\" (UID: \"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1\") " pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.450970 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.450951 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nsqj\" (UniqueName: \"kubernetes.io/projected/a022a0ca-5e80-43a6-8ee0-69dcf197d1a8-kube-api-access-8nsqj\") pod \"network-metrics-daemon-fsnj5\" (UID: \"a022a0ca-5e80-43a6-8ee0-69dcf197d1a8\") " pod="openshift-multus/network-metrics-daemon-fsnj5" Apr 24 14:24:26.451100 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.451072 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q97xb\" (UniqueName: \"kubernetes.io/projected/a7b9926d-4f53-4532-8669-16af4fc30cfd-kube-api-access-q97xb\") pod \"ovnkube-node-7m9zt\" (UID: \"a7b9926d-4f53-4532-8669-16af4fc30cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.451255 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.451239 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w7wq\" (UniqueName: \"kubernetes.io/projected/87bcc27f-4b5f-48a3-9ae3-f93cb520eea0-kube-api-access-8w7wq\") pod \"node-ca-hgntn\" (UID: \"87bcc27f-4b5f-48a3-9ae3-f93cb520eea0\") " pod="openshift-image-registry/node-ca-hgntn" Apr 24 14:24:26.451306 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.451242 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrh5q\" (UniqueName: \"kubernetes.io/projected/9ce961e4-1141-4ebb-83ff-6ef501b710dd-kube-api-access-hrh5q\") pod \"tuned-fhn6c\" (UID: \"9ce961e4-1141-4ebb-83ff-6ef501b710dd\") " pod="openshift-cluster-node-tuning-operator/tuned-fhn6c" Apr 24 14:24:26.451448 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.451430 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k24kx\" (UniqueName: \"kubernetes.io/projected/6c91c545-ee17-43ad-8a08-42be9b2cda48-kube-api-access-k24kx\") pod \"multus-additional-cni-plugins-42l4z\" (UID: \"6c91c545-ee17-43ad-8a08-42be9b2cda48\") " pod="openshift-multus/multus-additional-cni-plugins-42l4z" Apr 24 14:24:26.541641 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.541598 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b999a46-88fd-4d1a-a6e6-11c90708c270-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4bhjw\" (UID: \"0b999a46-88fd-4d1a-a6e6-11c90708c270\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4bhjw" Apr 24 14:24:26.541641 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.541646 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/15cb0256-87b8-40a0-812d-7931f453c264-konnectivity-ca\") pod \"konnectivity-agent-v9wvt\" (UID: \"15cb0256-87b8-40a0-812d-7931f453c264\") " pod="kube-system/konnectivity-agent-v9wvt" Apr 24 
14:24:26.541884 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.541673 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0b999a46-88fd-4d1a-a6e6-11c90708c270-socket-dir\") pod \"aws-ebs-csi-driver-node-4bhjw\" (UID: \"0b999a46-88fd-4d1a-a6e6-11c90708c270\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4bhjw" Apr 24 14:24:26.541884 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.541697 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b97fcfbf-fa5a-4b45-8446-f25172d545bb-tmp-dir\") pod \"node-resolver-bmpbb\" (UID: \"b97fcfbf-fa5a-4b45-8446-f25172d545bb\") " pod="openshift-dns/node-resolver-bmpbb" Apr 24 14:24:26.541884 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.541731 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0b999a46-88fd-4d1a-a6e6-11c90708c270-device-dir\") pod \"aws-ebs-csi-driver-node-4bhjw\" (UID: \"0b999a46-88fd-4d1a-a6e6-11c90708c270\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4bhjw" Apr 24 14:24:26.541884 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.541739 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b999a46-88fd-4d1a-a6e6-11c90708c270-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4bhjw\" (UID: \"0b999a46-88fd-4d1a-a6e6-11c90708c270\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4bhjw" Apr 24 14:24:26.541884 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.541764 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/865ca2f7-3380-490c-b40d-e9c4fb7c799a-iptables-alerter-script\") pod \"iptables-alerter-j9xv9\" (UID: 
\"865ca2f7-3380-490c-b40d-e9c4fb7c799a\") " pod="openshift-network-operator/iptables-alerter-j9xv9" Apr 24 14:24:26.541884 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.541790 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vrd4x\" (UniqueName: \"kubernetes.io/projected/865ca2f7-3380-490c-b40d-e9c4fb7c799a-kube-api-access-vrd4x\") pod \"iptables-alerter-j9xv9\" (UID: \"865ca2f7-3380-490c-b40d-e9c4fb7c799a\") " pod="openshift-network-operator/iptables-alerter-j9xv9" Apr 24 14:24:26.541884 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.541814 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0b999a46-88fd-4d1a-a6e6-11c90708c270-etc-selinux\") pod \"aws-ebs-csi-driver-node-4bhjw\" (UID: \"0b999a46-88fd-4d1a-a6e6-11c90708c270\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4bhjw" Apr 24 14:24:26.541884 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.541854 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0b999a46-88fd-4d1a-a6e6-11c90708c270-sys-fs\") pod \"aws-ebs-csi-driver-node-4bhjw\" (UID: \"0b999a46-88fd-4d1a-a6e6-11c90708c270\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4bhjw" Apr 24 14:24:26.541884 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.541877 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0b999a46-88fd-4d1a-a6e6-11c90708c270-socket-dir\") pod \"aws-ebs-csi-driver-node-4bhjw\" (UID: \"0b999a46-88fd-4d1a-a6e6-11c90708c270\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4bhjw" Apr 24 14:24:26.541884 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.541888 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/b97fcfbf-fa5a-4b45-8446-f25172d545bb-hosts-file\") pod \"node-resolver-bmpbb\" (UID: \"b97fcfbf-fa5a-4b45-8446-f25172d545bb\") " pod="openshift-dns/node-resolver-bmpbb" Apr 24 14:24:26.542408 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.541912 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/865ca2f7-3380-490c-b40d-e9c4fb7c799a-host-slash\") pod \"iptables-alerter-j9xv9\" (UID: \"865ca2f7-3380-490c-b40d-e9c4fb7c799a\") " pod="openshift-network-operator/iptables-alerter-j9xv9" Apr 24 14:24:26.542408 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.541936 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-57xb9\" (UniqueName: \"kubernetes.io/projected/0b999a46-88fd-4d1a-a6e6-11c90708c270-kube-api-access-57xb9\") pod \"aws-ebs-csi-driver-node-4bhjw\" (UID: \"0b999a46-88fd-4d1a-a6e6-11c90708c270\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4bhjw" Apr 24 14:24:26.542408 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.541954 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0b999a46-88fd-4d1a-a6e6-11c90708c270-sys-fs\") pod \"aws-ebs-csi-driver-node-4bhjw\" (UID: \"0b999a46-88fd-4d1a-a6e6-11c90708c270\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4bhjw" Apr 24 14:24:26.542408 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.541959 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0b999a46-88fd-4d1a-a6e6-11c90708c270-etc-selinux\") pod \"aws-ebs-csi-driver-node-4bhjw\" (UID: \"0b999a46-88fd-4d1a-a6e6-11c90708c270\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4bhjw" Apr 24 14:24:26.542408 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.542029 2571 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/865ca2f7-3380-490c-b40d-e9c4fb7c799a-host-slash\") pod \"iptables-alerter-j9xv9\" (UID: \"865ca2f7-3380-490c-b40d-e9c4fb7c799a\") " pod="openshift-network-operator/iptables-alerter-j9xv9" Apr 24 14:24:26.542408 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.542073 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b97fcfbf-fa5a-4b45-8446-f25172d545bb-hosts-file\") pod \"node-resolver-bmpbb\" (UID: \"b97fcfbf-fa5a-4b45-8446-f25172d545bb\") " pod="openshift-dns/node-resolver-bmpbb" Apr 24 14:24:26.542408 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.541966 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/15cb0256-87b8-40a0-812d-7931f453c264-agent-certs\") pod \"konnectivity-agent-v9wvt\" (UID: \"15cb0256-87b8-40a0-812d-7931f453c264\") " pod="kube-system/konnectivity-agent-v9wvt" Apr 24 14:24:26.542408 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.542105 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b97fcfbf-fa5a-4b45-8446-f25172d545bb-tmp-dir\") pod \"node-resolver-bmpbb\" (UID: \"b97fcfbf-fa5a-4b45-8446-f25172d545bb\") " pod="openshift-dns/node-resolver-bmpbb" Apr 24 14:24:26.542408 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.542112 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0b999a46-88fd-4d1a-a6e6-11c90708c270-registration-dir\") pod \"aws-ebs-csi-driver-node-4bhjw\" (UID: \"0b999a46-88fd-4d1a-a6e6-11c90708c270\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4bhjw" Apr 24 14:24:26.542408 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.542151 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0b999a46-88fd-4d1a-a6e6-11c90708c270-registration-dir\") pod \"aws-ebs-csi-driver-node-4bhjw\" (UID: \"0b999a46-88fd-4d1a-a6e6-11c90708c270\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4bhjw" Apr 24 14:24:26.542408 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.542170 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2cq4c\" (UniqueName: \"kubernetes.io/projected/b97fcfbf-fa5a-4b45-8446-f25172d545bb-kube-api-access-2cq4c\") pod \"node-resolver-bmpbb\" (UID: \"b97fcfbf-fa5a-4b45-8446-f25172d545bb\") " pod="openshift-dns/node-resolver-bmpbb" Apr 24 14:24:26.542408 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.542170 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0b999a46-88fd-4d1a-a6e6-11c90708c270-device-dir\") pod \"aws-ebs-csi-driver-node-4bhjw\" (UID: \"0b999a46-88fd-4d1a-a6e6-11c90708c270\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4bhjw" Apr 24 14:24:26.542408 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.542331 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/865ca2f7-3380-490c-b40d-e9c4fb7c799a-iptables-alerter-script\") pod \"iptables-alerter-j9xv9\" (UID: \"865ca2f7-3380-490c-b40d-e9c4fb7c799a\") " pod="openshift-network-operator/iptables-alerter-j9xv9" Apr 24 14:24:26.542408 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.542371 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/15cb0256-87b8-40a0-812d-7931f453c264-konnectivity-ca\") pod \"konnectivity-agent-v9wvt\" (UID: \"15cb0256-87b8-40a0-812d-7931f453c264\") " pod="kube-system/konnectivity-agent-v9wvt" Apr 24 14:24:26.544964 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.544938 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/15cb0256-87b8-40a0-812d-7931f453c264-agent-certs\") pod \"konnectivity-agent-v9wvt\" (UID: \"15cb0256-87b8-40a0-812d-7931f453c264\") " pod="kube-system/konnectivity-agent-v9wvt" Apr 24 14:24:26.551273 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.551247 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cq4c\" (UniqueName: \"kubernetes.io/projected/b97fcfbf-fa5a-4b45-8446-f25172d545bb-kube-api-access-2cq4c\") pod \"node-resolver-bmpbb\" (UID: \"b97fcfbf-fa5a-4b45-8446-f25172d545bb\") " pod="openshift-dns/node-resolver-bmpbb" Apr 24 14:24:26.551427 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.551411 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrd4x\" (UniqueName: \"kubernetes.io/projected/865ca2f7-3380-490c-b40d-e9c4fb7c799a-kube-api-access-vrd4x\") pod \"iptables-alerter-j9xv9\" (UID: \"865ca2f7-3380-490c-b40d-e9c4fb7c799a\") " pod="openshift-network-operator/iptables-alerter-j9xv9" Apr 24 14:24:26.551479 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.551463 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-57xb9\" (UniqueName: \"kubernetes.io/projected/0b999a46-88fd-4d1a-a6e6-11c90708c270-kube-api-access-57xb9\") pod \"aws-ebs-csi-driver-node-4bhjw\" (UID: \"0b999a46-88fd-4d1a-a6e6-11c90708c270\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4bhjw" Apr 24 14:24:26.630544 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.630506 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-fhn6c" Apr 24 14:24:26.636268 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.636251 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-hgntn" Apr 24 14:24:26.645043 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.645018 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:26.649648 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.649629 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-42l4z" Apr 24 14:24:26.656254 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.656235 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-gwdlv" Apr 24 14:24:26.662844 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.662824 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-v9wvt" Apr 24 14:24:26.669151 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.669135 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4bhjw" Apr 24 14:24:26.678674 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.678654 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-j9xv9" Apr 24 14:24:26.684161 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.684136 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-bmpbb" Apr 24 14:24:26.749861 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.749835 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 14:24:26.946366 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:26.946281 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a022a0ca-5e80-43a6-8ee0-69dcf197d1a8-metrics-certs\") pod \"network-metrics-daemon-fsnj5\" (UID: \"a022a0ca-5e80-43a6-8ee0-69dcf197d1a8\") " pod="openshift-multus/network-metrics-daemon-fsnj5" Apr 24 14:24:26.946521 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:26.946410 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:26.946521 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:26.946479 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a022a0ca-5e80-43a6-8ee0-69dcf197d1a8-metrics-certs podName:a022a0ca-5e80-43a6-8ee0-69dcf197d1a8 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:27.946459271 +0000 UTC m=+4.076436834 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a022a0ca-5e80-43a6-8ee0-69dcf197d1a8-metrics-certs") pod "network-metrics-daemon-fsnj5" (UID: "a022a0ca-5e80-43a6-8ee0-69dcf197d1a8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:27.047022 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:27.046965 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mf78d\" (UniqueName: \"kubernetes.io/projected/4ebd1686-821a-4fb0-b091-8a636b80f78e-kube-api-access-mf78d\") pod \"network-check-target-dgzq2\" (UID: \"4ebd1686-821a-4fb0-b091-8a636b80f78e\") " pod="openshift-network-diagnostics/network-check-target-dgzq2" Apr 24 14:24:27.047198 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:27.047129 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:24:27.047198 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:27.047154 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:24:27.047198 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:27.047170 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mf78d for pod openshift-network-diagnostics/network-check-target-dgzq2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:27.047357 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:27.047229 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4ebd1686-821a-4fb0-b091-8a636b80f78e-kube-api-access-mf78d podName:4ebd1686-821a-4fb0-b091-8a636b80f78e nodeName:}" failed. 
No retries permitted until 2026-04-24 14:24:28.047215248 +0000 UTC m=+4.177192799 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-mf78d" (UniqueName: "kubernetes.io/projected/4ebd1686-821a-4fb0-b091-8a636b80f78e-kube-api-access-mf78d") pod "network-check-target-dgzq2" (UID: "4ebd1686-821a-4fb0-b091-8a636b80f78e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:27.081236 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:27.081197 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87bcc27f_4b5f_48a3_9ae3_f93cb520eea0.slice/crio-cf63b46364d27f379267d68cf4d65ebff046735c0afe4d4695494ae77d7bfa54 WatchSource:0}: Error finding container cf63b46364d27f379267d68cf4d65ebff046735c0afe4d4695494ae77d7bfa54: Status 404 returned error can't find the container with id cf63b46364d27f379267d68cf4d65ebff046735c0afe4d4695494ae77d7bfa54 Apr 24 14:24:27.082355 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:27.082318 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c91c545_ee17_43ad_8a08_42be9b2cda48.slice/crio-f68f6ee7c4d00c418673a9a2a43375d57152790b386b641c4982cf24794c79d8 WatchSource:0}: Error finding container f68f6ee7c4d00c418673a9a2a43375d57152790b386b641c4982cf24794c79d8: Status 404 returned error can't find the container with id f68f6ee7c4d00c418673a9a2a43375d57152790b386b641c4982cf24794c79d8 Apr 24 14:24:27.083322 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:27.083287 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15cb0256_87b8_40a0_812d_7931f453c264.slice/crio-f94836bc1c1f2c04403f611b425a1014bb81db769ec3abfbb8d0b551ce19ca1f WatchSource:0}: Error finding container 
f94836bc1c1f2c04403f611b425a1014bb81db769ec3abfbb8d0b551ce19ca1f: Status 404 returned error can't find the container with id f94836bc1c1f2c04403f611b425a1014bb81db769ec3abfbb8d0b551ce19ca1f Apr 24 14:24:27.084092 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:27.084065 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b999a46_88fd_4d1a_a6e6_11c90708c270.slice/crio-a68431d73b21617a65d89424f56c5e967e78fa9828bfd24bed16b8579a164d43 WatchSource:0}: Error finding container a68431d73b21617a65d89424f56c5e967e78fa9828bfd24bed16b8579a164d43: Status 404 returned error can't find the container with id a68431d73b21617a65d89424f56c5e967e78fa9828bfd24bed16b8579a164d43 Apr 24 14:24:27.086744 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:27.086544 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7b9926d_4f53_4532_8669_16af4fc30cfd.slice/crio-e4791da58663b1124a4efe4d3f415ec4a291bf1d7c69d47965243bb193fedd0a WatchSource:0}: Error finding container e4791da58663b1124a4efe4d3f415ec4a291bf1d7c69d47965243bb193fedd0a: Status 404 returned error can't find the container with id e4791da58663b1124a4efe4d3f415ec4a291bf1d7c69d47965243bb193fedd0a Apr 24 14:24:27.087904 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:27.087812 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod865ca2f7_3380_490c_b40d_e9c4fb7c799a.slice/crio-6c54352bffd96574dcf7fd26d44f9c5ca6783bc324ec92d4e402d0ffa375cdad WatchSource:0}: Error finding container 6c54352bffd96574dcf7fd26d44f9c5ca6783bc324ec92d4e402d0ffa375cdad: Status 404 returned error can't find the container with id 6c54352bffd96574dcf7fd26d44f9c5ca6783bc324ec92d4e402d0ffa375cdad Apr 24 14:24:27.111192 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:27.111169 2571 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b200c2b_497d_4b24_9fcb_1ed9a2b007c1.slice/crio-10d2f8c4e1c66e242e423fc90bd7de2504af4654e06ab7a055072ddd329c3a3f WatchSource:0}: Error finding container 10d2f8c4e1c66e242e423fc90bd7de2504af4654e06ab7a055072ddd329c3a3f: Status 404 returned error can't find the container with id 10d2f8c4e1c66e242e423fc90bd7de2504af4654e06ab7a055072ddd329c3a3f Apr 24 14:24:27.112022 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:27.112000 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ce961e4_1141_4ebb_83ff_6ef501b710dd.slice/crio-c35f2ac2563b38a8ddb681cd859d3f3587fc1219f3b486b442fbc64b2d8d7e09 WatchSource:0}: Error finding container c35f2ac2563b38a8ddb681cd859d3f3587fc1219f3b486b442fbc64b2d8d7e09: Status 404 returned error can't find the container with id c35f2ac2563b38a8ddb681cd859d3f3587fc1219f3b486b442fbc64b2d8d7e09 Apr 24 14:24:27.112909 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:24:27.112886 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb97fcfbf_fa5a_4b45_8446_f25172d545bb.slice/crio-9c770fe19eaa22c4df891daac52243e76f2b03e4ab01cf4ae70b1fb20b74ba83 WatchSource:0}: Error finding container 9c770fe19eaa22c4df891daac52243e76f2b03e4ab01cf4ae70b1fb20b74ba83: Status 404 returned error can't find the container with id 9c770fe19eaa22c4df891daac52243e76f2b03e4ab01cf4ae70b1fb20b74ba83 Apr 24 14:24:27.364324 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:27.364289 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 14:19:25 +0000 UTC" deadline="2027-10-06 20:50:56.835427606 +0000 UTC" Apr 24 14:24:27.364324 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:27.364320 2571 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="12726h26m29.471109933s" Apr 24 14:24:27.456058 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:27.456022 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fsnj5" Apr 24 14:24:27.456244 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:27.456208 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fsnj5" podUID="a022a0ca-5e80-43a6-8ee0-69dcf197d1a8" Apr 24 14:24:27.466111 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:27.466051 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-fhn6c" event={"ID":"9ce961e4-1141-4ebb-83ff-6ef501b710dd","Type":"ContainerStarted","Data":"c35f2ac2563b38a8ddb681cd859d3f3587fc1219f3b486b442fbc64b2d8d7e09"} Apr 24 14:24:27.473831 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:27.473779 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-42l4z" event={"ID":"6c91c545-ee17-43ad-8a08-42be9b2cda48","Type":"ContainerStarted","Data":"f68f6ee7c4d00c418673a9a2a43375d57152790b386b641c4982cf24794c79d8"} Apr 24 14:24:27.477285 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:27.477243 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hgntn" event={"ID":"87bcc27f-4b5f-48a3-9ae3-f93cb520eea0","Type":"ContainerStarted","Data":"cf63b46364d27f379267d68cf4d65ebff046735c0afe4d4695494ae77d7bfa54"} Apr 24 14:24:27.479175 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:27.479110 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bmpbb" 
event={"ID":"b97fcfbf-fa5a-4b45-8446-f25172d545bb","Type":"ContainerStarted","Data":"9c770fe19eaa22c4df891daac52243e76f2b03e4ab01cf4ae70b1fb20b74ba83"} Apr 24 14:24:27.482666 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:27.482630 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-j9xv9" event={"ID":"865ca2f7-3380-490c-b40d-e9c4fb7c799a","Type":"ContainerStarted","Data":"6c54352bffd96574dcf7fd26d44f9c5ca6783bc324ec92d4e402d0ffa375cdad"} Apr 24 14:24:27.484782 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:27.484741 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gwdlv" event={"ID":"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1","Type":"ContainerStarted","Data":"10d2f8c4e1c66e242e423fc90bd7de2504af4654e06ab7a055072ddd329c3a3f"} Apr 24 14:24:27.487543 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:27.487487 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" event={"ID":"a7b9926d-4f53-4532-8669-16af4fc30cfd","Type":"ContainerStarted","Data":"e4791da58663b1124a4efe4d3f415ec4a291bf1d7c69d47965243bb193fedd0a"} Apr 24 14:24:27.490286 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:27.490259 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4bhjw" event={"ID":"0b999a46-88fd-4d1a-a6e6-11c90708c270","Type":"ContainerStarted","Data":"a68431d73b21617a65d89424f56c5e967e78fa9828bfd24bed16b8579a164d43"} Apr 24 14:24:27.496816 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:27.496342 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-v9wvt" event={"ID":"15cb0256-87b8-40a0-812d-7931f453c264","Type":"ContainerStarted","Data":"f94836bc1c1f2c04403f611b425a1014bb81db769ec3abfbb8d0b551ce19ca1f"} Apr 24 14:24:27.500698 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:27.500079 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-137-95.ec2.internal" event={"ID":"c07dfa5590a30bfb9dbec92d1fcd686b","Type":"ContainerStarted","Data":"6db38dd174e1b7e397f692f9d5f48f01be66a4d57edc55b1605ac40493d3eff7"} Apr 24 14:24:27.517521 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:27.517291 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-95.ec2.internal" podStartSLOduration=2.517273662 podStartE2EDuration="2.517273662s" podCreationTimestamp="2026-04-24 14:24:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:24:27.51610671 +0000 UTC m=+3.646084271" watchObservedRunningTime="2026-04-24 14:24:27.517273662 +0000 UTC m=+3.647251250" Apr 24 14:24:27.953764 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:27.953714 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a022a0ca-5e80-43a6-8ee0-69dcf197d1a8-metrics-certs\") pod \"network-metrics-daemon-fsnj5\" (UID: \"a022a0ca-5e80-43a6-8ee0-69dcf197d1a8\") " pod="openshift-multus/network-metrics-daemon-fsnj5" Apr 24 14:24:27.968417 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:27.966165 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:27.968417 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:27.966257 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a022a0ca-5e80-43a6-8ee0-69dcf197d1a8-metrics-certs podName:a022a0ca-5e80-43a6-8ee0-69dcf197d1a8 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:29.966234833 +0000 UTC m=+6.096212387 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a022a0ca-5e80-43a6-8ee0-69dcf197d1a8-metrics-certs") pod "network-metrics-daemon-fsnj5" (UID: "a022a0ca-5e80-43a6-8ee0-69dcf197d1a8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:28.054257 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:28.054220 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mf78d\" (UniqueName: \"kubernetes.io/projected/4ebd1686-821a-4fb0-b091-8a636b80f78e-kube-api-access-mf78d\") pod \"network-check-target-dgzq2\" (UID: \"4ebd1686-821a-4fb0-b091-8a636b80f78e\") " pod="openshift-network-diagnostics/network-check-target-dgzq2" Apr 24 14:24:28.054458 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:28.054439 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:24:28.054543 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:28.054465 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:24:28.054543 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:28.054478 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mf78d for pod openshift-network-diagnostics/network-check-target-dgzq2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:28.054543 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:28.054534 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4ebd1686-821a-4fb0-b091-8a636b80f78e-kube-api-access-mf78d podName:4ebd1686-821a-4fb0-b091-8a636b80f78e nodeName:}" failed. 
No retries permitted until 2026-04-24 14:24:30.054515018 +0000 UTC m=+6.184492570 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-mf78d" (UniqueName: "kubernetes.io/projected/4ebd1686-821a-4fb0-b091-8a636b80f78e-kube-api-access-mf78d") pod "network-check-target-dgzq2" (UID: "4ebd1686-821a-4fb0-b091-8a636b80f78e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:28.459703 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:28.459231 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dgzq2" Apr 24 14:24:28.459703 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:28.459349 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dgzq2" podUID="4ebd1686-821a-4fb0-b091-8a636b80f78e" Apr 24 14:24:28.512683 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:28.511562 2571 generic.go:358] "Generic (PLEG): container finished" podID="46b053d51bc500e88f49a6371b8d013f" containerID="4bda48dd02454354a4d90dbadd47414ed7e943eb340c7c619f80dca3d685f471" exitCode=0 Apr 24 14:24:28.512683 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:28.512455 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-95.ec2.internal" event={"ID":"46b053d51bc500e88f49a6371b8d013f","Type":"ContainerDied","Data":"4bda48dd02454354a4d90dbadd47414ed7e943eb340c7c619f80dca3d685f471"} Apr 24 14:24:29.459792 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:29.459301 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fsnj5" Apr 24 14:24:29.459792 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:29.459443 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fsnj5" podUID="a022a0ca-5e80-43a6-8ee0-69dcf197d1a8" Apr 24 14:24:29.524841 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:29.524155 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-95.ec2.internal" event={"ID":"46b053d51bc500e88f49a6371b8d013f","Type":"ContainerStarted","Data":"6682bc49b55e5cb4015bb6a53356009335bb880c364a6cd1dc69fc26d8b3ea4b"} Apr 24 14:24:29.972428 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:29.972383 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a022a0ca-5e80-43a6-8ee0-69dcf197d1a8-metrics-certs\") pod \"network-metrics-daemon-fsnj5\" (UID: \"a022a0ca-5e80-43a6-8ee0-69dcf197d1a8\") " pod="openshift-multus/network-metrics-daemon-fsnj5" Apr 24 14:24:29.972608 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:29.972534 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:29.972608 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:29.972600 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a022a0ca-5e80-43a6-8ee0-69dcf197d1a8-metrics-certs podName:a022a0ca-5e80-43a6-8ee0-69dcf197d1a8 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:33.972579209 +0000 UTC m=+10.102556752 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a022a0ca-5e80-43a6-8ee0-69dcf197d1a8-metrics-certs") pod "network-metrics-daemon-fsnj5" (UID: "a022a0ca-5e80-43a6-8ee0-69dcf197d1a8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:30.073530 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:30.072831 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mf78d\" (UniqueName: \"kubernetes.io/projected/4ebd1686-821a-4fb0-b091-8a636b80f78e-kube-api-access-mf78d\") pod \"network-check-target-dgzq2\" (UID: \"4ebd1686-821a-4fb0-b091-8a636b80f78e\") " pod="openshift-network-diagnostics/network-check-target-dgzq2" Apr 24 14:24:30.073530 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:30.073079 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:24:30.073530 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:30.073099 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:24:30.073530 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:30.073112 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mf78d for pod openshift-network-diagnostics/network-check-target-dgzq2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:30.073530 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:30.073172 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4ebd1686-821a-4fb0-b091-8a636b80f78e-kube-api-access-mf78d podName:4ebd1686-821a-4fb0-b091-8a636b80f78e nodeName:}" failed. 
No retries permitted until 2026-04-24 14:24:34.073152853 +0000 UTC m=+10.203130415 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-mf78d" (UniqueName: "kubernetes.io/projected/4ebd1686-821a-4fb0-b091-8a636b80f78e-kube-api-access-mf78d") pod "network-check-target-dgzq2" (UID: "4ebd1686-821a-4fb0-b091-8a636b80f78e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:30.457580 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:30.457108 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dgzq2" Apr 24 14:24:30.457580 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:30.457239 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dgzq2" podUID="4ebd1686-821a-4fb0-b091-8a636b80f78e" Apr 24 14:24:31.455937 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:31.455902 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fsnj5" Apr 24 14:24:31.456388 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:31.456067 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fsnj5" podUID="a022a0ca-5e80-43a6-8ee0-69dcf197d1a8" Apr 24 14:24:32.456897 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:32.456365 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dgzq2" Apr 24 14:24:32.456897 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:32.456500 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dgzq2" podUID="4ebd1686-821a-4fb0-b091-8a636b80f78e" Apr 24 14:24:33.359253 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:33.359191 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-95.ec2.internal" podStartSLOduration=8.359171438 podStartE2EDuration="8.359171438s" podCreationTimestamp="2026-04-24 14:24:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:24:29.538587442 +0000 UTC m=+5.668565004" watchObservedRunningTime="2026-04-24 14:24:33.359171438 +0000 UTC m=+9.489149000" Apr 24 14:24:33.359702 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:33.359559 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-trzpj"] Apr 24 14:24:33.363957 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:33.363513 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-trzpj" Apr 24 14:24:33.363957 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:33.363591 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-trzpj" podUID="a06dcc9a-a5d4-44ad-8f76-6943b3f58258" Apr 24 14:24:33.401271 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:33.401235 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a06dcc9a-a5d4-44ad-8f76-6943b3f58258-kubelet-config\") pod \"global-pull-secret-syncer-trzpj\" (UID: \"a06dcc9a-a5d4-44ad-8f76-6943b3f58258\") " pod="kube-system/global-pull-secret-syncer-trzpj" Apr 24 14:24:33.401432 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:33.401320 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a06dcc9a-a5d4-44ad-8f76-6943b3f58258-original-pull-secret\") pod \"global-pull-secret-syncer-trzpj\" (UID: \"a06dcc9a-a5d4-44ad-8f76-6943b3f58258\") " pod="kube-system/global-pull-secret-syncer-trzpj" Apr 24 14:24:33.401432 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:33.401346 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a06dcc9a-a5d4-44ad-8f76-6943b3f58258-dbus\") pod \"global-pull-secret-syncer-trzpj\" (UID: \"a06dcc9a-a5d4-44ad-8f76-6943b3f58258\") " pod="kube-system/global-pull-secret-syncer-trzpj" Apr 24 14:24:33.457519 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:33.456953 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fsnj5" Apr 24 14:24:33.457519 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:33.457132 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fsnj5" podUID="a022a0ca-5e80-43a6-8ee0-69dcf197d1a8" Apr 24 14:24:33.501905 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:33.501874 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a06dcc9a-a5d4-44ad-8f76-6943b3f58258-original-pull-secret\") pod \"global-pull-secret-syncer-trzpj\" (UID: \"a06dcc9a-a5d4-44ad-8f76-6943b3f58258\") " pod="kube-system/global-pull-secret-syncer-trzpj" Apr 24 14:24:33.501905 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:33.501911 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a06dcc9a-a5d4-44ad-8f76-6943b3f58258-dbus\") pod \"global-pull-secret-syncer-trzpj\" (UID: \"a06dcc9a-a5d4-44ad-8f76-6943b3f58258\") " pod="kube-system/global-pull-secret-syncer-trzpj" Apr 24 14:24:33.502144 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:33.501993 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a06dcc9a-a5d4-44ad-8f76-6943b3f58258-kubelet-config\") pod \"global-pull-secret-syncer-trzpj\" (UID: \"a06dcc9a-a5d4-44ad-8f76-6943b3f58258\") " pod="kube-system/global-pull-secret-syncer-trzpj" Apr 24 14:24:33.502144 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:33.502114 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/a06dcc9a-a5d4-44ad-8f76-6943b3f58258-kubelet-config\") pod \"global-pull-secret-syncer-trzpj\" (UID: \"a06dcc9a-a5d4-44ad-8f76-6943b3f58258\") " pod="kube-system/global-pull-secret-syncer-trzpj" Apr 24 14:24:33.502144 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:33.502130 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 14:24:33.502293 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:33.502202 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a06dcc9a-a5d4-44ad-8f76-6943b3f58258-original-pull-secret podName:a06dcc9a-a5d4-44ad-8f76-6943b3f58258 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:34.002182695 +0000 UTC m=+10.132160245 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a06dcc9a-a5d4-44ad-8f76-6943b3f58258-original-pull-secret") pod "global-pull-secret-syncer-trzpj" (UID: "a06dcc9a-a5d4-44ad-8f76-6943b3f58258") : object "kube-system"/"original-pull-secret" not registered Apr 24 14:24:33.502293 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:33.502256 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a06dcc9a-a5d4-44ad-8f76-6943b3f58258-dbus\") pod \"global-pull-secret-syncer-trzpj\" (UID: \"a06dcc9a-a5d4-44ad-8f76-6943b3f58258\") " pod="kube-system/global-pull-secret-syncer-trzpj" Apr 24 14:24:34.005811 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:34.005750 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a022a0ca-5e80-43a6-8ee0-69dcf197d1a8-metrics-certs\") pod \"network-metrics-daemon-fsnj5\" (UID: \"a022a0ca-5e80-43a6-8ee0-69dcf197d1a8\") " pod="openshift-multus/network-metrics-daemon-fsnj5" Apr 24 14:24:34.005811 ip-10-0-137-95 kubenswrapper[2571]: 
I0424 14:24:34.005810 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a06dcc9a-a5d4-44ad-8f76-6943b3f58258-original-pull-secret\") pod \"global-pull-secret-syncer-trzpj\" (UID: \"a06dcc9a-a5d4-44ad-8f76-6943b3f58258\") " pod="kube-system/global-pull-secret-syncer-trzpj" Apr 24 14:24:34.006108 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:34.005902 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:34.006108 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:34.005921 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 14:24:34.006108 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:34.005973 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a022a0ca-5e80-43a6-8ee0-69dcf197d1a8-metrics-certs podName:a022a0ca-5e80-43a6-8ee0-69dcf197d1a8 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:42.005952427 +0000 UTC m=+18.135929979 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a022a0ca-5e80-43a6-8ee0-69dcf197d1a8-metrics-certs") pod "network-metrics-daemon-fsnj5" (UID: "a022a0ca-5e80-43a6-8ee0-69dcf197d1a8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:34.006108 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:34.006008 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a06dcc9a-a5d4-44ad-8f76-6943b3f58258-original-pull-secret podName:a06dcc9a-a5d4-44ad-8f76-6943b3f58258 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:35.005998151 +0000 UTC m=+11.135975691 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a06dcc9a-a5d4-44ad-8f76-6943b3f58258-original-pull-secret") pod "global-pull-secret-syncer-trzpj" (UID: "a06dcc9a-a5d4-44ad-8f76-6943b3f58258") : object "kube-system"/"original-pull-secret" not registered Apr 24 14:24:34.106405 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:34.106313 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mf78d\" (UniqueName: \"kubernetes.io/projected/4ebd1686-821a-4fb0-b091-8a636b80f78e-kube-api-access-mf78d\") pod \"network-check-target-dgzq2\" (UID: \"4ebd1686-821a-4fb0-b091-8a636b80f78e\") " pod="openshift-network-diagnostics/network-check-target-dgzq2" Apr 24 14:24:34.106600 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:34.106476 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:24:34.106600 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:34.106497 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:24:34.106600 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:34.106507 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mf78d for pod openshift-network-diagnostics/network-check-target-dgzq2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:34.106600 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:34.106554 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4ebd1686-821a-4fb0-b091-8a636b80f78e-kube-api-access-mf78d podName:4ebd1686-821a-4fb0-b091-8a636b80f78e nodeName:}" failed. 
No retries permitted until 2026-04-24 14:24:42.106539907 +0000 UTC m=+18.236517451 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-mf78d" (UniqueName: "kubernetes.io/projected/4ebd1686-821a-4fb0-b091-8a636b80f78e-kube-api-access-mf78d") pod "network-check-target-dgzq2" (UID: "4ebd1686-821a-4fb0-b091-8a636b80f78e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:34.457317 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:34.457283 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dgzq2" Apr 24 14:24:34.457497 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:34.457408 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dgzq2" podUID="4ebd1686-821a-4fb0-b091-8a636b80f78e" Apr 24 14:24:35.014185 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:35.014149 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a06dcc9a-a5d4-44ad-8f76-6943b3f58258-original-pull-secret\") pod \"global-pull-secret-syncer-trzpj\" (UID: \"a06dcc9a-a5d4-44ad-8f76-6943b3f58258\") " pod="kube-system/global-pull-secret-syncer-trzpj" Apr 24 14:24:35.014661 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:35.014287 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 14:24:35.014661 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:35.014367 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a06dcc9a-a5d4-44ad-8f76-6943b3f58258-original-pull-secret podName:a06dcc9a-a5d4-44ad-8f76-6943b3f58258 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:37.014346288 +0000 UTC m=+13.144323842 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a06dcc9a-a5d4-44ad-8f76-6943b3f58258-original-pull-secret") pod "global-pull-secret-syncer-trzpj" (UID: "a06dcc9a-a5d4-44ad-8f76-6943b3f58258") : object "kube-system"/"original-pull-secret" not registered Apr 24 14:24:35.456147 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:35.456116 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-trzpj" Apr 24 14:24:35.456147 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:35.456132 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fsnj5"
Apr 24 14:24:35.456383 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:35.456247 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-trzpj" podUID="a06dcc9a-a5d4-44ad-8f76-6943b3f58258"
Apr 24 14:24:35.456444 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:35.456377 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fsnj5" podUID="a022a0ca-5e80-43a6-8ee0-69dcf197d1a8"
Apr 24 14:24:36.456215 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:36.456092 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dgzq2"
Apr 24 14:24:36.456707 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:36.456232 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dgzq2" podUID="4ebd1686-821a-4fb0-b091-8a636b80f78e"
Apr 24 14:24:37.028911 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:37.028869 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a06dcc9a-a5d4-44ad-8f76-6943b3f58258-original-pull-secret\") pod \"global-pull-secret-syncer-trzpj\" (UID: \"a06dcc9a-a5d4-44ad-8f76-6943b3f58258\") " pod="kube-system/global-pull-secret-syncer-trzpj"
Apr 24 14:24:37.029118 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:37.029031 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 14:24:37.029118 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:37.029096 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a06dcc9a-a5d4-44ad-8f76-6943b3f58258-original-pull-secret podName:a06dcc9a-a5d4-44ad-8f76-6943b3f58258 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:41.029079875 +0000 UTC m=+17.159057430 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a06dcc9a-a5d4-44ad-8f76-6943b3f58258-original-pull-secret") pod "global-pull-secret-syncer-trzpj" (UID: "a06dcc9a-a5d4-44ad-8f76-6943b3f58258") : object "kube-system"/"original-pull-secret" not registered
Apr 24 14:24:37.456099 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:37.456060 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-trzpj"
Apr 24 14:24:37.456297 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:37.456060 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fsnj5"
Apr 24 14:24:37.456297 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:37.456195 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-trzpj" podUID="a06dcc9a-a5d4-44ad-8f76-6943b3f58258"
Apr 24 14:24:37.456297 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:37.456281 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fsnj5" podUID="a022a0ca-5e80-43a6-8ee0-69dcf197d1a8"
Apr 24 14:24:38.456893 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:38.456852 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dgzq2"
Apr 24 14:24:38.457345 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:38.457050 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dgzq2" podUID="4ebd1686-821a-4fb0-b091-8a636b80f78e"
Apr 24 14:24:39.456847 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:39.456810 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-trzpj"
Apr 24 14:24:39.457032 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:39.456818 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fsnj5"
Apr 24 14:24:39.457032 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:39.456936 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-trzpj" podUID="a06dcc9a-a5d4-44ad-8f76-6943b3f58258"
Apr 24 14:24:39.457390 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:39.457047 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fsnj5" podUID="a022a0ca-5e80-43a6-8ee0-69dcf197d1a8"
Apr 24 14:24:40.456301 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:40.456265 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dgzq2"
Apr 24 14:24:40.456499 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:40.456427 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dgzq2" podUID="4ebd1686-821a-4fb0-b091-8a636b80f78e"
Apr 24 14:24:41.059265 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:41.059224 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a06dcc9a-a5d4-44ad-8f76-6943b3f58258-original-pull-secret\") pod \"global-pull-secret-syncer-trzpj\" (UID: \"a06dcc9a-a5d4-44ad-8f76-6943b3f58258\") " pod="kube-system/global-pull-secret-syncer-trzpj"
Apr 24 14:24:41.059676 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:41.059339 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 14:24:41.059676 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:41.059399 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a06dcc9a-a5d4-44ad-8f76-6943b3f58258-original-pull-secret podName:a06dcc9a-a5d4-44ad-8f76-6943b3f58258 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:49.059385456 +0000 UTC m=+25.189362994 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a06dcc9a-a5d4-44ad-8f76-6943b3f58258-original-pull-secret") pod "global-pull-secret-syncer-trzpj" (UID: "a06dcc9a-a5d4-44ad-8f76-6943b3f58258") : object "kube-system"/"original-pull-secret" not registered
Apr 24 14:24:41.456145 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:41.456106 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fsnj5"
Apr 24 14:24:41.456308 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:41.456106 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-trzpj"
Apr 24 14:24:41.456308 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:41.456258 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fsnj5" podUID="a022a0ca-5e80-43a6-8ee0-69dcf197d1a8"
Apr 24 14:24:41.456425 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:41.456315 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-trzpj" podUID="a06dcc9a-a5d4-44ad-8f76-6943b3f58258"
Apr 24 14:24:42.066025 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:42.065796 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a022a0ca-5e80-43a6-8ee0-69dcf197d1a8-metrics-certs\") pod \"network-metrics-daemon-fsnj5\" (UID: \"a022a0ca-5e80-43a6-8ee0-69dcf197d1a8\") " pod="openshift-multus/network-metrics-daemon-fsnj5"
Apr 24 14:24:42.066420 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:42.065942 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 14:24:42.066420 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:42.066156 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a022a0ca-5e80-43a6-8ee0-69dcf197d1a8-metrics-certs podName:a022a0ca-5e80-43a6-8ee0-69dcf197d1a8 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:58.066138637 +0000 UTC m=+34.196116191 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a022a0ca-5e80-43a6-8ee0-69dcf197d1a8-metrics-certs") pod "network-metrics-daemon-fsnj5" (UID: "a022a0ca-5e80-43a6-8ee0-69dcf197d1a8") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 14:24:42.167230 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:42.167194 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mf78d\" (UniqueName: \"kubernetes.io/projected/4ebd1686-821a-4fb0-b091-8a636b80f78e-kube-api-access-mf78d\") pod \"network-check-target-dgzq2\" (UID: \"4ebd1686-821a-4fb0-b091-8a636b80f78e\") " pod="openshift-network-diagnostics/network-check-target-dgzq2"
Apr 24 14:24:42.167382 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:42.167310 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 14:24:42.167382 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:42.167323 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 14:24:42.167382 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:42.167332 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mf78d for pod openshift-network-diagnostics/network-check-target-dgzq2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 14:24:42.167382 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:42.167378 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4ebd1686-821a-4fb0-b091-8a636b80f78e-kube-api-access-mf78d podName:4ebd1686-821a-4fb0-b091-8a636b80f78e nodeName:}" failed. No retries permitted until 2026-04-24 14:24:58.167365529 +0000 UTC m=+34.297343081 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-mf78d" (UniqueName: "kubernetes.io/projected/4ebd1686-821a-4fb0-b091-8a636b80f78e-kube-api-access-mf78d") pod "network-check-target-dgzq2" (UID: "4ebd1686-821a-4fb0-b091-8a636b80f78e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 14:24:42.456612 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:42.456576 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dgzq2"
Apr 24 14:24:42.456782 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:42.456674 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dgzq2" podUID="4ebd1686-821a-4fb0-b091-8a636b80f78e"
Apr 24 14:24:43.456055 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:43.456024 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-trzpj"
Apr 24 14:24:43.456467 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:43.456024 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fsnj5"
Apr 24 14:24:43.456467 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:43.456135 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-trzpj" podUID="a06dcc9a-a5d4-44ad-8f76-6943b3f58258"
Apr 24 14:24:43.456467 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:43.456201 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fsnj5" podUID="a022a0ca-5e80-43a6-8ee0-69dcf197d1a8"
Apr 24 14:24:44.458365 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:44.458344 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dgzq2"
Apr 24 14:24:44.458674 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:44.458453 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dgzq2" podUID="4ebd1686-821a-4fb0-b091-8a636b80f78e"
Apr 24 14:24:44.548493 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:44.548312 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-42l4z" event={"ID":"6c91c545-ee17-43ad-8a08-42be9b2cda48","Type":"ContainerStarted","Data":"0b4874919dde9deb2be9f475fe181dad1598d22cbf35c4d9aed9a0e821eda1c8"}
Apr 24 14:24:44.549677 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:44.549650 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hgntn" event={"ID":"87bcc27f-4b5f-48a3-9ae3-f93cb520eea0","Type":"ContainerStarted","Data":"edcf3ea32220cbf4a6f66b61465f19dfe6392a6a2eeac0086a0f3f533351d1ee"}
Apr 24 14:24:44.550822 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:44.550795 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bmpbb" event={"ID":"b97fcfbf-fa5a-4b45-8446-f25172d545bb","Type":"ContainerStarted","Data":"fc35a659e452294e3f377a70ff4fdf78097e70ff4d70c861b5739b9e5314ddac"}
Apr 24 14:24:44.552066 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:44.552044 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gwdlv" event={"ID":"5b200c2b-497d-4b24-9fcb-1ed9a2b007c1","Type":"ContainerStarted","Data":"1f6c7bcc6ff48dbcdfb8e1e083545d1ad4d0ca0bb4b58ce5eb4d463c9fc6b9cf"}
Apr 24 14:24:44.553784 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:44.553762 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7m9zt_a7b9926d-4f53-4532-8669-16af4fc30cfd/ovn-acl-logging/0.log"
Apr 24 14:24:44.554073 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:44.554053 2571 generic.go:358] "Generic (PLEG): container finished" podID="a7b9926d-4f53-4532-8669-16af4fc30cfd" containerID="9cbe8597b41376b0af489fd46cb6644b71cbd25ef2f52237e323431cea41cfbe" exitCode=1
Apr 24 14:24:44.554135 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:44.554109 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" event={"ID":"a7b9926d-4f53-4532-8669-16af4fc30cfd","Type":"ContainerDied","Data":"9cbe8597b41376b0af489fd46cb6644b71cbd25ef2f52237e323431cea41cfbe"}
Apr 24 14:24:44.554180 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:44.554134 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" event={"ID":"a7b9926d-4f53-4532-8669-16af4fc30cfd","Type":"ContainerStarted","Data":"a615080a3bdd1ff37a0b6702e38bd1173ae8ea94f21c40a970d16466453ef5a1"}
Apr 24 14:24:44.555274 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:44.555250 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4bhjw" event={"ID":"0b999a46-88fd-4d1a-a6e6-11c90708c270","Type":"ContainerStarted","Data":"895a01598620e8d456997dc6c32ae1cb99bc5ee5071d81f824f6380ec3cf9deb"}
Apr 24 14:24:44.556457 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:44.556431 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-v9wvt" event={"ID":"15cb0256-87b8-40a0-812d-7931f453c264","Type":"ContainerStarted","Data":"a4d8f41366c197395f24e9bc02b14a2ef69a35783a0cfa84367bf50e4faad37a"}
Apr 24 14:24:44.557569 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:44.557549 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-fhn6c" event={"ID":"9ce961e4-1141-4ebb-83ff-6ef501b710dd","Type":"ContainerStarted","Data":"63966514a681f81ec3e8716c44790eb4b948aaf4d4f8b55b3114a13e84806119"}
Apr 24 14:24:44.578120 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:44.578076 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-bmpbb" podStartSLOduration=2.6454733299999997 podStartE2EDuration="19.578064617s" podCreationTimestamp="2026-04-24 14:24:25 +0000 UTC" firstStartedPulling="2026-04-24 14:24:27.114766343 +0000 UTC m=+3.244743892" lastFinishedPulling="2026-04-24 14:24:44.047357635 +0000 UTC m=+20.177335179" observedRunningTime="2026-04-24 14:24:44.578049658 +0000 UTC m=+20.708027218" watchObservedRunningTime="2026-04-24 14:24:44.578064617 +0000 UTC m=+20.708042178"
Apr 24 14:24:44.591309 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:44.591252 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-v9wvt" podStartSLOduration=11.697709341 podStartE2EDuration="20.591234402s" podCreationTimestamp="2026-04-24 14:24:24 +0000 UTC" firstStartedPulling="2026-04-24 14:24:27.085520317 +0000 UTC m=+3.215497870" lastFinishedPulling="2026-04-24 14:24:35.979045379 +0000 UTC m=+12.109022931" observedRunningTime="2026-04-24 14:24:44.590586563 +0000 UTC m=+20.720564123" watchObservedRunningTime="2026-04-24 14:24:44.591234402 +0000 UTC m=+20.721211961"
Apr 24 14:24:44.604875 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:44.604836 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-fhn6c" podStartSLOduration=3.670106569 podStartE2EDuration="20.604823429s" podCreationTimestamp="2026-04-24 14:24:24 +0000 UTC" firstStartedPulling="2026-04-24 14:24:27.114337478 +0000 UTC m=+3.244315020" lastFinishedPulling="2026-04-24 14:24:44.049054328 +0000 UTC m=+20.179031880" observedRunningTime="2026-04-24 14:24:44.604511971 +0000 UTC m=+20.734489531" watchObservedRunningTime="2026-04-24 14:24:44.604823429 +0000 UTC m=+20.734800971"
Apr 24 14:24:44.617056 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:44.617016 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-hgntn" podStartSLOduration=3.681641379 podStartE2EDuration="20.616980039s" podCreationTimestamp="2026-04-24 14:24:24 +0000 UTC" firstStartedPulling="2026-04-24 14:24:27.083044829 +0000 UTC m=+3.213022373" lastFinishedPulling="2026-04-24 14:24:44.018383481 +0000 UTC m=+20.148361033" observedRunningTime="2026-04-24 14:24:44.616653019 +0000 UTC m=+20.746630575" watchObservedRunningTime="2026-04-24 14:24:44.616980039 +0000 UTC m=+20.746957598"
Apr 24 14:24:44.637241 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:44.636499 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-gwdlv" podStartSLOduration=3.692192153 podStartE2EDuration="20.636480135s" podCreationTimestamp="2026-04-24 14:24:24 +0000 UTC" firstStartedPulling="2026-04-24 14:24:27.114220619 +0000 UTC m=+3.244198172" lastFinishedPulling="2026-04-24 14:24:44.058508612 +0000 UTC m=+20.188486154" observedRunningTime="2026-04-24 14:24:44.632864331 +0000 UTC m=+20.762842099" watchObservedRunningTime="2026-04-24 14:24:44.636480135 +0000 UTC m=+20.766457701"
Apr 24 14:24:45.285865 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:45.285538 2571 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 24 14:24:45.397438 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:45.397356 2571 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T14:24:45.285769826Z","UUID":"2b23d838-13ae-4e61-b22d-ae3ac44b7e7b","Handler":null,"Name":"","Endpoint":""}
Apr 24 14:24:45.398827 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:45.398808 2571 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 24 14:24:45.398916 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:45.398832 2571 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 24 14:24:45.456091 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:45.456062 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fsnj5"
Apr 24 14:24:45.456201 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:45.456074 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-trzpj"
Apr 24 14:24:45.456201 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:45.456163 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fsnj5" podUID="a022a0ca-5e80-43a6-8ee0-69dcf197d1a8"
Apr 24 14:24:45.456288 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:45.456238 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-trzpj" podUID="a06dcc9a-a5d4-44ad-8f76-6943b3f58258"
Apr 24 14:24:45.560925 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:45.560895 2571 generic.go:358] "Generic (PLEG): container finished" podID="6c91c545-ee17-43ad-8a08-42be9b2cda48" containerID="0b4874919dde9deb2be9f475fe181dad1598d22cbf35c4d9aed9a0e821eda1c8" exitCode=0
Apr 24 14:24:45.561635 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:45.561005 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-42l4z" event={"ID":"6c91c545-ee17-43ad-8a08-42be9b2cda48","Type":"ContainerDied","Data":"0b4874919dde9deb2be9f475fe181dad1598d22cbf35c4d9aed9a0e821eda1c8"}
Apr 24 14:24:45.562554 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:45.562455 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-j9xv9" event={"ID":"865ca2f7-3380-490c-b40d-e9c4fb7c799a","Type":"ContainerStarted","Data":"2aed4ee6e9cbcbfb3ee2142a34d27ca64df35f990aaea3a4904fec2cef7f4287"}
Apr 24 14:24:45.564867 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:45.564840 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7m9zt_a7b9926d-4f53-4532-8669-16af4fc30cfd/ovn-acl-logging/0.log"
Apr 24 14:24:45.565204 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:45.565182 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" event={"ID":"a7b9926d-4f53-4532-8669-16af4fc30cfd","Type":"ContainerStarted","Data":"1375dc025523bb3321570aae037ec4305d1604b3a08e04745b26500ccd938ab6"}
Apr 24 14:24:45.565288 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:45.565213 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" event={"ID":"a7b9926d-4f53-4532-8669-16af4fc30cfd","Type":"ContainerStarted","Data":"22670647db56a4a20f9ce4e7346d5e09ba5907c2736ba9b1efcf5d53a9c92cdd"}
Apr 24 14:24:45.565288 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:45.565225 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" event={"ID":"a7b9926d-4f53-4532-8669-16af4fc30cfd","Type":"ContainerStarted","Data":"ae95ab8b19875ea93ed7f641e578e70385d0a6575261a6d03c57dd34cec35c75"}
Apr 24 14:24:45.565288 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:45.565237 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" event={"ID":"a7b9926d-4f53-4532-8669-16af4fc30cfd","Type":"ContainerStarted","Data":"cebc300ab4b3b4ea0737e0bee3da95aee0eb02c6d991321fbf8b974e5caed179"}
Apr 24 14:24:45.566794 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:45.566771 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4bhjw" event={"ID":"0b999a46-88fd-4d1a-a6e6-11c90708c270","Type":"ContainerStarted","Data":"d2d4a86769a100a6eb461d6c6062f1473dd8f6d1edf30761e183e8454d280c19"}
Apr 24 14:24:45.957389 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:45.957358 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-v9wvt"
Apr 24 14:24:45.957922 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:45.957905 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-v9wvt"
Apr 24 14:24:45.970279 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:45.970237 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-j9xv9" podStartSLOduration=5.034851509 podStartE2EDuration="21.970225108s" podCreationTimestamp="2026-04-24 14:24:24 +0000 UTC" firstStartedPulling="2026-04-24 14:24:27.110128242 +0000 UTC m=+3.240105786" lastFinishedPulling="2026-04-24 14:24:44.045501834 +0000 UTC m=+20.175479385" observedRunningTime="2026-04-24 14:24:45.591113561 +0000 UTC m=+21.721091134" watchObservedRunningTime="2026-04-24 14:24:45.970225108 +0000 UTC m=+22.100202667"
Apr 24 14:24:46.459417 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:46.459388 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dgzq2"
Apr 24 14:24:46.459573 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:46.459497 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dgzq2" podUID="4ebd1686-821a-4fb0-b091-8a636b80f78e"
Apr 24 14:24:46.569174 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:46.569144 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-v9wvt"
Apr 24 14:24:46.569616 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:46.569580 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-v9wvt"
Apr 24 14:24:47.456512 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:47.456430 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-trzpj"
Apr 24 14:24:47.456665 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:47.456434 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fsnj5"
Apr 24 14:24:47.456665 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:47.456546 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-trzpj" podUID="a06dcc9a-a5d4-44ad-8f76-6943b3f58258"
Apr 24 14:24:47.456665 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:47.456649 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fsnj5" podUID="a022a0ca-5e80-43a6-8ee0-69dcf197d1a8"
Apr 24 14:24:47.573835 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:47.573812 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7m9zt_a7b9926d-4f53-4532-8669-16af4fc30cfd/ovn-acl-logging/0.log"
Apr 24 14:24:47.574281 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:47.574197 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" event={"ID":"a7b9926d-4f53-4532-8669-16af4fc30cfd","Type":"ContainerStarted","Data":"23054e2ec53f1847d28b1a596aa2d2a02ad51bcbfe40ff463a4b632cf536239b"}
Apr 24 14:24:47.576034 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:47.575998 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4bhjw" event={"ID":"0b999a46-88fd-4d1a-a6e6-11c90708c270","Type":"ContainerStarted","Data":"fcc4356fe50ad43c92bc5c3d24bf2668c3e45b157fd477c503697960883e937b"}
Apr 24 14:24:47.601234 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:47.601186 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4bhjw" podStartSLOduration=4.182159149 podStartE2EDuration="23.601173873s" podCreationTimestamp="2026-04-24 14:24:24 +0000 UTC" firstStartedPulling="2026-04-24 14:24:27.088554647 +0000 UTC m=+3.218532187" lastFinishedPulling="2026-04-24 14:24:46.507569358 +0000 UTC m=+22.637546911" observedRunningTime="2026-04-24 14:24:47.601109932 +0000 UTC m=+23.731087492" watchObservedRunningTime="2026-04-24 14:24:47.601173873 +0000 UTC m=+23.731151432"
Apr 24 14:24:48.459552 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:48.459524 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dgzq2"
Apr 24 14:24:48.459718 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:48.459528 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fsnj5"
Apr 24 14:24:48.459718 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:48.459637 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dgzq2" podUID="4ebd1686-821a-4fb0-b091-8a636b80f78e"
Apr 24 14:24:48.459834 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:48.459745 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fsnj5" podUID="a022a0ca-5e80-43a6-8ee0-69dcf197d1a8"
Apr 24 14:24:49.121347 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:49.121309 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a06dcc9a-a5d4-44ad-8f76-6943b3f58258-original-pull-secret\") pod \"global-pull-secret-syncer-trzpj\" (UID: \"a06dcc9a-a5d4-44ad-8f76-6943b3f58258\") " pod="kube-system/global-pull-secret-syncer-trzpj"
Apr 24 14:24:49.121729 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:49.121455 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 14:24:49.121729 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:49.121539 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a06dcc9a-a5d4-44ad-8f76-6943b3f58258-original-pull-secret podName:a06dcc9a-a5d4-44ad-8f76-6943b3f58258 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:05.121518677 +0000 UTC m=+41.251496234 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a06dcc9a-a5d4-44ad-8f76-6943b3f58258-original-pull-secret") pod "global-pull-secret-syncer-trzpj" (UID: "a06dcc9a-a5d4-44ad-8f76-6943b3f58258") : object "kube-system"/"original-pull-secret" not registered
Apr 24 14:24:49.456962 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:49.456891 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-trzpj"
Apr 24 14:24:49.457108 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:49.457011 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-trzpj" podUID="a06dcc9a-a5d4-44ad-8f76-6943b3f58258"
Apr 24 14:24:50.458818 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:50.458652 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fsnj5"
Apr 24 14:24:50.459509 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:50.458652 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dgzq2"
Apr 24 14:24:50.459509 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:50.458906 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fsnj5" podUID="a022a0ca-5e80-43a6-8ee0-69dcf197d1a8"
Apr 24 14:24:50.459509 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:50.458938 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dgzq2" podUID="4ebd1686-821a-4fb0-b091-8a636b80f78e"
Apr 24 14:24:50.584073 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:50.584047 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7m9zt_a7b9926d-4f53-4532-8669-16af4fc30cfd/ovn-acl-logging/0.log"
Apr 24 14:24:50.584393 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:50.584371 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" event={"ID":"a7b9926d-4f53-4532-8669-16af4fc30cfd","Type":"ContainerStarted","Data":"d78d877e8993d98e8a2049748506adc35cb1cc876024fe71c158f60a54c657e8"}
Apr 24 14:24:50.584776 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:50.584706 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt"
Apr 24 14:24:50.584884 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:50.584847 2571 scope.go:117] "RemoveContainer" containerID="9cbe8597b41376b0af489fd46cb6644b71cbd25ef2f52237e323431cea41cfbe"
Apr 24 14:24:50.586077 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:50.586049 2571 generic.go:358] "Generic (PLEG): container finished" podID="6c91c545-ee17-43ad-8a08-42be9b2cda48" containerID="c0d5971472dfb7201ffa1f8cc2e440873fd1fa8fb8e430eae0473b67747dbab1" exitCode=0
Apr 24 14:24:50.586167 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:50.586084 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-42l4z" event={"ID":"6c91c545-ee17-43ad-8a08-42be9b2cda48","Type":"ContainerDied","Data":"c0d5971472dfb7201ffa1f8cc2e440873fd1fa8fb8e430eae0473b67747dbab1"}
Apr 24 14:24:50.600329 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:50.600310 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt"
Apr 24 14:24:51.456228 ip-10-0-137-95 kubenswrapper[2571]: I0424
14:24:51.456191 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-trzpj" Apr 24 14:24:51.456382 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:51.456292 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-trzpj" podUID="a06dcc9a-a5d4-44ad-8f76-6943b3f58258" Apr 24 14:24:51.546060 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:51.545970 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-trzpj"] Apr 24 14:24:51.548969 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:51.548939 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fsnj5"] Apr 24 14:24:51.549129 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:51.549116 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fsnj5" Apr 24 14:24:51.549255 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:51.549233 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fsnj5" podUID="a022a0ca-5e80-43a6-8ee0-69dcf197d1a8" Apr 24 14:24:51.549703 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:51.549681 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-dgzq2"] Apr 24 14:24:51.549801 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:51.549794 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dgzq2" Apr 24 14:24:51.549889 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:51.549871 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dgzq2" podUID="4ebd1686-821a-4fb0-b091-8a636b80f78e" Apr 24 14:24:51.590153 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:51.590119 2571 generic.go:358] "Generic (PLEG): container finished" podID="6c91c545-ee17-43ad-8a08-42be9b2cda48" containerID="aeb7e86e370cee3d25304b19746b440131a59150460a394c083a5d8b4c6a860e" exitCode=0 Apr 24 14:24:51.590312 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:51.590208 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-42l4z" event={"ID":"6c91c545-ee17-43ad-8a08-42be9b2cda48","Type":"ContainerDied","Data":"aeb7e86e370cee3d25304b19746b440131a59150460a394c083a5d8b4c6a860e"} Apr 24 14:24:51.594014 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:51.593994 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7m9zt_a7b9926d-4f53-4532-8669-16af4fc30cfd/ovn-acl-logging/0.log" Apr 24 14:24:51.594307 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:51.594287 2571 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" event={"ID":"a7b9926d-4f53-4532-8669-16af4fc30cfd","Type":"ContainerStarted","Data":"96a18be5b501e3e6b60f47d911106545d0444f6047713c93c96d82dbe03d28d6"} Apr 24 14:24:51.594363 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:51.594308 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-trzpj" Apr 24 14:24:51.594409 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:51.594392 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-trzpj" podUID="a06dcc9a-a5d4-44ad-8f76-6943b3f58258" Apr 24 14:24:51.594734 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:51.594712 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:51.594847 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:51.594741 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:51.608442 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:51.608424 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:24:51.652194 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:51.652155 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" podStartSLOduration=10.658532367 podStartE2EDuration="27.652141316s" podCreationTimestamp="2026-04-24 14:24:24 +0000 UTC" firstStartedPulling="2026-04-24 14:24:27.110047241 +0000 UTC m=+3.240024782" lastFinishedPulling="2026-04-24 14:24:44.103656194 +0000 UTC 
m=+20.233633731" observedRunningTime="2026-04-24 14:24:51.651633399 +0000 UTC m=+27.781610969" watchObservedRunningTime="2026-04-24 14:24:51.652141316 +0000 UTC m=+27.782118873" Apr 24 14:24:52.598135 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:52.597828 2571 generic.go:358] "Generic (PLEG): container finished" podID="6c91c545-ee17-43ad-8a08-42be9b2cda48" containerID="2d913feafceaf72c658186be34d4834120ccf0f1602f17c3cbe0547a73d9685d" exitCode=0 Apr 24 14:24:52.598490 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:52.597910 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-42l4z" event={"ID":"6c91c545-ee17-43ad-8a08-42be9b2cda48","Type":"ContainerDied","Data":"2d913feafceaf72c658186be34d4834120ccf0f1602f17c3cbe0547a73d9685d"} Apr 24 14:24:53.456787 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:53.456757 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fsnj5" Apr 24 14:24:53.456948 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:53.456758 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dgzq2" Apr 24 14:24:53.456948 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:53.456878 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fsnj5" podUID="a022a0ca-5e80-43a6-8ee0-69dcf197d1a8" Apr 24 14:24:53.457079 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:53.456956 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dgzq2" podUID="4ebd1686-821a-4fb0-b091-8a636b80f78e" Apr 24 14:24:53.457079 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:53.456758 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-trzpj" Apr 24 14:24:53.457079 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:53.457054 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-trzpj" podUID="a06dcc9a-a5d4-44ad-8f76-6943b3f58258" Apr 24 14:24:55.456625 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:55.456595 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dgzq2" Apr 24 14:24:55.457143 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:55.456669 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-trzpj" Apr 24 14:24:55.457143 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:55.456805 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-trzpj" podUID="a06dcc9a-a5d4-44ad-8f76-6943b3f58258" Apr 24 14:24:55.457143 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:55.456858 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fsnj5" Apr 24 14:24:55.457143 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:55.456942 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fsnj5" podUID="a022a0ca-5e80-43a6-8ee0-69dcf197d1a8" Apr 24 14:24:55.457143 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:55.457048 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dgzq2" podUID="4ebd1686-821a-4fb0-b091-8a636b80f78e" Apr 24 14:24:57.456433 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:57.456392 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fsnj5" Apr 24 14:24:57.456946 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:57.456392 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dgzq2" Apr 24 14:24:57.456946 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:57.456533 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fsnj5" podUID="a022a0ca-5e80-43a6-8ee0-69dcf197d1a8" Apr 24 14:24:57.456946 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:57.456392 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-trzpj" Apr 24 14:24:57.456946 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:57.456611 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dgzq2" podUID="4ebd1686-821a-4fb0-b091-8a636b80f78e" Apr 24 14:24:57.456946 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:57.456674 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-trzpj" podUID="a06dcc9a-a5d4-44ad-8f76-6943b3f58258" Apr 24 14:24:58.085950 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.085880 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a022a0ca-5e80-43a6-8ee0-69dcf197d1a8-metrics-certs\") pod \"network-metrics-daemon-fsnj5\" (UID: \"a022a0ca-5e80-43a6-8ee0-69dcf197d1a8\") " pod="openshift-multus/network-metrics-daemon-fsnj5" Apr 24 14:24:58.086097 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:58.086016 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:58.086097 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:58.086066 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a022a0ca-5e80-43a6-8ee0-69dcf197d1a8-metrics-certs podName:a022a0ca-5e80-43a6-8ee0-69dcf197d1a8 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:30.086052125 +0000 UTC m=+66.216029664 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a022a0ca-5e80-43a6-8ee0-69dcf197d1a8-metrics-certs") pod "network-metrics-daemon-fsnj5" (UID: "a022a0ca-5e80-43a6-8ee0-69dcf197d1a8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:58.171946 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.171920 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-95.ec2.internal" event="NodeReady" Apr 24 14:24:58.172088 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.172059 2571 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 14:24:58.186935 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.186916 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mf78d\" (UniqueName: \"kubernetes.io/projected/4ebd1686-821a-4fb0-b091-8a636b80f78e-kube-api-access-mf78d\") pod \"network-check-target-dgzq2\" (UID: \"4ebd1686-821a-4fb0-b091-8a636b80f78e\") " pod="openshift-network-diagnostics/network-check-target-dgzq2" Apr 24 14:24:58.187095 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:58.187076 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:24:58.187136 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:58.187101 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:24:58.187136 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:58.187110 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mf78d for pod openshift-network-diagnostics/network-check-target-dgzq2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Apr 24 14:24:58.187203 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:58.187159 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4ebd1686-821a-4fb0-b091-8a636b80f78e-kube-api-access-mf78d podName:4ebd1686-821a-4fb0-b091-8a636b80f78e nodeName:}" failed. No retries permitted until 2026-04-24 14:25:30.187141892 +0000 UTC m=+66.317119447 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-mf78d" (UniqueName: "kubernetes.io/projected/4ebd1686-821a-4fb0-b091-8a636b80f78e-kube-api-access-mf78d") pod "network-check-target-dgzq2" (UID: "4ebd1686-821a-4fb0-b091-8a636b80f78e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:58.205219 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.205193 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-55fdbcc56d-lrqtj"] Apr 24 14:24:58.232045 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.232021 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-55fdbcc56d-lrqtj"] Apr 24 14:24:58.232045 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.232046 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-752tj"] Apr 24 14:24:58.232182 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.232155 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" Apr 24 14:24:58.234542 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.234508 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 14:24:58.234738 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.234704 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 14:24:58.234895 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.234870 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 14:24:58.235018 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.234911 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-tpm4v\"" Apr 24 14:24:58.240624 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.240602 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 14:24:58.256498 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.256478 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-k2t2r"] Apr 24 14:24:58.256657 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.256641 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-752tj" Apr 24 14:24:58.258615 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.258598 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 14:24:58.258615 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.258610 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7bxm2\"" Apr 24 14:24:58.258746 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.258598 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 14:24:58.281111 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.281092 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-752tj"] Apr 24 14:24:58.281111 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.281111 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-k2t2r"] Apr 24 14:24:58.281220 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.281188 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-k2t2r" Apr 24 14:24:58.283194 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.283175 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xfc9t\"" Apr 24 14:24:58.283194 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.283188 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 14:24:58.283348 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.283175 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 14:24:58.283348 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.283266 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 14:24:58.387584 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.387550 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zckcc\" (UniqueName: \"kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-kube-api-access-zckcc\") pod \"image-registry-55fdbcc56d-lrqtj\" (UID: \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" Apr 24 14:24:58.387584 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.387585 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjqhh\" (UniqueName: \"kubernetes.io/projected/5b9edb70-aaea-4a5d-bd77-289fb7865065-kube-api-access-jjqhh\") pod \"dns-default-752tj\" (UID: \"5b9edb70-aaea-4a5d-bd77-289fb7865065\") " pod="openshift-dns/dns-default-752tj" Apr 24 14:24:58.387755 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.387611 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-registry-tls\") pod \"image-registry-55fdbcc56d-lrqtj\" (UID: \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" Apr 24 14:24:58.387755 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.387656 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-trusted-ca\") pod \"image-registry-55fdbcc56d-lrqtj\" (UID: \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" Apr 24 14:24:58.387755 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.387678 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds6v9\" (UniqueName: \"kubernetes.io/projected/21ba7795-411f-46a3-93c5-fedef51a27ea-kube-api-access-ds6v9\") pod \"ingress-canary-k2t2r\" (UID: \"21ba7795-411f-46a3-93c5-fedef51a27ea\") " pod="openshift-ingress-canary/ingress-canary-k2t2r" Apr 24 14:24:58.387755 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.387718 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b9edb70-aaea-4a5d-bd77-289fb7865065-config-volume\") pod \"dns-default-752tj\" (UID: \"5b9edb70-aaea-4a5d-bd77-289fb7865065\") " pod="openshift-dns/dns-default-752tj" Apr 24 14:24:58.387874 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.387775 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-bound-sa-token\") pod \"image-registry-55fdbcc56d-lrqtj\" (UID: \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" 
Apr 24 14:24:58.387874 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.387807 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5b9edb70-aaea-4a5d-bd77-289fb7865065-tmp-dir\") pod \"dns-default-752tj\" (UID: \"5b9edb70-aaea-4a5d-bd77-289fb7865065\") " pod="openshift-dns/dns-default-752tj" Apr 24 14:24:58.387874 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.387835 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-image-registry-private-configuration\") pod \"image-registry-55fdbcc56d-lrqtj\" (UID: \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" Apr 24 14:24:58.387874 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.387852 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-installation-pull-secrets\") pod \"image-registry-55fdbcc56d-lrqtj\" (UID: \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" Apr 24 14:24:58.388082 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.387878 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-registry-certificates\") pod \"image-registry-55fdbcc56d-lrqtj\" (UID: \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" Apr 24 14:24:58.388082 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.387901 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/21ba7795-411f-46a3-93c5-fedef51a27ea-cert\") pod \"ingress-canary-k2t2r\" (UID: \"21ba7795-411f-46a3-93c5-fedef51a27ea\") " pod="openshift-ingress-canary/ingress-canary-k2t2r" Apr 24 14:24:58.388082 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.387972 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-ca-trust-extracted\") pod \"image-registry-55fdbcc56d-lrqtj\" (UID: \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" Apr 24 14:24:58.388082 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.388033 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b9edb70-aaea-4a5d-bd77-289fb7865065-metrics-tls\") pod \"dns-default-752tj\" (UID: \"5b9edb70-aaea-4a5d-bd77-289fb7865065\") " pod="openshift-dns/dns-default-752tj" Apr 24 14:24:58.489316 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.489278 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-ca-trust-extracted\") pod \"image-registry-55fdbcc56d-lrqtj\" (UID: \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" Apr 24 14:24:58.489721 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.489352 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b9edb70-aaea-4a5d-bd77-289fb7865065-metrics-tls\") pod \"dns-default-752tj\" (UID: \"5b9edb70-aaea-4a5d-bd77-289fb7865065\") " pod="openshift-dns/dns-default-752tj" Apr 24 14:24:58.489721 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.489427 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zckcc\" (UniqueName: \"kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-kube-api-access-zckcc\") pod \"image-registry-55fdbcc56d-lrqtj\" (UID: \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" Apr 24 14:24:58.489721 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.489453 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jjqhh\" (UniqueName: \"kubernetes.io/projected/5b9edb70-aaea-4a5d-bd77-289fb7865065-kube-api-access-jjqhh\") pod \"dns-default-752tj\" (UID: \"5b9edb70-aaea-4a5d-bd77-289fb7865065\") " pod="openshift-dns/dns-default-752tj" Apr 24 14:24:58.489721 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.489481 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-registry-tls\") pod \"image-registry-55fdbcc56d-lrqtj\" (UID: \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" Apr 24 14:24:58.489721 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.489507 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-trusted-ca\") pod \"image-registry-55fdbcc56d-lrqtj\" (UID: \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" Apr 24 14:24:58.489721 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.489532 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ds6v9\" (UniqueName: \"kubernetes.io/projected/21ba7795-411f-46a3-93c5-fedef51a27ea-kube-api-access-ds6v9\") pod \"ingress-canary-k2t2r\" (UID: \"21ba7795-411f-46a3-93c5-fedef51a27ea\") " pod="openshift-ingress-canary/ingress-canary-k2t2r" 
Apr 24 14:24:58.489721 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.489559 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b9edb70-aaea-4a5d-bd77-289fb7865065-config-volume\") pod \"dns-default-752tj\" (UID: \"5b9edb70-aaea-4a5d-bd77-289fb7865065\") " pod="openshift-dns/dns-default-752tj" Apr 24 14:24:58.489721 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.489587 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-bound-sa-token\") pod \"image-registry-55fdbcc56d-lrqtj\" (UID: \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" Apr 24 14:24:58.489721 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.489604 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5b9edb70-aaea-4a5d-bd77-289fb7865065-tmp-dir\") pod \"dns-default-752tj\" (UID: \"5b9edb70-aaea-4a5d-bd77-289fb7865065\") " pod="openshift-dns/dns-default-752tj" Apr 24 14:24:58.489721 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.489631 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-image-registry-private-configuration\") pod \"image-registry-55fdbcc56d-lrqtj\" (UID: \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" Apr 24 14:24:58.489721 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.489647 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-installation-pull-secrets\") pod \"image-registry-55fdbcc56d-lrqtj\" (UID: 
\"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" Apr 24 14:24:58.489721 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.489668 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-registry-certificates\") pod \"image-registry-55fdbcc56d-lrqtj\" (UID: \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" Apr 24 14:24:58.489721 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.489689 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21ba7795-411f-46a3-93c5-fedef51a27ea-cert\") pod \"ingress-canary-k2t2r\" (UID: \"21ba7795-411f-46a3-93c5-fedef51a27ea\") " pod="openshift-ingress-canary/ingress-canary-k2t2r" Apr 24 14:24:58.489721 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.489698 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-ca-trust-extracted\") pod \"image-registry-55fdbcc56d-lrqtj\" (UID: \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" Apr 24 14:24:58.490405 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:58.489822 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:24:58.490405 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:58.489876 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21ba7795-411f-46a3-93c5-fedef51a27ea-cert podName:21ba7795-411f-46a3-93c5-fedef51a27ea nodeName:}" failed. No retries permitted until 2026-04-24 14:24:58.989857632 +0000 UTC m=+35.119835177 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/21ba7795-411f-46a3-93c5-fedef51a27ea-cert") pod "ingress-canary-k2t2r" (UID: "21ba7795-411f-46a3-93c5-fedef51a27ea") : secret "canary-serving-cert" not found Apr 24 14:24:58.490405 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:58.489822 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:24:58.490405 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:58.490065 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b9edb70-aaea-4a5d-bd77-289fb7865065-metrics-tls podName:5b9edb70-aaea-4a5d-bd77-289fb7865065 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:58.990050498 +0000 UTC m=+35.120028036 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5b9edb70-aaea-4a5d-bd77-289fb7865065-metrics-tls") pod "dns-default-752tj" (UID: "5b9edb70-aaea-4a5d-bd77-289fb7865065") : secret "dns-default-metrics-tls" not found Apr 24 14:24:58.490405 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:58.490019 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 14:24:58.490405 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:58.490086 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55fdbcc56d-lrqtj: secret "image-registry-tls" not found Apr 24 14:24:58.490405 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:58.490115 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-registry-tls podName:0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:58.990105638 +0000 UTC m=+35.120083182 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-registry-tls") pod "image-registry-55fdbcc56d-lrqtj" (UID: "0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00") : secret "image-registry-tls" not found Apr 24 14:24:58.490801 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.490416 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5b9edb70-aaea-4a5d-bd77-289fb7865065-tmp-dir\") pod \"dns-default-752tj\" (UID: \"5b9edb70-aaea-4a5d-bd77-289fb7865065\") " pod="openshift-dns/dns-default-752tj" Apr 24 14:24:58.490801 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.490616 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b9edb70-aaea-4a5d-bd77-289fb7865065-config-volume\") pod \"dns-default-752tj\" (UID: \"5b9edb70-aaea-4a5d-bd77-289fb7865065\") " pod="openshift-dns/dns-default-752tj" Apr 24 14:24:58.490801 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.490787 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-registry-certificates\") pod \"image-registry-55fdbcc56d-lrqtj\" (UID: \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" Apr 24 14:24:58.490949 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.490895 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-trusted-ca\") pod \"image-registry-55fdbcc56d-lrqtj\" (UID: \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" Apr 24 14:24:58.494383 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.494355 2571 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-image-registry-private-configuration\") pod \"image-registry-55fdbcc56d-lrqtj\" (UID: \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" Apr 24 14:24:58.494383 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.494361 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-installation-pull-secrets\") pod \"image-registry-55fdbcc56d-lrqtj\" (UID: \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" Apr 24 14:24:58.502244 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.502198 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjqhh\" (UniqueName: \"kubernetes.io/projected/5b9edb70-aaea-4a5d-bd77-289fb7865065-kube-api-access-jjqhh\") pod \"dns-default-752tj\" (UID: \"5b9edb70-aaea-4a5d-bd77-289fb7865065\") " pod="openshift-dns/dns-default-752tj" Apr 24 14:24:58.502411 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.502391 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds6v9\" (UniqueName: \"kubernetes.io/projected/21ba7795-411f-46a3-93c5-fedef51a27ea-kube-api-access-ds6v9\") pod \"ingress-canary-k2t2r\" (UID: \"21ba7795-411f-46a3-93c5-fedef51a27ea\") " pod="openshift-ingress-canary/ingress-canary-k2t2r" Apr 24 14:24:58.513811 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.513788 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zckcc\" (UniqueName: \"kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-kube-api-access-zckcc\") pod \"image-registry-55fdbcc56d-lrqtj\" (UID: \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " 
pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" Apr 24 14:24:58.513904 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.513823 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-bound-sa-token\") pod \"image-registry-55fdbcc56d-lrqtj\" (UID: \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" Apr 24 14:24:58.611417 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.611389 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-42l4z" event={"ID":"6c91c545-ee17-43ad-8a08-42be9b2cda48","Type":"ContainerStarted","Data":"fecf6c0fcfd621b3887961256ddcaba8b884d61968cd1aa31f8d499927aa0ad8"} Apr 24 14:24:58.994351 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.994270 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21ba7795-411f-46a3-93c5-fedef51a27ea-cert\") pod \"ingress-canary-k2t2r\" (UID: \"21ba7795-411f-46a3-93c5-fedef51a27ea\") " pod="openshift-ingress-canary/ingress-canary-k2t2r" Apr 24 14:24:58.994351 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.994310 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b9edb70-aaea-4a5d-bd77-289fb7865065-metrics-tls\") pod \"dns-default-752tj\" (UID: \"5b9edb70-aaea-4a5d-bd77-289fb7865065\") " pod="openshift-dns/dns-default-752tj" Apr 24 14:24:58.994557 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:58.994375 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-registry-tls\") pod \"image-registry-55fdbcc56d-lrqtj\" (UID: \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " 
pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" Apr 24 14:24:58.994557 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:58.994419 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:24:58.994557 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:58.994465 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 14:24:58.994557 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:58.994477 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55fdbcc56d-lrqtj: secret "image-registry-tls" not found Apr 24 14:24:58.994557 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:58.994493 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21ba7795-411f-46a3-93c5-fedef51a27ea-cert podName:21ba7795-411f-46a3-93c5-fedef51a27ea nodeName:}" failed. No retries permitted until 2026-04-24 14:24:59.994472769 +0000 UTC m=+36.124450314 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/21ba7795-411f-46a3-93c5-fedef51a27ea-cert") pod "ingress-canary-k2t2r" (UID: "21ba7795-411f-46a3-93c5-fedef51a27ea") : secret "canary-serving-cert" not found Apr 24 14:24:58.994557 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:58.994490 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:24:58.994557 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:58.994536 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b9edb70-aaea-4a5d-bd77-289fb7865065-metrics-tls podName:5b9edb70-aaea-4a5d-bd77-289fb7865065 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:59.994525759 +0000 UTC m=+36.124503297 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5b9edb70-aaea-4a5d-bd77-289fb7865065-metrics-tls") pod "dns-default-752tj" (UID: "5b9edb70-aaea-4a5d-bd77-289fb7865065") : secret "dns-default-metrics-tls" not found Apr 24 14:24:58.994557 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:24:58.994553 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-registry-tls podName:0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:59.99454749 +0000 UTC m=+36.124525028 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-registry-tls") pod "image-registry-55fdbcc56d-lrqtj" (UID: "0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00") : secret "image-registry-tls" not found Apr 24 14:24:59.456716 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:59.456683 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-trzpj" Apr 24 14:24:59.456848 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:59.456732 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dgzq2" Apr 24 14:24:59.456848 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:59.456788 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fsnj5" Apr 24 14:24:59.462143 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:59.462121 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 14:24:59.462250 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:59.462145 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 14:24:59.463257 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:59.462907 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 14:24:59.463257 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:59.462929 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 14:24:59.463257 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:59.463102 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-qh9cm\"" Apr 24 14:24:59.463257 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:59.463198 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-crqvq\"" Apr 24 14:24:59.616253 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:59.616222 2571 generic.go:358] "Generic (PLEG): container finished" podID="6c91c545-ee17-43ad-8a08-42be9b2cda48" containerID="fecf6c0fcfd621b3887961256ddcaba8b884d61968cd1aa31f8d499927aa0ad8" exitCode=0 Apr 24 14:24:59.616253 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:24:59.616257 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-42l4z" event={"ID":"6c91c545-ee17-43ad-8a08-42be9b2cda48","Type":"ContainerDied","Data":"fecf6c0fcfd621b3887961256ddcaba8b884d61968cd1aa31f8d499927aa0ad8"} Apr 24 
14:25:00.003276 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:00.003192 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21ba7795-411f-46a3-93c5-fedef51a27ea-cert\") pod \"ingress-canary-k2t2r\" (UID: \"21ba7795-411f-46a3-93c5-fedef51a27ea\") " pod="openshift-ingress-canary/ingress-canary-k2t2r" Apr 24 14:25:00.003276 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:00.003238 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b9edb70-aaea-4a5d-bd77-289fb7865065-metrics-tls\") pod \"dns-default-752tj\" (UID: \"5b9edb70-aaea-4a5d-bd77-289fb7865065\") " pod="openshift-dns/dns-default-752tj" Apr 24 14:25:00.003472 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:25:00.003338 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:25:00.003472 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:00.003339 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-registry-tls\") pod \"image-registry-55fdbcc56d-lrqtj\" (UID: \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" Apr 24 14:25:00.003472 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:25:00.003386 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b9edb70-aaea-4a5d-bd77-289fb7865065-metrics-tls podName:5b9edb70-aaea-4a5d-bd77-289fb7865065 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:02.003372426 +0000 UTC m=+38.133349963 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5b9edb70-aaea-4a5d-bd77-289fb7865065-metrics-tls") pod "dns-default-752tj" (UID: "5b9edb70-aaea-4a5d-bd77-289fb7865065") : secret "dns-default-metrics-tls" not found Apr 24 14:25:00.003472 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:25:00.003429 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 14:25:00.003472 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:25:00.003443 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55fdbcc56d-lrqtj: secret "image-registry-tls" not found Apr 24 14:25:00.003472 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:25:00.003454 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:25:00.003742 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:25:00.003487 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-registry-tls podName:0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:02.00347583 +0000 UTC m=+38.133453368 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-registry-tls") pod "image-registry-55fdbcc56d-lrqtj" (UID: "0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00") : secret "image-registry-tls" not found Apr 24 14:25:00.003742 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:25:00.003498 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21ba7795-411f-46a3-93c5-fedef51a27ea-cert podName:21ba7795-411f-46a3-93c5-fedef51a27ea nodeName:}" failed. No retries permitted until 2026-04-24 14:25:02.00349272 +0000 UTC m=+38.133470258 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/21ba7795-411f-46a3-93c5-fedef51a27ea-cert") pod "ingress-canary-k2t2r" (UID: "21ba7795-411f-46a3-93c5-fedef51a27ea") : secret "canary-serving-cert" not found Apr 24 14:25:00.622704 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:00.622669 2571 generic.go:358] "Generic (PLEG): container finished" podID="6c91c545-ee17-43ad-8a08-42be9b2cda48" containerID="197d7f541e483d0bf1595fe66c11704a6c9c19cf997b4a7eb0d86ae7f23bd957" exitCode=0 Apr 24 14:25:00.623160 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:00.622716 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-42l4z" event={"ID":"6c91c545-ee17-43ad-8a08-42be9b2cda48","Type":"ContainerDied","Data":"197d7f541e483d0bf1595fe66c11704a6c9c19cf997b4a7eb0d86ae7f23bd957"} Apr 24 14:25:01.627663 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:01.627632 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-42l4z" event={"ID":"6c91c545-ee17-43ad-8a08-42be9b2cda48","Type":"ContainerStarted","Data":"f81e31b4faafddaf8fc3ba1b68147892a8929e5e41433a519fa31232b7dde459"} Apr 24 14:25:01.649449 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:01.649393 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-42l4z" podStartSLOduration=6.311867932 podStartE2EDuration="37.649379092s" podCreationTimestamp="2026-04-24 14:24:24 +0000 UTC" firstStartedPulling="2026-04-24 14:24:27.085356378 +0000 UTC m=+3.215333923" lastFinishedPulling="2026-04-24 14:24:58.422867542 +0000 UTC m=+34.552845083" observedRunningTime="2026-04-24 14:25:01.648311038 +0000 UTC m=+37.778288598" watchObservedRunningTime="2026-04-24 14:25:01.649379092 +0000 UTC m=+37.779356651" Apr 24 14:25:02.018505 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:02.018411 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21ba7795-411f-46a3-93c5-fedef51a27ea-cert\") pod \"ingress-canary-k2t2r\" (UID: \"21ba7795-411f-46a3-93c5-fedef51a27ea\") " pod="openshift-ingress-canary/ingress-canary-k2t2r" Apr 24 14:25:02.018505 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:02.018456 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b9edb70-aaea-4a5d-bd77-289fb7865065-metrics-tls\") pod \"dns-default-752tj\" (UID: \"5b9edb70-aaea-4a5d-bd77-289fb7865065\") " pod="openshift-dns/dns-default-752tj" Apr 24 14:25:02.018713 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:25:02.018554 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:25:02.018713 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:25:02.018558 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:25:02.018713 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:25:02.018601 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b9edb70-aaea-4a5d-bd77-289fb7865065-metrics-tls podName:5b9edb70-aaea-4a5d-bd77-289fb7865065 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:06.018587399 +0000 UTC m=+42.148564937 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5b9edb70-aaea-4a5d-bd77-289fb7865065-metrics-tls") pod "dns-default-752tj" (UID: "5b9edb70-aaea-4a5d-bd77-289fb7865065") : secret "dns-default-metrics-tls" not found Apr 24 14:25:02.018713 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:25:02.018615 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21ba7795-411f-46a3-93c5-fedef51a27ea-cert podName:21ba7795-411f-46a3-93c5-fedef51a27ea nodeName:}" failed. 
No retries permitted until 2026-04-24 14:25:06.018609023 +0000 UTC m=+42.148586561 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/21ba7795-411f-46a3-93c5-fedef51a27ea-cert") pod "ingress-canary-k2t2r" (UID: "21ba7795-411f-46a3-93c5-fedef51a27ea") : secret "canary-serving-cert" not found Apr 24 14:25:02.018713 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:02.018626 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-registry-tls\") pod \"image-registry-55fdbcc56d-lrqtj\" (UID: \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" Apr 24 14:25:02.018713 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:25:02.018703 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 14:25:02.018713 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:25:02.018711 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55fdbcc56d-lrqtj: secret "image-registry-tls" not found Apr 24 14:25:02.019053 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:25:02.018730 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-registry-tls podName:0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:06.018724478 +0000 UTC m=+42.148702015 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-registry-tls") pod "image-registry-55fdbcc56d-lrqtj" (UID: "0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00") : secret "image-registry-tls" not found Apr 24 14:25:05.141164 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:05.141129 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a06dcc9a-a5d4-44ad-8f76-6943b3f58258-original-pull-secret\") pod \"global-pull-secret-syncer-trzpj\" (UID: \"a06dcc9a-a5d4-44ad-8f76-6943b3f58258\") " pod="kube-system/global-pull-secret-syncer-trzpj" Apr 24 14:25:05.144198 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:05.144177 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a06dcc9a-a5d4-44ad-8f76-6943b3f58258-original-pull-secret\") pod \"global-pull-secret-syncer-trzpj\" (UID: \"a06dcc9a-a5d4-44ad-8f76-6943b3f58258\") " pod="kube-system/global-pull-secret-syncer-trzpj" Apr 24 14:25:05.166206 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:05.166187 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-trzpj" Apr 24 14:25:05.285638 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:05.285607 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-trzpj"] Apr 24 14:25:05.289117 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:25:05.289091 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda06dcc9a_a5d4_44ad_8f76_6943b3f58258.slice/crio-554c636e80fec876e44a26d07e588c72f87bb393a0442c1ba7cce6804b906a82 WatchSource:0}: Error finding container 554c636e80fec876e44a26d07e588c72f87bb393a0442c1ba7cce6804b906a82: Status 404 returned error can't find the container with id 554c636e80fec876e44a26d07e588c72f87bb393a0442c1ba7cce6804b906a82 Apr 24 14:25:05.636631 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:05.636596 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-trzpj" event={"ID":"a06dcc9a-a5d4-44ad-8f76-6943b3f58258","Type":"ContainerStarted","Data":"554c636e80fec876e44a26d07e588c72f87bb393a0442c1ba7cce6804b906a82"} Apr 24 14:25:06.047978 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:06.047893 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-registry-tls\") pod \"image-registry-55fdbcc56d-lrqtj\" (UID: \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" Apr 24 14:25:06.048156 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:06.047998 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21ba7795-411f-46a3-93c5-fedef51a27ea-cert\") pod \"ingress-canary-k2t2r\" (UID: \"21ba7795-411f-46a3-93c5-fedef51a27ea\") " pod="openshift-ingress-canary/ingress-canary-k2t2r" Apr 24 14:25:06.048156 
ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:06.048035 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b9edb70-aaea-4a5d-bd77-289fb7865065-metrics-tls\") pod \"dns-default-752tj\" (UID: \"5b9edb70-aaea-4a5d-bd77-289fb7865065\") " pod="openshift-dns/dns-default-752tj" Apr 24 14:25:06.048156 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:25:06.048068 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 14:25:06.048156 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:25:06.048084 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55fdbcc56d-lrqtj: secret "image-registry-tls" not found Apr 24 14:25:06.048156 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:25:06.048132 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:25:06.048156 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:25:06.048158 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-registry-tls podName:0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:14.048140039 +0000 UTC m=+50.178117580 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-registry-tls") pod "image-registry-55fdbcc56d-lrqtj" (UID: "0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00") : secret "image-registry-tls" not found Apr 24 14:25:06.048446 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:25:06.048160 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:25:06.048446 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:25:06.048199 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b9edb70-aaea-4a5d-bd77-289fb7865065-metrics-tls podName:5b9edb70-aaea-4a5d-bd77-289fb7865065 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:14.048168076 +0000 UTC m=+50.178145613 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5b9edb70-aaea-4a5d-bd77-289fb7865065-metrics-tls") pod "dns-default-752tj" (UID: "5b9edb70-aaea-4a5d-bd77-289fb7865065") : secret "dns-default-metrics-tls" not found Apr 24 14:25:06.048446 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:25:06.048225 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21ba7795-411f-46a3-93c5-fedef51a27ea-cert podName:21ba7795-411f-46a3-93c5-fedef51a27ea nodeName:}" failed. No retries permitted until 2026-04-24 14:25:14.04820553 +0000 UTC m=+50.178183094 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/21ba7795-411f-46a3-93c5-fedef51a27ea-cert") pod "ingress-canary-k2t2r" (UID: "21ba7795-411f-46a3-93c5-fedef51a27ea") : secret "canary-serving-cert" not found Apr 24 14:25:09.645477 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:09.645446 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-trzpj" event={"ID":"a06dcc9a-a5d4-44ad-8f76-6943b3f58258","Type":"ContainerStarted","Data":"a181a40da807000aea6a60852b7740181843802efb0535424cf166dd80b65eb8"} Apr 24 14:25:09.666485 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:09.666440 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-trzpj" podStartSLOduration=32.810029454 podStartE2EDuration="36.666426373s" podCreationTimestamp="2026-04-24 14:24:33 +0000 UTC" firstStartedPulling="2026-04-24 14:25:05.290642266 +0000 UTC m=+41.420619805" lastFinishedPulling="2026-04-24 14:25:09.147039185 +0000 UTC m=+45.277016724" observedRunningTime="2026-04-24 14:25:09.665695846 +0000 UTC m=+45.795673398" watchObservedRunningTime="2026-04-24 14:25:09.666426373 +0000 UTC m=+45.796403927" Apr 24 14:25:14.108610 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:14.108575 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-registry-tls\") pod \"image-registry-55fdbcc56d-lrqtj\" (UID: \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" Apr 24 14:25:14.109064 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:14.108639 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21ba7795-411f-46a3-93c5-fedef51a27ea-cert\") pod \"ingress-canary-k2t2r\" (UID: \"21ba7795-411f-46a3-93c5-fedef51a27ea\") " 
pod="openshift-ingress-canary/ingress-canary-k2t2r" Apr 24 14:25:14.109064 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:14.108663 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b9edb70-aaea-4a5d-bd77-289fb7865065-metrics-tls\") pod \"dns-default-752tj\" (UID: \"5b9edb70-aaea-4a5d-bd77-289fb7865065\") " pod="openshift-dns/dns-default-752tj" Apr 24 14:25:14.109064 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:25:14.108720 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 14:25:14.109064 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:25:14.108741 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55fdbcc56d-lrqtj: secret "image-registry-tls" not found Apr 24 14:25:14.109064 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:25:14.108749 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:25:14.109064 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:25:14.108769 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:25:14.109064 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:25:14.108803 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-registry-tls podName:0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:30.108783876 +0000 UTC m=+66.238761434 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-registry-tls") pod "image-registry-55fdbcc56d-lrqtj" (UID: "0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00") : secret "image-registry-tls" not found Apr 24 14:25:14.109064 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:25:14.108822 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21ba7795-411f-46a3-93c5-fedef51a27ea-cert podName:21ba7795-411f-46a3-93c5-fedef51a27ea nodeName:}" failed. No retries permitted until 2026-04-24 14:25:30.1088129 +0000 UTC m=+66.238790443 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/21ba7795-411f-46a3-93c5-fedef51a27ea-cert") pod "ingress-canary-k2t2r" (UID: "21ba7795-411f-46a3-93c5-fedef51a27ea") : secret "canary-serving-cert" not found Apr 24 14:25:14.109064 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:25:14.108835 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b9edb70-aaea-4a5d-bd77-289fb7865065-metrics-tls podName:5b9edb70-aaea-4a5d-bd77-289fb7865065 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:30.108827531 +0000 UTC m=+66.238805072 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5b9edb70-aaea-4a5d-bd77-289fb7865065-metrics-tls") pod "dns-default-752tj" (UID: "5b9edb70-aaea-4a5d-bd77-289fb7865065") : secret "dns-default-metrics-tls" not found Apr 24 14:25:23.614932 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:23.614905 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7m9zt" Apr 24 14:25:30.121438 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:30.121398 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b9edb70-aaea-4a5d-bd77-289fb7865065-metrics-tls\") pod \"dns-default-752tj\" (UID: \"5b9edb70-aaea-4a5d-bd77-289fb7865065\") " pod="openshift-dns/dns-default-752tj" Apr 24 14:25:30.121438 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:30.121444 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a022a0ca-5e80-43a6-8ee0-69dcf197d1a8-metrics-certs\") pod \"network-metrics-daemon-fsnj5\" (UID: \"a022a0ca-5e80-43a6-8ee0-69dcf197d1a8\") " pod="openshift-multus/network-metrics-daemon-fsnj5" Apr 24 14:25:30.121964 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:30.121481 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-registry-tls\") pod \"image-registry-55fdbcc56d-lrqtj\" (UID: \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" Apr 24 14:25:30.121964 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:30.121524 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21ba7795-411f-46a3-93c5-fedef51a27ea-cert\") pod \"ingress-canary-k2t2r\" (UID: 
\"21ba7795-411f-46a3-93c5-fedef51a27ea\") " pod="openshift-ingress-canary/ingress-canary-k2t2r" Apr 24 14:25:30.121964 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:25:30.121540 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:25:30.121964 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:25:30.121603 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b9edb70-aaea-4a5d-bd77-289fb7865065-metrics-tls podName:5b9edb70-aaea-4a5d-bd77-289fb7865065 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:02.121587167 +0000 UTC m=+98.251564704 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5b9edb70-aaea-4a5d-bd77-289fb7865065-metrics-tls") pod "dns-default-752tj" (UID: "5b9edb70-aaea-4a5d-bd77-289fb7865065") : secret "dns-default-metrics-tls" not found Apr 24 14:25:30.121964 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:25:30.121619 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:25:30.121964 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:25:30.121671 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 14:25:30.121964 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:25:30.121700 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55fdbcc56d-lrqtj: secret "image-registry-tls" not found Apr 24 14:25:30.121964 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:25:30.121678 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21ba7795-411f-46a3-93c5-fedef51a27ea-cert podName:21ba7795-411f-46a3-93c5-fedef51a27ea nodeName:}" failed. 
No retries permitted until 2026-04-24 14:26:02.121664121 +0000 UTC m=+98.251641673 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/21ba7795-411f-46a3-93c5-fedef51a27ea-cert") pod "ingress-canary-k2t2r" (UID: "21ba7795-411f-46a3-93c5-fedef51a27ea") : secret "canary-serving-cert" not found Apr 24 14:25:30.121964 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:25:30.121778 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-registry-tls podName:0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:02.121760911 +0000 UTC m=+98.251738452 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-registry-tls") pod "image-registry-55fdbcc56d-lrqtj" (UID: "0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00") : secret "image-registry-tls" not found Apr 24 14:25:30.123760 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:30.123742 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 14:25:30.132007 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:25:30.131972 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 14:25:30.132098 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:25:30.132058 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a022a0ca-5e80-43a6-8ee0-69dcf197d1a8-metrics-certs podName:a022a0ca-5e80-43a6-8ee0-69dcf197d1a8 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:34.13203448 +0000 UTC m=+130.262012018 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a022a0ca-5e80-43a6-8ee0-69dcf197d1a8-metrics-certs") pod "network-metrics-daemon-fsnj5" (UID: "a022a0ca-5e80-43a6-8ee0-69dcf197d1a8") : secret "metrics-daemon-secret" not found Apr 24 14:25:30.222145 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:30.222106 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mf78d\" (UniqueName: \"kubernetes.io/projected/4ebd1686-821a-4fb0-b091-8a636b80f78e-kube-api-access-mf78d\") pod \"network-check-target-dgzq2\" (UID: \"4ebd1686-821a-4fb0-b091-8a636b80f78e\") " pod="openshift-network-diagnostics/network-check-target-dgzq2" Apr 24 14:25:30.224719 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:30.224695 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 14:25:30.235096 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:30.235077 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 14:25:30.245905 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:30.245882 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf78d\" (UniqueName: \"kubernetes.io/projected/4ebd1686-821a-4fb0-b091-8a636b80f78e-kube-api-access-mf78d\") pod \"network-check-target-dgzq2\" (UID: \"4ebd1686-821a-4fb0-b091-8a636b80f78e\") " pod="openshift-network-diagnostics/network-check-target-dgzq2" Apr 24 14:25:30.378186 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:30.378103 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-qh9cm\"" Apr 24 14:25:30.386658 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:30.386640 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dgzq2" Apr 24 14:25:30.500215 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:30.500185 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-dgzq2"] Apr 24 14:25:30.503581 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:25:30.503552 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ebd1686_821a_4fb0_b091_8a636b80f78e.slice/crio-54bd801e85fa274862d5a2cdcec8e19f316a39b96e057d7437a8f0faa56e0a0e WatchSource:0}: Error finding container 54bd801e85fa274862d5a2cdcec8e19f316a39b96e057d7437a8f0faa56e0a0e: Status 404 returned error can't find the container with id 54bd801e85fa274862d5a2cdcec8e19f316a39b96e057d7437a8f0faa56e0a0e Apr 24 14:25:30.684696 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:30.684614 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-dgzq2" event={"ID":"4ebd1686-821a-4fb0-b091-8a636b80f78e","Type":"ContainerStarted","Data":"54bd801e85fa274862d5a2cdcec8e19f316a39b96e057d7437a8f0faa56e0a0e"} Apr 24 14:25:33.691466 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:33.691430 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-dgzq2" event={"ID":"4ebd1686-821a-4fb0-b091-8a636b80f78e","Type":"ContainerStarted","Data":"fb123ce13ae20d77ac593d0c98573da91fd27edacf0568ab07a30ab62fb58fb2"} Apr 24 14:25:33.691855 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:33.691567 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-dgzq2" Apr 24 14:25:33.707186 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:25:33.707144 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-dgzq2" 
podStartSLOduration=67.165849 podStartE2EDuration="1m9.707130251s" podCreationTimestamp="2026-04-24 14:24:24 +0000 UTC" firstStartedPulling="2026-04-24 14:25:30.505908802 +0000 UTC m=+66.635886340" lastFinishedPulling="2026-04-24 14:25:33.047190043 +0000 UTC m=+69.177167591" observedRunningTime="2026-04-24 14:25:33.70652674 +0000 UTC m=+69.836504302" watchObservedRunningTime="2026-04-24 14:25:33.707130251 +0000 UTC m=+69.837107807" Apr 24 14:26:02.139759 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:02.139700 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21ba7795-411f-46a3-93c5-fedef51a27ea-cert\") pod \"ingress-canary-k2t2r\" (UID: \"21ba7795-411f-46a3-93c5-fedef51a27ea\") " pod="openshift-ingress-canary/ingress-canary-k2t2r" Apr 24 14:26:02.139759 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:02.139759 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b9edb70-aaea-4a5d-bd77-289fb7865065-metrics-tls\") pod \"dns-default-752tj\" (UID: \"5b9edb70-aaea-4a5d-bd77-289fb7865065\") " pod="openshift-dns/dns-default-752tj" Apr 24 14:26:02.140263 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:02.139791 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-registry-tls\") pod \"image-registry-55fdbcc56d-lrqtj\" (UID: \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" Apr 24 14:26:02.140263 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:02.139856 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:26:02.140263 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:02.139886 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: 
secret "image-registry-tls" not found Apr 24 14:26:02.140263 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:02.139893 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:26:02.140263 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:02.139923 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21ba7795-411f-46a3-93c5-fedef51a27ea-cert podName:21ba7795-411f-46a3-93c5-fedef51a27ea nodeName:}" failed. No retries permitted until 2026-04-24 14:27:06.139908331 +0000 UTC m=+162.269885869 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/21ba7795-411f-46a3-93c5-fedef51a27ea-cert") pod "ingress-canary-k2t2r" (UID: "21ba7795-411f-46a3-93c5-fedef51a27ea") : secret "canary-serving-cert" not found Apr 24 14:26:02.140263 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:02.139948 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b9edb70-aaea-4a5d-bd77-289fb7865065-metrics-tls podName:5b9edb70-aaea-4a5d-bd77-289fb7865065 nodeName:}" failed. No retries permitted until 2026-04-24 14:27:06.139935146 +0000 UTC m=+162.269912689 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5b9edb70-aaea-4a5d-bd77-289fb7865065-metrics-tls") pod "dns-default-752tj" (UID: "5b9edb70-aaea-4a5d-bd77-289fb7865065") : secret "dns-default-metrics-tls" not found Apr 24 14:26:02.140263 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:02.139897 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55fdbcc56d-lrqtj: secret "image-registry-tls" not found Apr 24 14:26:02.140263 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:02.139977 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-registry-tls podName:0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00 nodeName:}" failed. No retries permitted until 2026-04-24 14:27:06.139969945 +0000 UTC m=+162.269947483 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-registry-tls") pod "image-registry-55fdbcc56d-lrqtj" (UID: "0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00") : secret "image-registry-tls" not found Apr 24 14:26:04.696665 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:04.696628 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-dgzq2" Apr 24 14:26:34.153914 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:34.153871 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a022a0ca-5e80-43a6-8ee0-69dcf197d1a8-metrics-certs\") pod \"network-metrics-daemon-fsnj5\" (UID: \"a022a0ca-5e80-43a6-8ee0-69dcf197d1a8\") " pod="openshift-multus/network-metrics-daemon-fsnj5" Apr 24 14:26:34.154400 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:34.154033 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret 
"metrics-daemon-secret" not found Apr 24 14:26:34.154400 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:34.154109 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a022a0ca-5e80-43a6-8ee0-69dcf197d1a8-metrics-certs podName:a022a0ca-5e80-43a6-8ee0-69dcf197d1a8 nodeName:}" failed. No retries permitted until 2026-04-24 14:28:36.154090926 +0000 UTC m=+252.284068465 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a022a0ca-5e80-43a6-8ee0-69dcf197d1a8-metrics-certs") pod "network-metrics-daemon-fsnj5" (UID: "a022a0ca-5e80-43a6-8ee0-69dcf197d1a8") : secret "metrics-daemon-secret" not found Apr 24 14:26:35.248604 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.248568 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g7vcx"] Apr 24 14:26:35.250414 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.250388 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g7vcx" Apr 24 14:26:35.251510 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.251485 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-jdhx4"] Apr 24 14:26:35.253832 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.253815 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-jdhx4" Apr 24 14:26:35.255682 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.255659 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 24 14:26:35.256274 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.256257 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-rhrxf\"" Apr 24 14:26:35.258379 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.258363 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 24 14:26:35.258617 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.258599 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 24 14:26:35.258671 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.258616 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-xbnr9\"" Apr 24 14:26:35.258671 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.258623 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 24 14:26:35.262164 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.262143 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g7vcx"] Apr 24 14:26:35.272946 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.272927 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-jdhx4"] Apr 24 14:26:35.358577 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.358549 2571 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-7dqjb"] Apr 24 14:26:35.363409 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.363054 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-8jnpj"] Apr 24 14:26:35.363544 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.363451 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7dqjb" Apr 24 14:26:35.363633 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.363600 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjdjg\" (UniqueName: \"kubernetes.io/projected/ead538d9-0357-4366-978f-3383d34778d6-kube-api-access-jjdjg\") pod \"volume-data-source-validator-7c6cbb6c87-g7vcx\" (UID: \"ead538d9-0357-4366-978f-3383d34778d6\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g7vcx" Apr 24 14:26:35.363704 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.363678 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/03107b23-78b0-454f-9952-be259de46a01-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jdhx4\" (UID: \"03107b23-78b0-454f-9952-be259de46a01\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jdhx4" Apr 24 14:26:35.363807 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.363789 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/03107b23-78b0-454f-9952-be259de46a01-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-jdhx4\" (UID: \"03107b23-78b0-454f-9952-be259de46a01\") " 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-jdhx4" Apr 24 14:26:35.365307 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.365288 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-rkjfd"] Apr 24 14:26:35.365433 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.365417 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-mwxxl\"" Apr 24 14:26:35.365480 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.365435 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jnpj" Apr 24 14:26:35.366938 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.366919 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-mk446"] Apr 24 14:26:35.367040 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.367021 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-rkjfd" Apr 24 14:26:35.367374 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.367352 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 24 14:26:35.367776 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.367762 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 14:26:35.368571 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.368554 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-mk446" Apr 24 14:26:35.368975 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.368957 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-fkzbj\"" Apr 24 14:26:35.369189 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.369170 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 14:26:35.370272 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.370151 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 24 14:26:35.371453 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.371436 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-x4mcr\"" Apr 24 14:26:35.371551 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.371488 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 14:26:35.371612 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.371598 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 24 14:26:35.371697 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.371679 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-hflbt\"" Apr 24 14:26:35.371955 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.371936 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 24 14:26:35.372095 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.372028 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 14:26:35.372216 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.372126 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 24 14:26:35.372345 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.372328 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 24 14:26:35.372412 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.372359 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 24 14:26:35.372559 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.372543 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 24 14:26:35.372667 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.372653 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-7dqjb"] Apr 24 14:26:35.373757 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.373737 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-8jnpj"] Apr 24 14:26:35.375479 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.375352 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-rkjfd"] Apr 24 14:26:35.377460 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.377444 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-mk446"] Apr 24 14:26:35.377898 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.377879 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 24 14:26:35.378529 ip-10-0-137-95 
kubenswrapper[2571]: I0424 14:26:35.378510 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 24 14:26:35.459471 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.459441 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-dc7c85968-65n67"] Apr 24 14:26:35.461517 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.461498 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-dc7c85968-65n67" Apr 24 14:26:35.464188 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.464166 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 24 14:26:35.464302 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.464170 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 24 14:26:35.464302 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.464208 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 24 14:26:35.464302 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.464225 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwgwz\" (UniqueName: \"kubernetes.io/projected/6f64bec9-dd77-4222-8388-3b584743cfa7-kube-api-access-nwgwz\") pod \"insights-operator-585dfdc468-rkjfd\" (UID: \"6f64bec9-dd77-4222-8388-3b584743cfa7\") " pod="openshift-insights/insights-operator-585dfdc468-rkjfd" Apr 24 14:26:35.464302 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.464260 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f64bec9-dd77-4222-8388-3b584743cfa7-service-ca-bundle\") pod 
\"insights-operator-585dfdc468-rkjfd\" (UID: \"6f64bec9-dd77-4222-8388-3b584743cfa7\") " pod="openshift-insights/insights-operator-585dfdc468-rkjfd" Apr 24 14:26:35.464576 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.464347 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 24 14:26:35.464576 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.464346 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/83b693cc-250b-45e7-b205-baf7f0feff6b-trusted-ca\") pod \"console-operator-9d4b6777b-mk446\" (UID: \"83b693cc-250b-45e7-b205-baf7f0feff6b\") " pod="openshift-console-operator/console-operator-9d4b6777b-mk446" Apr 24 14:26:35.464576 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.464446 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltg7g\" (UniqueName: \"kubernetes.io/projected/83b693cc-250b-45e7-b205-baf7f0feff6b-kube-api-access-ltg7g\") pod \"console-operator-9d4b6777b-mk446\" (UID: \"83b693cc-250b-45e7-b205-baf7f0feff6b\") " pod="openshift-console-operator/console-operator-9d4b6777b-mk446" Apr 24 14:26:35.464576 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.464473 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f64bec9-dd77-4222-8388-3b584743cfa7-serving-cert\") pod \"insights-operator-585dfdc468-rkjfd\" (UID: \"6f64bec9-dd77-4222-8388-3b584743cfa7\") " pod="openshift-insights/insights-operator-585dfdc468-rkjfd" Apr 24 14:26:35.464576 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.464503 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/6f64bec9-dd77-4222-8388-3b584743cfa7-snapshots\") pod 
\"insights-operator-585dfdc468-rkjfd\" (UID: \"6f64bec9-dd77-4222-8388-3b584743cfa7\") " pod="openshift-insights/insights-operator-585dfdc468-rkjfd" Apr 24 14:26:35.464576 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.464531 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jjdjg\" (UniqueName: \"kubernetes.io/projected/ead538d9-0357-4366-978f-3383d34778d6-kube-api-access-jjdjg\") pod \"volume-data-source-validator-7c6cbb6c87-g7vcx\" (UID: \"ead538d9-0357-4366-978f-3383d34778d6\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g7vcx" Apr 24 14:26:35.464576 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.464566 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83b693cc-250b-45e7-b205-baf7f0feff6b-serving-cert\") pod \"console-operator-9d4b6777b-mk446\" (UID: \"83b693cc-250b-45e7-b205-baf7f0feff6b\") " pod="openshift-console-operator/console-operator-9d4b6777b-mk446" Apr 24 14:26:35.464576 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.464576 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 24 14:26:35.465039 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.464605 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwx48\" (UniqueName: \"kubernetes.io/projected/ef39bb3c-1f6c-4174-b851-2464c20d74cf-kube-api-access-vwx48\") pod \"network-check-source-8894fc9bd-7dqjb\" (UID: \"ef39bb3c-1f6c-4174-b851-2464c20d74cf\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7dqjb" Apr 24 14:26:35.465039 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.464640 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/03107b23-78b0-454f-9952-be259de46a01-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jdhx4\" (UID: \"03107b23-78b0-454f-9952-be259de46a01\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jdhx4" Apr 24 14:26:35.465039 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.464670 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83b693cc-250b-45e7-b205-baf7f0feff6b-config\") pod \"console-operator-9d4b6777b-mk446\" (UID: \"83b693cc-250b-45e7-b205-baf7f0feff6b\") " pod="openshift-console-operator/console-operator-9d4b6777b-mk446" Apr 24 14:26:35.465039 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.464694 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fjr9\" (UniqueName: \"kubernetes.io/projected/75b83f1b-1fca-4a30-867f-a76c5b6bfe4f-kube-api-access-2fjr9\") pod \"cluster-monitoring-operator-75587bd455-8jnpj\" (UID: \"75b83f1b-1fca-4a30-867f-a76c5b6bfe4f\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jnpj" Apr 24 14:26:35.465039 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.464726 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f64bec9-dd77-4222-8388-3b584743cfa7-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-rkjfd\" (UID: \"6f64bec9-dd77-4222-8388-3b584743cfa7\") " pod="openshift-insights/insights-operator-585dfdc468-rkjfd" Apr 24 14:26:35.465039 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:35.464744 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 14:26:35.465039 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.464753 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/75b83f1b-1fca-4a30-867f-a76c5b6bfe4f-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-8jnpj\" (UID: \"75b83f1b-1fca-4a30-867f-a76c5b6bfe4f\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jnpj" Apr 24 14:26:35.465039 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.464803 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/75b83f1b-1fca-4a30-867f-a76c5b6bfe4f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8jnpj\" (UID: \"75b83f1b-1fca-4a30-867f-a76c5b6bfe4f\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jnpj" Apr 24 14:26:35.465039 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:35.464818 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03107b23-78b0-454f-9952-be259de46a01-networking-console-plugin-cert podName:03107b23-78b0-454f-9952-be259de46a01 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:35.964800389 +0000 UTC m=+132.094777938 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/03107b23-78b0-454f-9952-be259de46a01-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-jdhx4" (UID: "03107b23-78b0-454f-9952-be259de46a01") : secret "networking-console-plugin-cert" not found Apr 24 14:26:35.465039 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.464862 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 24 14:26:35.465039 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.464879 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6f64bec9-dd77-4222-8388-3b584743cfa7-tmp\") pod \"insights-operator-585dfdc468-rkjfd\" (UID: \"6f64bec9-dd77-4222-8388-3b584743cfa7\") " pod="openshift-insights/insights-operator-585dfdc468-rkjfd" Apr 24 14:26:35.465039 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.464919 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/03107b23-78b0-454f-9952-be259de46a01-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-jdhx4\" (UID: \"03107b23-78b0-454f-9952-be259de46a01\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jdhx4" Apr 24 14:26:35.465598 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.465151 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-s88rz\"" Apr 24 14:26:35.465598 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.465435 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/03107b23-78b0-454f-9952-be259de46a01-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-jdhx4\" (UID: \"03107b23-78b0-454f-9952-be259de46a01\") " 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-jdhx4" Apr 24 14:26:35.476390 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.476372 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-dc7c85968-65n67"] Apr 24 14:26:35.481694 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.481666 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjdjg\" (UniqueName: \"kubernetes.io/projected/ead538d9-0357-4366-978f-3383d34778d6-kube-api-access-jjdjg\") pod \"volume-data-source-validator-7c6cbb6c87-g7vcx\" (UID: \"ead538d9-0357-4366-978f-3383d34778d6\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g7vcx" Apr 24 14:26:35.560486 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.560412 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g7vcx" Apr 24 14:26:35.569842 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.569810 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f64bec9-dd77-4222-8388-3b584743cfa7-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-rkjfd\" (UID: \"6f64bec9-dd77-4222-8388-3b584743cfa7\") " pod="openshift-insights/insights-operator-585dfdc468-rkjfd" Apr 24 14:26:35.569965 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.569861 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-stats-auth\") pod \"router-default-dc7c85968-65n67\" (UID: \"fbe79ce3-ee9a-4d46-a8e8-345e1e315824\") " pod="openshift-ingress/router-default-dc7c85968-65n67" Apr 24 14:26:35.569965 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.569894 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/75b83f1b-1fca-4a30-867f-a76c5b6bfe4f-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-8jnpj\" (UID: \"75b83f1b-1fca-4a30-867f-a76c5b6bfe4f\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jnpj" Apr 24 14:26:35.569965 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.569922 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/75b83f1b-1fca-4a30-867f-a76c5b6bfe4f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8jnpj\" (UID: \"75b83f1b-1fca-4a30-867f-a76c5b6bfe4f\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jnpj" Apr 24 14:26:35.569965 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.569955 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-service-ca-bundle\") pod \"router-default-dc7c85968-65n67\" (UID: \"fbe79ce3-ee9a-4d46-a8e8-345e1e315824\") " pod="openshift-ingress/router-default-dc7c85968-65n67" Apr 24 14:26:35.570277 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.570006 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6f64bec9-dd77-4222-8388-3b584743cfa7-tmp\") pod \"insights-operator-585dfdc468-rkjfd\" (UID: \"6f64bec9-dd77-4222-8388-3b584743cfa7\") " pod="openshift-insights/insights-operator-585dfdc468-rkjfd" Apr 24 14:26:35.570277 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.570045 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czjdh\" (UniqueName: \"kubernetes.io/projected/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-kube-api-access-czjdh\") pod \"router-default-dc7c85968-65n67\" (UID: 
\"fbe79ce3-ee9a-4d46-a8e8-345e1e315824\") " pod="openshift-ingress/router-default-dc7c85968-65n67" Apr 24 14:26:35.570277 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.570196 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nwgwz\" (UniqueName: \"kubernetes.io/projected/6f64bec9-dd77-4222-8388-3b584743cfa7-kube-api-access-nwgwz\") pod \"insights-operator-585dfdc468-rkjfd\" (UID: \"6f64bec9-dd77-4222-8388-3b584743cfa7\") " pod="openshift-insights/insights-operator-585dfdc468-rkjfd" Apr 24 14:26:35.570277 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.570247 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-metrics-certs\") pod \"router-default-dc7c85968-65n67\" (UID: \"fbe79ce3-ee9a-4d46-a8e8-345e1e315824\") " pod="openshift-ingress/router-default-dc7c85968-65n67" Apr 24 14:26:35.570483 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.570282 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f64bec9-dd77-4222-8388-3b584743cfa7-service-ca-bundle\") pod \"insights-operator-585dfdc468-rkjfd\" (UID: \"6f64bec9-dd77-4222-8388-3b584743cfa7\") " pod="openshift-insights/insights-operator-585dfdc468-rkjfd" Apr 24 14:26:35.570483 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.570372 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/83b693cc-250b-45e7-b205-baf7f0feff6b-trusted-ca\") pod \"console-operator-9d4b6777b-mk446\" (UID: \"83b693cc-250b-45e7-b205-baf7f0feff6b\") " pod="openshift-console-operator/console-operator-9d4b6777b-mk446" Apr 24 14:26:35.570483 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.570400 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ltg7g\" (UniqueName: \"kubernetes.io/projected/83b693cc-250b-45e7-b205-baf7f0feff6b-kube-api-access-ltg7g\") pod \"console-operator-9d4b6777b-mk446\" (UID: \"83b693cc-250b-45e7-b205-baf7f0feff6b\") " pod="openshift-console-operator/console-operator-9d4b6777b-mk446" Apr 24 14:26:35.570483 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.570467 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f64bec9-dd77-4222-8388-3b584743cfa7-serving-cert\") pod \"insights-operator-585dfdc468-rkjfd\" (UID: \"6f64bec9-dd77-4222-8388-3b584743cfa7\") " pod="openshift-insights/insights-operator-585dfdc468-rkjfd" Apr 24 14:26:35.570665 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.570504 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/6f64bec9-dd77-4222-8388-3b584743cfa7-snapshots\") pod \"insights-operator-585dfdc468-rkjfd\" (UID: \"6f64bec9-dd77-4222-8388-3b584743cfa7\") " pod="openshift-insights/insights-operator-585dfdc468-rkjfd" Apr 24 14:26:35.570665 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.570560 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-default-certificate\") pod \"router-default-dc7c85968-65n67\" (UID: \"fbe79ce3-ee9a-4d46-a8e8-345e1e315824\") " pod="openshift-ingress/router-default-dc7c85968-65n67" Apr 24 14:26:35.570665 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.570618 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83b693cc-250b-45e7-b205-baf7f0feff6b-serving-cert\") pod \"console-operator-9d4b6777b-mk446\" (UID: \"83b693cc-250b-45e7-b205-baf7f0feff6b\") " pod="openshift-console-operator/console-operator-9d4b6777b-mk446" Apr 24 
14:26:35.570665 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.570636 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/75b83f1b-1fca-4a30-867f-a76c5b6bfe4f-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-8jnpj\" (UID: \"75b83f1b-1fca-4a30-867f-a76c5b6bfe4f\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jnpj" Apr 24 14:26:35.570665 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.570657 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vwx48\" (UniqueName: \"kubernetes.io/projected/ef39bb3c-1f6c-4174-b851-2464c20d74cf-kube-api-access-vwx48\") pod \"network-check-source-8894fc9bd-7dqjb\" (UID: \"ef39bb3c-1f6c-4174-b851-2464c20d74cf\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7dqjb" Apr 24 14:26:35.570878 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.570702 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83b693cc-250b-45e7-b205-baf7f0feff6b-config\") pod \"console-operator-9d4b6777b-mk446\" (UID: \"83b693cc-250b-45e7-b205-baf7f0feff6b\") " pod="openshift-console-operator/console-operator-9d4b6777b-mk446" Apr 24 14:26:35.570878 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:35.570707 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 14:26:35.570878 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.570729 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fjr9\" (UniqueName: \"kubernetes.io/projected/75b83f1b-1fca-4a30-867f-a76c5b6bfe4f-kube-api-access-2fjr9\") pod \"cluster-monitoring-operator-75587bd455-8jnpj\" (UID: \"75b83f1b-1fca-4a30-867f-a76c5b6bfe4f\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jnpj" Apr 24 14:26:35.570878 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:35.570767 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75b83f1b-1fca-4a30-867f-a76c5b6bfe4f-cluster-monitoring-operator-tls podName:75b83f1b-1fca-4a30-867f-a76c5b6bfe4f nodeName:}" failed. No retries permitted until 2026-04-24 14:26:36.070748856 +0000 UTC m=+132.200726404 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/75b83f1b-1fca-4a30-867f-a76c5b6bfe4f-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8jnpj" (UID: "75b83f1b-1fca-4a30-867f-a76c5b6bfe4f") : secret "cluster-monitoring-operator-tls" not found Apr 24 14:26:35.571117 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.570926 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6f64bec9-dd77-4222-8388-3b584743cfa7-tmp\") pod \"insights-operator-585dfdc468-rkjfd\" (UID: \"6f64bec9-dd77-4222-8388-3b584743cfa7\") " pod="openshift-insights/insights-operator-585dfdc468-rkjfd" Apr 24 14:26:35.571865 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.571842 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/83b693cc-250b-45e7-b205-baf7f0feff6b-trusted-ca\") pod \"console-operator-9d4b6777b-mk446\" (UID: \"83b693cc-250b-45e7-b205-baf7f0feff6b\") " pod="openshift-console-operator/console-operator-9d4b6777b-mk446" Apr 24 14:26:35.572027 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.572005 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f64bec9-dd77-4222-8388-3b584743cfa7-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-rkjfd\" (UID: \"6f64bec9-dd77-4222-8388-3b584743cfa7\") " 
pod="openshift-insights/insights-operator-585dfdc468-rkjfd" Apr 24 14:26:35.572555 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.572533 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/6f64bec9-dd77-4222-8388-3b584743cfa7-snapshots\") pod \"insights-operator-585dfdc468-rkjfd\" (UID: \"6f64bec9-dd77-4222-8388-3b584743cfa7\") " pod="openshift-insights/insights-operator-585dfdc468-rkjfd" Apr 24 14:26:35.572659 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.572580 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f64bec9-dd77-4222-8388-3b584743cfa7-service-ca-bundle\") pod \"insights-operator-585dfdc468-rkjfd\" (UID: \"6f64bec9-dd77-4222-8388-3b584743cfa7\") " pod="openshift-insights/insights-operator-585dfdc468-rkjfd" Apr 24 14:26:35.572823 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.572800 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83b693cc-250b-45e7-b205-baf7f0feff6b-config\") pod \"console-operator-9d4b6777b-mk446\" (UID: \"83b693cc-250b-45e7-b205-baf7f0feff6b\") " pod="openshift-console-operator/console-operator-9d4b6777b-mk446" Apr 24 14:26:35.573833 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.573805 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f64bec9-dd77-4222-8388-3b584743cfa7-serving-cert\") pod \"insights-operator-585dfdc468-rkjfd\" (UID: \"6f64bec9-dd77-4222-8388-3b584743cfa7\") " pod="openshift-insights/insights-operator-585dfdc468-rkjfd" Apr 24 14:26:35.574146 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.574126 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83b693cc-250b-45e7-b205-baf7f0feff6b-serving-cert\") pod 
\"console-operator-9d4b6777b-mk446\" (UID: \"83b693cc-250b-45e7-b205-baf7f0feff6b\") " pod="openshift-console-operator/console-operator-9d4b6777b-mk446" Apr 24 14:26:35.583094 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.583064 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwgwz\" (UniqueName: \"kubernetes.io/projected/6f64bec9-dd77-4222-8388-3b584743cfa7-kube-api-access-nwgwz\") pod \"insights-operator-585dfdc468-rkjfd\" (UID: \"6f64bec9-dd77-4222-8388-3b584743cfa7\") " pod="openshift-insights/insights-operator-585dfdc468-rkjfd" Apr 24 14:26:35.583251 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.583232 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltg7g\" (UniqueName: \"kubernetes.io/projected/83b693cc-250b-45e7-b205-baf7f0feff6b-kube-api-access-ltg7g\") pod \"console-operator-9d4b6777b-mk446\" (UID: \"83b693cc-250b-45e7-b205-baf7f0feff6b\") " pod="openshift-console-operator/console-operator-9d4b6777b-mk446" Apr 24 14:26:35.583339 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.583319 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fjr9\" (UniqueName: \"kubernetes.io/projected/75b83f1b-1fca-4a30-867f-a76c5b6bfe4f-kube-api-access-2fjr9\") pod \"cluster-monitoring-operator-75587bd455-8jnpj\" (UID: \"75b83f1b-1fca-4a30-867f-a76c5b6bfe4f\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jnpj" Apr 24 14:26:35.583395 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.583373 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwx48\" (UniqueName: \"kubernetes.io/projected/ef39bb3c-1f6c-4174-b851-2464c20d74cf-kube-api-access-vwx48\") pod \"network-check-source-8894fc9bd-7dqjb\" (UID: \"ef39bb3c-1f6c-4174-b851-2464c20d74cf\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7dqjb" Apr 24 14:26:35.671308 ip-10-0-137-95 kubenswrapper[2571]: I0424 
14:26:35.671275 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-service-ca-bundle\") pod \"router-default-dc7c85968-65n67\" (UID: \"fbe79ce3-ee9a-4d46-a8e8-345e1e315824\") " pod="openshift-ingress/router-default-dc7c85968-65n67" Apr 24 14:26:35.671473 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.671315 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-czjdh\" (UniqueName: \"kubernetes.io/projected/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-kube-api-access-czjdh\") pod \"router-default-dc7c85968-65n67\" (UID: \"fbe79ce3-ee9a-4d46-a8e8-345e1e315824\") " pod="openshift-ingress/router-default-dc7c85968-65n67" Apr 24 14:26:35.671473 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.671357 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-metrics-certs\") pod \"router-default-dc7c85968-65n67\" (UID: \"fbe79ce3-ee9a-4d46-a8e8-345e1e315824\") " pod="openshift-ingress/router-default-dc7c85968-65n67" Apr 24 14:26:35.671473 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.671420 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-default-certificate\") pod \"router-default-dc7c85968-65n67\" (UID: \"fbe79ce3-ee9a-4d46-a8e8-345e1e315824\") " pod="openshift-ingress/router-default-dc7c85968-65n67" Apr 24 14:26:35.671473 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:35.671456 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-service-ca-bundle podName:fbe79ce3-ee9a-4d46-a8e8-345e1e315824 nodeName:}" failed. 
No retries permitted until 2026-04-24 14:26:36.171432246 +0000 UTC m=+132.301409800 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-service-ca-bundle") pod "router-default-dc7c85968-65n67" (UID: "fbe79ce3-ee9a-4d46-a8e8-345e1e315824") : configmap references non-existent config key: service-ca.crt Apr 24 14:26:35.671672 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:35.671509 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 14:26:35.671672 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.671526 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-stats-auth\") pod \"router-default-dc7c85968-65n67\" (UID: \"fbe79ce3-ee9a-4d46-a8e8-345e1e315824\") " pod="openshift-ingress/router-default-dc7c85968-65n67" Apr 24 14:26:35.671672 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:35.671572 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-metrics-certs podName:fbe79ce3-ee9a-4d46-a8e8-345e1e315824 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:36.171559304 +0000 UTC m=+132.301536843 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-metrics-certs") pod "router-default-dc7c85968-65n67" (UID: "fbe79ce3-ee9a-4d46-a8e8-345e1e315824") : secret "router-metrics-certs-default" not found Apr 24 14:26:35.674309 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.674260 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-stats-auth\") pod \"router-default-dc7c85968-65n67\" (UID: \"fbe79ce3-ee9a-4d46-a8e8-345e1e315824\") " pod="openshift-ingress/router-default-dc7c85968-65n67" Apr 24 14:26:35.674401 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.674307 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-default-certificate\") pod \"router-default-dc7c85968-65n67\" (UID: \"fbe79ce3-ee9a-4d46-a8e8-345e1e315824\") " pod="openshift-ingress/router-default-dc7c85968-65n67" Apr 24 14:26:35.674678 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.674663 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g7vcx"] Apr 24 14:26:35.675912 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.675898 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7dqjb" Apr 24 14:26:35.677686 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:26:35.677668 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podead538d9_0357_4366_978f_3383d34778d6.slice/crio-ea5313814945cd9d78fccc06bcbd6cec1edb9767b5af1a82a00ffc6703554ed1 WatchSource:0}: Error finding container ea5313814945cd9d78fccc06bcbd6cec1edb9767b5af1a82a00ffc6703554ed1: Status 404 returned error can't find the container with id ea5313814945cd9d78fccc06bcbd6cec1edb9767b5af1a82a00ffc6703554ed1 Apr 24 14:26:35.682600 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.682579 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-czjdh\" (UniqueName: \"kubernetes.io/projected/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-kube-api-access-czjdh\") pod \"router-default-dc7c85968-65n67\" (UID: \"fbe79ce3-ee9a-4d46-a8e8-345e1e315824\") " pod="openshift-ingress/router-default-dc7c85968-65n67" Apr 24 14:26:35.687782 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.687762 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-rkjfd" Apr 24 14:26:35.693370 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.693346 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-mk446" Apr 24 14:26:35.805891 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.805861 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-7dqjb"] Apr 24 14:26:35.809520 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:26:35.809487 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef39bb3c_1f6c_4174_b851_2464c20d74cf.slice/crio-341f7f0847c748c8580363e497e09c2289e440f0908b3ef5e71b942f3cc2dc6d WatchSource:0}: Error finding container 341f7f0847c748c8580363e497e09c2289e440f0908b3ef5e71b942f3cc2dc6d: Status 404 returned error can't find the container with id 341f7f0847c748c8580363e497e09c2289e440f0908b3ef5e71b942f3cc2dc6d Apr 24 14:26:35.810201 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.810154 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g7vcx" event={"ID":"ead538d9-0357-4366-978f-3383d34778d6","Type":"ContainerStarted","Data":"ea5313814945cd9d78fccc06bcbd6cec1edb9767b5af1a82a00ffc6703554ed1"} Apr 24 14:26:35.823641 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.823615 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-rkjfd"] Apr 24 14:26:35.827191 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:26:35.827165 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f64bec9_dd77_4222_8388_3b584743cfa7.slice/crio-cbdd985f209879fb29cafbd87653f8c62daad7f1fc9da26768eda359dd0b1adf WatchSource:0}: Error finding container cbdd985f209879fb29cafbd87653f8c62daad7f1fc9da26768eda359dd0b1adf: Status 404 returned error can't find the container with id cbdd985f209879fb29cafbd87653f8c62daad7f1fc9da26768eda359dd0b1adf Apr 24 
14:26:35.839341 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.839321 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-mk446"] Apr 24 14:26:35.842678 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:26:35.842652 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83b693cc_250b_45e7_b205_baf7f0feff6b.slice/crio-4cfe537a1792fd8f7ac64e9d048a57a7cf5b719292747e04fa5802dd55b227c4 WatchSource:0}: Error finding container 4cfe537a1792fd8f7ac64e9d048a57a7cf5b719292747e04fa5802dd55b227c4: Status 404 returned error can't find the container with id 4cfe537a1792fd8f7ac64e9d048a57a7cf5b719292747e04fa5802dd55b227c4 Apr 24 14:26:35.974292 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:35.974260 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/03107b23-78b0-454f-9952-be259de46a01-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jdhx4\" (UID: \"03107b23-78b0-454f-9952-be259de46a01\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jdhx4" Apr 24 14:26:35.974477 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:35.974381 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 14:26:35.974477 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:35.974434 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03107b23-78b0-454f-9952-be259de46a01-networking-console-plugin-cert podName:03107b23-78b0-454f-9952-be259de46a01 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:36.974420327 +0000 UTC m=+133.104397864 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/03107b23-78b0-454f-9952-be259de46a01-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-jdhx4" (UID: "03107b23-78b0-454f-9952-be259de46a01") : secret "networking-console-plugin-cert" not found Apr 24 14:26:36.074828 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:36.074747 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/75b83f1b-1fca-4a30-867f-a76c5b6bfe4f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8jnpj\" (UID: \"75b83f1b-1fca-4a30-867f-a76c5b6bfe4f\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jnpj" Apr 24 14:26:36.074960 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:36.074894 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 14:26:36.074960 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:36.074950 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75b83f1b-1fca-4a30-867f-a76c5b6bfe4f-cluster-monitoring-operator-tls podName:75b83f1b-1fca-4a30-867f-a76c5b6bfe4f nodeName:}" failed. No retries permitted until 2026-04-24 14:26:37.074936545 +0000 UTC m=+133.204914083 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/75b83f1b-1fca-4a30-867f-a76c5b6bfe4f-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8jnpj" (UID: "75b83f1b-1fca-4a30-867f-a76c5b6bfe4f") : secret "cluster-monitoring-operator-tls" not found Apr 24 14:26:36.176067 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:36.176038 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-metrics-certs\") pod \"router-default-dc7c85968-65n67\" (UID: \"fbe79ce3-ee9a-4d46-a8e8-345e1e315824\") " pod="openshift-ingress/router-default-dc7c85968-65n67" Apr 24 14:26:36.176188 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:36.176141 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-service-ca-bundle\") pod \"router-default-dc7c85968-65n67\" (UID: \"fbe79ce3-ee9a-4d46-a8e8-345e1e315824\") " pod="openshift-ingress/router-default-dc7c85968-65n67" Apr 24 14:26:36.176188 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:36.176168 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 14:26:36.176271 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:36.176223 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-metrics-certs podName:fbe79ce3-ee9a-4d46-a8e8-345e1e315824 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:37.176208231 +0000 UTC m=+133.306185775 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-metrics-certs") pod "router-default-dc7c85968-65n67" (UID: "fbe79ce3-ee9a-4d46-a8e8-345e1e315824") : secret "router-metrics-certs-default" not found Apr 24 14:26:36.176271 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:36.176241 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-service-ca-bundle podName:fbe79ce3-ee9a-4d46-a8e8-345e1e315824 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:37.176230055 +0000 UTC m=+133.306207593 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-service-ca-bundle") pod "router-default-dc7c85968-65n67" (UID: "fbe79ce3-ee9a-4d46-a8e8-345e1e315824") : configmap references non-existent config key: service-ca.crt Apr 24 14:26:36.814133 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:36.814092 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-mk446" event={"ID":"83b693cc-250b-45e7-b205-baf7f0feff6b","Type":"ContainerStarted","Data":"4cfe537a1792fd8f7ac64e9d048a57a7cf5b719292747e04fa5802dd55b227c4"} Apr 24 14:26:36.815291 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:36.815265 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-rkjfd" event={"ID":"6f64bec9-dd77-4222-8388-3b584743cfa7","Type":"ContainerStarted","Data":"cbdd985f209879fb29cafbd87653f8c62daad7f1fc9da26768eda359dd0b1adf"} Apr 24 14:26:36.817168 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:36.817118 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7dqjb" 
event={"ID":"ef39bb3c-1f6c-4174-b851-2464c20d74cf","Type":"ContainerStarted","Data":"a81e583d68a5db6f8d1bdef348a512ced0fd7906293711b61c7a7b3a8016e1f3"} Apr 24 14:26:36.817168 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:36.817149 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7dqjb" event={"ID":"ef39bb3c-1f6c-4174-b851-2464c20d74cf","Type":"ContainerStarted","Data":"341f7f0847c748c8580363e497e09c2289e440f0908b3ef5e71b942f3cc2dc6d"} Apr 24 14:26:36.834535 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:36.834483 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7dqjb" podStartSLOduration=1.8344663570000002 podStartE2EDuration="1.834466357s" podCreationTimestamp="2026-04-24 14:26:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:26:36.832244332 +0000 UTC m=+132.962221895" watchObservedRunningTime="2026-04-24 14:26:36.834466357 +0000 UTC m=+132.964443920" Apr 24 14:26:36.984154 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:36.984111 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/03107b23-78b0-454f-9952-be259de46a01-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jdhx4\" (UID: \"03107b23-78b0-454f-9952-be259de46a01\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jdhx4" Apr 24 14:26:36.984329 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:36.984283 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 14:26:36.984394 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:36.984352 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/03107b23-78b0-454f-9952-be259de46a01-networking-console-plugin-cert podName:03107b23-78b0-454f-9952-be259de46a01 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:38.984332638 +0000 UTC m=+135.114310178 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/03107b23-78b0-454f-9952-be259de46a01-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-jdhx4" (UID: "03107b23-78b0-454f-9952-be259de46a01") : secret "networking-console-plugin-cert" not found Apr 24 14:26:37.086250 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:37.085549 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/75b83f1b-1fca-4a30-867f-a76c5b6bfe4f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8jnpj\" (UID: \"75b83f1b-1fca-4a30-867f-a76c5b6bfe4f\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jnpj" Apr 24 14:26:37.086250 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:37.085727 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 14:26:37.086250 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:37.085800 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75b83f1b-1fca-4a30-867f-a76c5b6bfe4f-cluster-monitoring-operator-tls podName:75b83f1b-1fca-4a30-867f-a76c5b6bfe4f nodeName:}" failed. No retries permitted until 2026-04-24 14:26:39.085771468 +0000 UTC m=+135.215749017 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/75b83f1b-1fca-4a30-867f-a76c5b6bfe4f-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8jnpj" (UID: "75b83f1b-1fca-4a30-867f-a76c5b6bfe4f") : secret "cluster-monitoring-operator-tls" not found Apr 24 14:26:37.186964 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:37.186916 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-service-ca-bundle\") pod \"router-default-dc7c85968-65n67\" (UID: \"fbe79ce3-ee9a-4d46-a8e8-345e1e315824\") " pod="openshift-ingress/router-default-dc7c85968-65n67" Apr 24 14:26:37.187165 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:37.187039 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-metrics-certs\") pod \"router-default-dc7c85968-65n67\" (UID: \"fbe79ce3-ee9a-4d46-a8e8-345e1e315824\") " pod="openshift-ingress/router-default-dc7c85968-65n67" Apr 24 14:26:37.187165 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:37.187109 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-service-ca-bundle podName:fbe79ce3-ee9a-4d46-a8e8-345e1e315824 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:39.187085894 +0000 UTC m=+135.317063436 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-service-ca-bundle") pod "router-default-dc7c85968-65n67" (UID: "fbe79ce3-ee9a-4d46-a8e8-345e1e315824") : configmap references non-existent config key: service-ca.crt Apr 24 14:26:37.187293 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:37.187164 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 14:26:37.187293 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:37.187217 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-metrics-certs podName:fbe79ce3-ee9a-4d46-a8e8-345e1e315824 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:39.18720412 +0000 UTC m=+135.317181661 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-metrics-certs") pod "router-default-dc7c85968-65n67" (UID: "fbe79ce3-ee9a-4d46-a8e8-345e1e315824") : secret "router-metrics-certs-default" not found Apr 24 14:26:38.827070 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:38.827038 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mk446_83b693cc-250b-45e7-b205-baf7f0feff6b/console-operator/0.log" Apr 24 14:26:38.827523 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:38.827086 2571 generic.go:358] "Generic (PLEG): container finished" podID="83b693cc-250b-45e7-b205-baf7f0feff6b" containerID="9d29bb00059021c51cc7e674098864c24b9b79fa6933b86586e18c3735a90ad5" exitCode=255 Apr 24 14:26:38.827523 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:38.827159 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-mk446" 
event={"ID":"83b693cc-250b-45e7-b205-baf7f0feff6b","Type":"ContainerDied","Data":"9d29bb00059021c51cc7e674098864c24b9b79fa6933b86586e18c3735a90ad5"} Apr 24 14:26:38.827523 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:38.827446 2571 scope.go:117] "RemoveContainer" containerID="9d29bb00059021c51cc7e674098864c24b9b79fa6933b86586e18c3735a90ad5" Apr 24 14:26:38.828618 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:38.828586 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-rkjfd" event={"ID":"6f64bec9-dd77-4222-8388-3b584743cfa7","Type":"ContainerStarted","Data":"7413cb5e872e6f952ae18cb1a1d968d7113dee641ea2cf29e3132ae8bdc8ad79"} Apr 24 14:26:38.829860 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:38.829832 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g7vcx" event={"ID":"ead538d9-0357-4366-978f-3383d34778d6","Type":"ContainerStarted","Data":"4d17697beb954aa3899b03b7c058c270afe82899f27783226d258386b9acf05f"} Apr 24 14:26:38.859056 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:38.859012 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g7vcx" podStartSLOduration=1.3012455410000001 podStartE2EDuration="3.85899973s" podCreationTimestamp="2026-04-24 14:26:35 +0000 UTC" firstStartedPulling="2026-04-24 14:26:35.681129302 +0000 UTC m=+131.811106839" lastFinishedPulling="2026-04-24 14:26:38.238883483 +0000 UTC m=+134.368861028" observedRunningTime="2026-04-24 14:26:38.857998463 +0000 UTC m=+134.987976014" watchObservedRunningTime="2026-04-24 14:26:38.85899973 +0000 UTC m=+134.988977285" Apr 24 14:26:38.875137 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:38.875090 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-rkjfd" 
podStartSLOduration=1.4594073920000001 podStartE2EDuration="3.875072842s" podCreationTimestamp="2026-04-24 14:26:35 +0000 UTC" firstStartedPulling="2026-04-24 14:26:35.829223325 +0000 UTC m=+131.959200863" lastFinishedPulling="2026-04-24 14:26:38.244888762 +0000 UTC m=+134.374866313" observedRunningTime="2026-04-24 14:26:38.874074561 +0000 UTC m=+135.004052123" watchObservedRunningTime="2026-04-24 14:26:38.875072842 +0000 UTC m=+135.005050403" Apr 24 14:26:39.004752 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:39.004717 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/03107b23-78b0-454f-9952-be259de46a01-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jdhx4\" (UID: \"03107b23-78b0-454f-9952-be259de46a01\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jdhx4" Apr 24 14:26:39.004894 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:39.004864 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 14:26:39.005431 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:39.005045 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03107b23-78b0-454f-9952-be259de46a01-networking-console-plugin-cert podName:03107b23-78b0-454f-9952-be259de46a01 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:43.005011557 +0000 UTC m=+139.134989100 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/03107b23-78b0-454f-9952-be259de46a01-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-jdhx4" (UID: "03107b23-78b0-454f-9952-be259de46a01") : secret "networking-console-plugin-cert" not found Apr 24 14:26:39.106602 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:39.106568 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/75b83f1b-1fca-4a30-867f-a76c5b6bfe4f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8jnpj\" (UID: \"75b83f1b-1fca-4a30-867f-a76c5b6bfe4f\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jnpj" Apr 24 14:26:39.106764 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:39.106727 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 14:26:39.106839 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:39.106810 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75b83f1b-1fca-4a30-867f-a76c5b6bfe4f-cluster-monitoring-operator-tls podName:75b83f1b-1fca-4a30-867f-a76c5b6bfe4f nodeName:}" failed. No retries permitted until 2026-04-24 14:26:43.106789119 +0000 UTC m=+139.236766682 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/75b83f1b-1fca-4a30-867f-a76c5b6bfe4f-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8jnpj" (UID: "75b83f1b-1fca-4a30-867f-a76c5b6bfe4f") : secret "cluster-monitoring-operator-tls" not found Apr 24 14:26:39.207588 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:39.207555 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-service-ca-bundle\") pod \"router-default-dc7c85968-65n67\" (UID: \"fbe79ce3-ee9a-4d46-a8e8-345e1e315824\") " pod="openshift-ingress/router-default-dc7c85968-65n67" Apr 24 14:26:39.207724 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:39.207605 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-metrics-certs\") pod \"router-default-dc7c85968-65n67\" (UID: \"fbe79ce3-ee9a-4d46-a8e8-345e1e315824\") " pod="openshift-ingress/router-default-dc7c85968-65n67" Apr 24 14:26:39.207724 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:39.207683 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 14:26:39.207724 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:39.207711 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-service-ca-bundle podName:fbe79ce3-ee9a-4d46-a8e8-345e1e315824 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:43.207694317 +0000 UTC m=+139.337671856 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-service-ca-bundle") pod "router-default-dc7c85968-65n67" (UID: "fbe79ce3-ee9a-4d46-a8e8-345e1e315824") : configmap references non-existent config key: service-ca.crt Apr 24 14:26:39.207838 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:39.207791 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-metrics-certs podName:fbe79ce3-ee9a-4d46-a8e8-345e1e315824 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:43.207783983 +0000 UTC m=+139.337761521 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-metrics-certs") pod "router-default-dc7c85968-65n67" (UID: "fbe79ce3-ee9a-4d46-a8e8-345e1e315824") : secret "router-metrics-certs-default" not found Apr 24 14:26:39.834047 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:39.834017 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mk446_83b693cc-250b-45e7-b205-baf7f0feff6b/console-operator/1.log" Apr 24 14:26:39.834553 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:39.834414 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mk446_83b693cc-250b-45e7-b205-baf7f0feff6b/console-operator/0.log" Apr 24 14:26:39.834553 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:39.834459 2571 generic.go:358] "Generic (PLEG): container finished" podID="83b693cc-250b-45e7-b205-baf7f0feff6b" containerID="3a1b7252656112bc7616e3cc2ef751c4527a7d505e000f479dc7f84829f73698" exitCode=255 Apr 24 14:26:39.834553 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:39.834543 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-mk446" 
event={"ID":"83b693cc-250b-45e7-b205-baf7f0feff6b","Type":"ContainerDied","Data":"3a1b7252656112bc7616e3cc2ef751c4527a7d505e000f479dc7f84829f73698"} Apr 24 14:26:39.834704 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:39.834591 2571 scope.go:117] "RemoveContainer" containerID="9d29bb00059021c51cc7e674098864c24b9b79fa6933b86586e18c3735a90ad5" Apr 24 14:26:39.834801 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:39.834782 2571 scope.go:117] "RemoveContainer" containerID="3a1b7252656112bc7616e3cc2ef751c4527a7d505e000f479dc7f84829f73698" Apr 24 14:26:39.835030 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:39.835010 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-mk446_openshift-console-operator(83b693cc-250b-45e7-b205-baf7f0feff6b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-mk446" podUID="83b693cc-250b-45e7-b205-baf7f0feff6b" Apr 24 14:26:40.838399 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:40.838372 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mk446_83b693cc-250b-45e7-b205-baf7f0feff6b/console-operator/1.log" Apr 24 14:26:40.838770 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:40.838711 2571 scope.go:117] "RemoveContainer" containerID="3a1b7252656112bc7616e3cc2ef751c4527a7d505e000f479dc7f84829f73698" Apr 24 14:26:40.838894 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:40.838876 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-mk446_openshift-console-operator(83b693cc-250b-45e7-b205-baf7f0feff6b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-mk446" podUID="83b693cc-250b-45e7-b205-baf7f0feff6b" 
Apr 24 14:26:42.221564 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:42.221530 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-9d5rs"] Apr 24 14:26:42.223941 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:42.223925 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-9d5rs" Apr 24 14:26:42.226000 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:42.225958 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 24 14:26:42.226000 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:42.225976 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 24 14:26:42.226490 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:42.226475 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 24 14:26:42.226568 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:42.226509 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-9j588\"" Apr 24 14:26:42.226568 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:42.226539 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 24 14:26:42.231607 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:42.231585 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-9d5rs"] Apr 24 14:26:42.334943 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:42.334914 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7c5ec8d2-9c55-46d2-a524-ecf0e5b3091d-signing-key\") pod \"service-ca-865cb79987-9d5rs\" (UID: 
\"7c5ec8d2-9c55-46d2-a524-ecf0e5b3091d\") " pod="openshift-service-ca/service-ca-865cb79987-9d5rs" Apr 24 14:26:42.334943 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:42.334946 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6n2d\" (UniqueName: \"kubernetes.io/projected/7c5ec8d2-9c55-46d2-a524-ecf0e5b3091d-kube-api-access-n6n2d\") pod \"service-ca-865cb79987-9d5rs\" (UID: \"7c5ec8d2-9c55-46d2-a524-ecf0e5b3091d\") " pod="openshift-service-ca/service-ca-865cb79987-9d5rs" Apr 24 14:26:42.335119 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:42.335051 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7c5ec8d2-9c55-46d2-a524-ecf0e5b3091d-signing-cabundle\") pod \"service-ca-865cb79987-9d5rs\" (UID: \"7c5ec8d2-9c55-46d2-a524-ecf0e5b3091d\") " pod="openshift-service-ca/service-ca-865cb79987-9d5rs" Apr 24 14:26:42.435411 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:42.435378 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7c5ec8d2-9c55-46d2-a524-ecf0e5b3091d-signing-cabundle\") pod \"service-ca-865cb79987-9d5rs\" (UID: \"7c5ec8d2-9c55-46d2-a524-ecf0e5b3091d\") " pod="openshift-service-ca/service-ca-865cb79987-9d5rs" Apr 24 14:26:42.435537 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:42.435486 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7c5ec8d2-9c55-46d2-a524-ecf0e5b3091d-signing-key\") pod \"service-ca-865cb79987-9d5rs\" (UID: \"7c5ec8d2-9c55-46d2-a524-ecf0e5b3091d\") " pod="openshift-service-ca/service-ca-865cb79987-9d5rs" Apr 24 14:26:42.435537 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:42.435517 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-n6n2d\" (UniqueName: \"kubernetes.io/projected/7c5ec8d2-9c55-46d2-a524-ecf0e5b3091d-kube-api-access-n6n2d\") pod \"service-ca-865cb79987-9d5rs\" (UID: \"7c5ec8d2-9c55-46d2-a524-ecf0e5b3091d\") " pod="openshift-service-ca/service-ca-865cb79987-9d5rs" Apr 24 14:26:42.436149 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:42.436123 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7c5ec8d2-9c55-46d2-a524-ecf0e5b3091d-signing-cabundle\") pod \"service-ca-865cb79987-9d5rs\" (UID: \"7c5ec8d2-9c55-46d2-a524-ecf0e5b3091d\") " pod="openshift-service-ca/service-ca-865cb79987-9d5rs" Apr 24 14:26:42.437885 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:42.437865 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7c5ec8d2-9c55-46d2-a524-ecf0e5b3091d-signing-key\") pod \"service-ca-865cb79987-9d5rs\" (UID: \"7c5ec8d2-9c55-46d2-a524-ecf0e5b3091d\") " pod="openshift-service-ca/service-ca-865cb79987-9d5rs" Apr 24 14:26:42.444058 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:42.444035 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6n2d\" (UniqueName: \"kubernetes.io/projected/7c5ec8d2-9c55-46d2-a524-ecf0e5b3091d-kube-api-access-n6n2d\") pod \"service-ca-865cb79987-9d5rs\" (UID: \"7c5ec8d2-9c55-46d2-a524-ecf0e5b3091d\") " pod="openshift-service-ca/service-ca-865cb79987-9d5rs" Apr 24 14:26:42.533409 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:42.533336 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-9d5rs" Apr 24 14:26:42.642700 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:42.642668 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-9d5rs"] Apr 24 14:26:42.646610 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:26:42.646577 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c5ec8d2_9c55_46d2_a524_ecf0e5b3091d.slice/crio-9a4e03f1573d9987d85e860625dbb685d54b4cf6a96caf947ade6506bf9986f0 WatchSource:0}: Error finding container 9a4e03f1573d9987d85e860625dbb685d54b4cf6a96caf947ade6506bf9986f0: Status 404 returned error can't find the container with id 9a4e03f1573d9987d85e860625dbb685d54b4cf6a96caf947ade6506bf9986f0 Apr 24 14:26:42.690476 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:42.690453 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-bmpbb_b97fcfbf-fa5a-4b45-8446-f25172d545bb/dns-node-resolver/0.log" Apr 24 14:26:42.844784 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:42.844712 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-9d5rs" event={"ID":"7c5ec8d2-9c55-46d2-a524-ecf0e5b3091d","Type":"ContainerStarted","Data":"9a4e03f1573d9987d85e860625dbb685d54b4cf6a96caf947ade6506bf9986f0"} Apr 24 14:26:43.040775 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:43.040740 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/03107b23-78b0-454f-9952-be259de46a01-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jdhx4\" (UID: \"03107b23-78b0-454f-9952-be259de46a01\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jdhx4" Apr 24 14:26:43.040964 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:43.040870 2571 secret.go:189] 
Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 14:26:43.040964 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:43.040933 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03107b23-78b0-454f-9952-be259de46a01-networking-console-plugin-cert podName:03107b23-78b0-454f-9952-be259de46a01 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:51.04091665 +0000 UTC m=+147.170894188 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/03107b23-78b0-454f-9952-be259de46a01-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-jdhx4" (UID: "03107b23-78b0-454f-9952-be259de46a01") : secret "networking-console-plugin-cert" not found Apr 24 14:26:43.142322 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:43.142280 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/75b83f1b-1fca-4a30-867f-a76c5b6bfe4f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8jnpj\" (UID: \"75b83f1b-1fca-4a30-867f-a76c5b6bfe4f\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jnpj" Apr 24 14:26:43.142513 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:43.142479 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 14:26:43.142575 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:43.142560 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75b83f1b-1fca-4a30-867f-a76c5b6bfe4f-cluster-monitoring-operator-tls podName:75b83f1b-1fca-4a30-867f-a76c5b6bfe4f nodeName:}" failed. No retries permitted until 2026-04-24 14:26:51.142538929 +0000 UTC m=+147.272516469 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/75b83f1b-1fca-4a30-867f-a76c5b6bfe4f-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8jnpj" (UID: "75b83f1b-1fca-4a30-867f-a76c5b6bfe4f") : secret "cluster-monitoring-operator-tls" not found Apr 24 14:26:43.243303 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:43.243254 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-service-ca-bundle\") pod \"router-default-dc7c85968-65n67\" (UID: \"fbe79ce3-ee9a-4d46-a8e8-345e1e315824\") " pod="openshift-ingress/router-default-dc7c85968-65n67" Apr 24 14:26:43.243775 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:43.243344 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-metrics-certs\") pod \"router-default-dc7c85968-65n67\" (UID: \"fbe79ce3-ee9a-4d46-a8e8-345e1e315824\") " pod="openshift-ingress/router-default-dc7c85968-65n67" Apr 24 14:26:43.243775 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:43.243443 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-service-ca-bundle podName:fbe79ce3-ee9a-4d46-a8e8-345e1e315824 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:51.243419028 +0000 UTC m=+147.373396570 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-service-ca-bundle") pod "router-default-dc7c85968-65n67" (UID: "fbe79ce3-ee9a-4d46-a8e8-345e1e315824") : configmap references non-existent config key: service-ca.crt Apr 24 14:26:43.243775 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:43.243469 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 14:26:43.243775 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:43.243523 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-metrics-certs podName:fbe79ce3-ee9a-4d46-a8e8-345e1e315824 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:51.243505973 +0000 UTC m=+147.373483520 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-metrics-certs") pod "router-default-dc7c85968-65n67" (UID: "fbe79ce3-ee9a-4d46-a8e8-345e1e315824") : secret "router-metrics-certs-default" not found Apr 24 14:26:44.090509 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:44.090479 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-hgntn_87bcc27f-4b5f-48a3-9ae3-f93cb520eea0/node-ca/0.log" Apr 24 14:26:44.851356 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:44.851320 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-9d5rs" event={"ID":"7c5ec8d2-9c55-46d2-a524-ecf0e5b3091d","Type":"ContainerStarted","Data":"874a06b3483b9e1eca160101016070cf7e2c879eee4bf2aa93cd4149f5e6be96"} Apr 24 14:26:44.868187 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:44.868140 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-9d5rs" 
podStartSLOduration=1.305799467 podStartE2EDuration="2.868127831s" podCreationTimestamp="2026-04-24 14:26:42 +0000 UTC" firstStartedPulling="2026-04-24 14:26:42.648430722 +0000 UTC m=+138.778408264" lastFinishedPulling="2026-04-24 14:26:44.210759082 +0000 UTC m=+140.340736628" observedRunningTime="2026-04-24 14:26:44.867008655 +0000 UTC m=+140.996986206" watchObservedRunningTime="2026-04-24 14:26:44.868127831 +0000 UTC m=+140.998105390" Apr 24 14:26:45.693540 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:45.693501 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-mk446" Apr 24 14:26:45.693540 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:45.693545 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-mk446" Apr 24 14:26:45.693940 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:45.693927 2571 scope.go:117] "RemoveContainer" containerID="3a1b7252656112bc7616e3cc2ef751c4527a7d505e000f479dc7f84829f73698" Apr 24 14:26:45.694136 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:45.694118 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-mk446_openshift-console-operator(83b693cc-250b-45e7-b205-baf7f0feff6b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-mk446" podUID="83b693cc-250b-45e7-b205-baf7f0feff6b" Apr 24 14:26:51.109317 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:51.109278 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/03107b23-78b0-454f-9952-be259de46a01-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jdhx4\" (UID: \"03107b23-78b0-454f-9952-be259de46a01\") " 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-jdhx4" Apr 24 14:26:51.109674 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:51.109410 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 14:26:51.109674 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:51.109492 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03107b23-78b0-454f-9952-be259de46a01-networking-console-plugin-cert podName:03107b23-78b0-454f-9952-be259de46a01 nodeName:}" failed. No retries permitted until 2026-04-24 14:27:07.109476545 +0000 UTC m=+163.239454084 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/03107b23-78b0-454f-9952-be259de46a01-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-jdhx4" (UID: "03107b23-78b0-454f-9952-be259de46a01") : secret "networking-console-plugin-cert" not found Apr 24 14:26:51.209671 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:51.209633 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/75b83f1b-1fca-4a30-867f-a76c5b6bfe4f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8jnpj\" (UID: \"75b83f1b-1fca-4a30-867f-a76c5b6bfe4f\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jnpj" Apr 24 14:26:51.209837 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:51.209779 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 14:26:51.209881 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:51.209869 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75b83f1b-1fca-4a30-867f-a76c5b6bfe4f-cluster-monitoring-operator-tls 
podName:75b83f1b-1fca-4a30-867f-a76c5b6bfe4f nodeName:}" failed. No retries permitted until 2026-04-24 14:27:07.209851327 +0000 UTC m=+163.339828878 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/75b83f1b-1fca-4a30-867f-a76c5b6bfe4f-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8jnpj" (UID: "75b83f1b-1fca-4a30-867f-a76c5b6bfe4f") : secret "cluster-monitoring-operator-tls" not found Apr 24 14:26:51.311126 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:51.311093 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-service-ca-bundle\") pod \"router-default-dc7c85968-65n67\" (UID: \"fbe79ce3-ee9a-4d46-a8e8-345e1e315824\") " pod="openshift-ingress/router-default-dc7c85968-65n67" Apr 24 14:26:51.311284 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:51.311142 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-metrics-certs\") pod \"router-default-dc7c85968-65n67\" (UID: \"fbe79ce3-ee9a-4d46-a8e8-345e1e315824\") " pod="openshift-ingress/router-default-dc7c85968-65n67" Apr 24 14:26:51.311284 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:51.311251 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-service-ca-bundle podName:fbe79ce3-ee9a-4d46-a8e8-345e1e315824 nodeName:}" failed. No retries permitted until 2026-04-24 14:27:07.31123344 +0000 UTC m=+163.441210978 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-service-ca-bundle") pod "router-default-dc7c85968-65n67" (UID: "fbe79ce3-ee9a-4d46-a8e8-345e1e315824") : configmap references non-existent config key: service-ca.crt Apr 24 14:26:51.313468 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:51.313447 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-metrics-certs\") pod \"router-default-dc7c85968-65n67\" (UID: \"fbe79ce3-ee9a-4d46-a8e8-345e1e315824\") " pod="openshift-ingress/router-default-dc7c85968-65n67" Apr 24 14:26:58.456913 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:58.456882 2571 scope.go:117] "RemoveContainer" containerID="3a1b7252656112bc7616e3cc2ef751c4527a7d505e000f479dc7f84829f73698" Apr 24 14:26:58.889462 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:58.889434 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mk446_83b693cc-250b-45e7-b205-baf7f0feff6b/console-operator/2.log" Apr 24 14:26:58.889792 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:58.889775 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mk446_83b693cc-250b-45e7-b205-baf7f0feff6b/console-operator/1.log" Apr 24 14:26:58.889868 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:58.889818 2571 generic.go:358] "Generic (PLEG): container finished" podID="83b693cc-250b-45e7-b205-baf7f0feff6b" containerID="6732be76bbedea52241d1a2fc9f587759fe8fee20386133423760bdba6f73099" exitCode=255 Apr 24 14:26:58.889868 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:58.889847 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-mk446" 
event={"ID":"83b693cc-250b-45e7-b205-baf7f0feff6b","Type":"ContainerDied","Data":"6732be76bbedea52241d1a2fc9f587759fe8fee20386133423760bdba6f73099"} Apr 24 14:26:58.889939 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:58.889875 2571 scope.go:117] "RemoveContainer" containerID="3a1b7252656112bc7616e3cc2ef751c4527a7d505e000f479dc7f84829f73698" Apr 24 14:26:58.890223 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:58.890207 2571 scope.go:117] "RemoveContainer" containerID="6732be76bbedea52241d1a2fc9f587759fe8fee20386133423760bdba6f73099" Apr 24 14:26:58.890395 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:26:58.890376 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-mk446_openshift-console-operator(83b693cc-250b-45e7-b205-baf7f0feff6b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-mk446" podUID="83b693cc-250b-45e7-b205-baf7f0feff6b" Apr 24 14:26:59.893787 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:26:59.893738 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mk446_83b693cc-250b-45e7-b205-baf7f0feff6b/console-operator/2.log" Apr 24 14:27:01.243020 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:27:01.242968 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" podUID="0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00" Apr 24 14:27:01.264329 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:27:01.264297 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-752tj" 
podUID="5b9edb70-aaea-4a5d-bd77-289fb7865065" Apr 24 14:27:01.289672 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:27:01.289638 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-k2t2r" podUID="21ba7795-411f-46a3-93c5-fedef51a27ea" Apr 24 14:27:01.897875 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:01.897843 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-752tj" Apr 24 14:27:02.471893 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:27:02.471857 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-fsnj5" podUID="a022a0ca-5e80-43a6-8ee0-69dcf197d1a8" Apr 24 14:27:02.703206 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:02.703168 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-hp5nm"] Apr 24 14:27:02.705664 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:02.705649 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-hp5nm" Apr 24 14:27:02.707653 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:02.707628 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 14:27:02.708273 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:02.708256 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-kg8l8\"" Apr 24 14:27:02.708351 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:02.708298 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 14:27:02.716110 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:02.716087 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-hp5nm"] Apr 24 14:27:02.807467 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:02.807387 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/95fb1bf9-aa81-4ad5-950e-fec708dadd4a-crio-socket\") pod \"insights-runtime-extractor-hp5nm\" (UID: \"95fb1bf9-aa81-4ad5-950e-fec708dadd4a\") " pod="openshift-insights/insights-runtime-extractor-hp5nm" Apr 24 14:27:02.807467 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:02.807419 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/95fb1bf9-aa81-4ad5-950e-fec708dadd4a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hp5nm\" (UID: \"95fb1bf9-aa81-4ad5-950e-fec708dadd4a\") " pod="openshift-insights/insights-runtime-extractor-hp5nm" Apr 24 14:27:02.807467 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:02.807441 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/95fb1bf9-aa81-4ad5-950e-fec708dadd4a-data-volume\") pod \"insights-runtime-extractor-hp5nm\" (UID: \"95fb1bf9-aa81-4ad5-950e-fec708dadd4a\") " pod="openshift-insights/insights-runtime-extractor-hp5nm" Apr 24 14:27:02.807467 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:02.807458 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8c89\" (UniqueName: \"kubernetes.io/projected/95fb1bf9-aa81-4ad5-950e-fec708dadd4a-kube-api-access-g8c89\") pod \"insights-runtime-extractor-hp5nm\" (UID: \"95fb1bf9-aa81-4ad5-950e-fec708dadd4a\") " pod="openshift-insights/insights-runtime-extractor-hp5nm" Apr 24 14:27:02.807708 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:02.807525 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/95fb1bf9-aa81-4ad5-950e-fec708dadd4a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hp5nm\" (UID: \"95fb1bf9-aa81-4ad5-950e-fec708dadd4a\") " pod="openshift-insights/insights-runtime-extractor-hp5nm" Apr 24 14:27:02.908129 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:02.908097 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/95fb1bf9-aa81-4ad5-950e-fec708dadd4a-crio-socket\") pod \"insights-runtime-extractor-hp5nm\" (UID: \"95fb1bf9-aa81-4ad5-950e-fec708dadd4a\") " pod="openshift-insights/insights-runtime-extractor-hp5nm" Apr 24 14:27:02.908129 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:02.908132 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/95fb1bf9-aa81-4ad5-950e-fec708dadd4a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hp5nm\" 
(UID: \"95fb1bf9-aa81-4ad5-950e-fec708dadd4a\") " pod="openshift-insights/insights-runtime-extractor-hp5nm" Apr 24 14:27:02.908345 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:02.908153 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/95fb1bf9-aa81-4ad5-950e-fec708dadd4a-data-volume\") pod \"insights-runtime-extractor-hp5nm\" (UID: \"95fb1bf9-aa81-4ad5-950e-fec708dadd4a\") " pod="openshift-insights/insights-runtime-extractor-hp5nm" Apr 24 14:27:02.908345 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:02.908177 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g8c89\" (UniqueName: \"kubernetes.io/projected/95fb1bf9-aa81-4ad5-950e-fec708dadd4a-kube-api-access-g8c89\") pod \"insights-runtime-extractor-hp5nm\" (UID: \"95fb1bf9-aa81-4ad5-950e-fec708dadd4a\") " pod="openshift-insights/insights-runtime-extractor-hp5nm" Apr 24 14:27:02.908345 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:02.908228 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/95fb1bf9-aa81-4ad5-950e-fec708dadd4a-crio-socket\") pod \"insights-runtime-extractor-hp5nm\" (UID: \"95fb1bf9-aa81-4ad5-950e-fec708dadd4a\") " pod="openshift-insights/insights-runtime-extractor-hp5nm" Apr 24 14:27:02.908477 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:02.908456 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/95fb1bf9-aa81-4ad5-950e-fec708dadd4a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hp5nm\" (UID: \"95fb1bf9-aa81-4ad5-950e-fec708dadd4a\") " pod="openshift-insights/insights-runtime-extractor-hp5nm" Apr 24 14:27:02.908595 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:02.908455 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/95fb1bf9-aa81-4ad5-950e-fec708dadd4a-data-volume\") pod \"insights-runtime-extractor-hp5nm\" (UID: \"95fb1bf9-aa81-4ad5-950e-fec708dadd4a\") " pod="openshift-insights/insights-runtime-extractor-hp5nm" Apr 24 14:27:02.909308 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:02.909291 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/95fb1bf9-aa81-4ad5-950e-fec708dadd4a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hp5nm\" (UID: \"95fb1bf9-aa81-4ad5-950e-fec708dadd4a\") " pod="openshift-insights/insights-runtime-extractor-hp5nm" Apr 24 14:27:02.911018 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:02.910997 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/95fb1bf9-aa81-4ad5-950e-fec708dadd4a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hp5nm\" (UID: \"95fb1bf9-aa81-4ad5-950e-fec708dadd4a\") " pod="openshift-insights/insights-runtime-extractor-hp5nm" Apr 24 14:27:02.920956 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:02.920933 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8c89\" (UniqueName: \"kubernetes.io/projected/95fb1bf9-aa81-4ad5-950e-fec708dadd4a-kube-api-access-g8c89\") pod \"insights-runtime-extractor-hp5nm\" (UID: \"95fb1bf9-aa81-4ad5-950e-fec708dadd4a\") " pod="openshift-insights/insights-runtime-extractor-hp5nm" Apr 24 14:27:03.013908 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:03.013874 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-hp5nm" Apr 24 14:27:03.141958 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:03.141932 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-hp5nm"] Apr 24 14:27:03.146028 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:27:03.145979 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95fb1bf9_aa81_4ad5_950e_fec708dadd4a.slice/crio-927e0f28f5d9eb0bcc0135c29955ae270e26a1859e7cb2ae578e1fa8a349c058 WatchSource:0}: Error finding container 927e0f28f5d9eb0bcc0135c29955ae270e26a1859e7cb2ae578e1fa8a349c058: Status 404 returned error can't find the container with id 927e0f28f5d9eb0bcc0135c29955ae270e26a1859e7cb2ae578e1fa8a349c058 Apr 24 14:27:03.904426 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:03.904386 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hp5nm" event={"ID":"95fb1bf9-aa81-4ad5-950e-fec708dadd4a","Type":"ContainerStarted","Data":"43a12c71cf8192f6f5e63970c518cbb678d1d762d73c788fb6c4188d01eee958"} Apr 24 14:27:03.904426 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:03.904422 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hp5nm" event={"ID":"95fb1bf9-aa81-4ad5-950e-fec708dadd4a","Type":"ContainerStarted","Data":"ad01096e8aff9cbeb29cf7bc6a0f059ea09081ffe9a307eeaddb38a2089317cc"} Apr 24 14:27:03.904426 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:03.904431 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hp5nm" event={"ID":"95fb1bf9-aa81-4ad5-950e-fec708dadd4a","Type":"ContainerStarted","Data":"927e0f28f5d9eb0bcc0135c29955ae270e26a1859e7cb2ae578e1fa8a349c058"} Apr 24 14:27:05.693752 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:05.693713 2571 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-mk446" Apr 24 14:27:05.693752 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:05.693755 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-mk446" Apr 24 14:27:05.694235 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:05.694116 2571 scope.go:117] "RemoveContainer" containerID="6732be76bbedea52241d1a2fc9f587759fe8fee20386133423760bdba6f73099" Apr 24 14:27:05.694314 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:27:05.694297 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-mk446_openshift-console-operator(83b693cc-250b-45e7-b205-baf7f0feff6b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-mk446" podUID="83b693cc-250b-45e7-b205-baf7f0feff6b" Apr 24 14:27:05.910925 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:05.910887 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hp5nm" event={"ID":"95fb1bf9-aa81-4ad5-950e-fec708dadd4a","Type":"ContainerStarted","Data":"09f3971dbb7158b437877374143c6b6ddbfa19316f5b2b3023e9e7ae779879bb"} Apr 24 14:27:05.934176 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:05.934127 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-hp5nm" podStartSLOduration=2.030544612 podStartE2EDuration="3.934110501s" podCreationTimestamp="2026-04-24 14:27:02 +0000 UTC" firstStartedPulling="2026-04-24 14:27:03.193862977 +0000 UTC m=+159.323840518" lastFinishedPulling="2026-04-24 14:27:05.097428852 +0000 UTC m=+161.227406407" observedRunningTime="2026-04-24 14:27:05.932909466 +0000 UTC m=+162.062887025" watchObservedRunningTime="2026-04-24 14:27:05.934110501 +0000 
UTC m=+162.064088058" Apr 24 14:27:06.237525 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:06.237492 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21ba7795-411f-46a3-93c5-fedef51a27ea-cert\") pod \"ingress-canary-k2t2r\" (UID: \"21ba7795-411f-46a3-93c5-fedef51a27ea\") " pod="openshift-ingress-canary/ingress-canary-k2t2r" Apr 24 14:27:06.237690 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:06.237535 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b9edb70-aaea-4a5d-bd77-289fb7865065-metrics-tls\") pod \"dns-default-752tj\" (UID: \"5b9edb70-aaea-4a5d-bd77-289fb7865065\") " pod="openshift-dns/dns-default-752tj" Apr 24 14:27:06.237690 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:06.237568 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-registry-tls\") pod \"image-registry-55fdbcc56d-lrqtj\" (UID: \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" Apr 24 14:27:06.239909 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:06.239872 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b9edb70-aaea-4a5d-bd77-289fb7865065-metrics-tls\") pod \"dns-default-752tj\" (UID: \"5b9edb70-aaea-4a5d-bd77-289fb7865065\") " pod="openshift-dns/dns-default-752tj" Apr 24 14:27:06.240073 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:06.239926 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21ba7795-411f-46a3-93c5-fedef51a27ea-cert\") pod \"ingress-canary-k2t2r\" (UID: \"21ba7795-411f-46a3-93c5-fedef51a27ea\") " pod="openshift-ingress-canary/ingress-canary-k2t2r" Apr 24 14:27:06.240141 ip-10-0-137-95 
kubenswrapper[2571]: I0424 14:27:06.240108 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-registry-tls\") pod \"image-registry-55fdbcc56d-lrqtj\" (UID: \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" Apr 24 14:27:06.401110 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:06.401072 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7bxm2\"" Apr 24 14:27:06.409649 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:06.409627 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-752tj" Apr 24 14:27:06.553761 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:06.553730 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-752tj"] Apr 24 14:27:06.556805 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:27:06.556776 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b9edb70_aaea_4a5d_bd77_289fb7865065.slice/crio-7d63a2040f2d011341092e4feb1ce16f625d0341f560a19ee07f43b2097a52ce WatchSource:0}: Error finding container 7d63a2040f2d011341092e4feb1ce16f625d0341f560a19ee07f43b2097a52ce: Status 404 returned error can't find the container with id 7d63a2040f2d011341092e4feb1ce16f625d0341f560a19ee07f43b2097a52ce Apr 24 14:27:06.914148 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:06.914106 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-752tj" event={"ID":"5b9edb70-aaea-4a5d-bd77-289fb7865065","Type":"ContainerStarted","Data":"7d63a2040f2d011341092e4feb1ce16f625d0341f560a19ee07f43b2097a52ce"} Apr 24 14:27:07.147380 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:07.147331 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/03107b23-78b0-454f-9952-be259de46a01-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jdhx4\" (UID: \"03107b23-78b0-454f-9952-be259de46a01\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jdhx4" Apr 24 14:27:07.150116 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:07.150080 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/03107b23-78b0-454f-9952-be259de46a01-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jdhx4\" (UID: \"03107b23-78b0-454f-9952-be259de46a01\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jdhx4" Apr 24 14:27:07.248592 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:07.248511 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/75b83f1b-1fca-4a30-867f-a76c5b6bfe4f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8jnpj\" (UID: \"75b83f1b-1fca-4a30-867f-a76c5b6bfe4f\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jnpj" Apr 24 14:27:07.251297 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:07.251270 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/75b83f1b-1fca-4a30-867f-a76c5b6bfe4f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8jnpj\" (UID: \"75b83f1b-1fca-4a30-867f-a76c5b6bfe4f\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jnpj" Apr 24 14:27:07.350034 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:07.349975 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-service-ca-bundle\") pod \"router-default-dc7c85968-65n67\" (UID: \"fbe79ce3-ee9a-4d46-a8e8-345e1e315824\") " pod="openshift-ingress/router-default-dc7c85968-65n67" Apr 24 14:27:07.350631 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:07.350609 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fbe79ce3-ee9a-4d46-a8e8-345e1e315824-service-ca-bundle\") pod \"router-default-dc7c85968-65n67\" (UID: \"fbe79ce3-ee9a-4d46-a8e8-345e1e315824\") " pod="openshift-ingress/router-default-dc7c85968-65n67" Apr 24 14:27:07.365917 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:07.365883 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-jdhx4" Apr 24 14:27:07.482285 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:07.482253 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jnpj" Apr 24 14:27:07.570528 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:07.570439 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-dc7c85968-65n67" Apr 24 14:27:07.816828 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:07.816801 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-dc7c85968-65n67"] Apr 24 14:27:07.824366 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:27:07.824333 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbe79ce3_ee9a_4d46_a8e8_345e1e315824.slice/crio-923b165c00f858a3e73726fa028df9851202a0f5cacd7f8b8010c205c459ef36 WatchSource:0}: Error finding container 923b165c00f858a3e73726fa028df9851202a0f5cacd7f8b8010c205c459ef36: Status 404 returned error can't find the container with id 923b165c00f858a3e73726fa028df9851202a0f5cacd7f8b8010c205c459ef36 Apr 24 14:27:07.833086 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:07.833065 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-8jnpj"] Apr 24 14:27:07.836159 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:27:07.836133 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75b83f1b_1fca_4a30_867f_a76c5b6bfe4f.slice/crio-34161e85feabc5af2944c150c416f318fdf12c634cb0f13f8abc61b9b0145e5e WatchSource:0}: Error finding container 34161e85feabc5af2944c150c416f318fdf12c634cb0f13f8abc61b9b0145e5e: Status 404 returned error can't find the container with id 34161e85feabc5af2944c150c416f318fdf12c634cb0f13f8abc61b9b0145e5e Apr 24 14:27:07.847238 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:07.847050 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-jdhx4"] Apr 24 14:27:07.854527 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:27:07.854502 2571 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03107b23_78b0_454f_9952_be259de46a01.slice/crio-2edd20230c446ee60faa56f66a4fe132de929e332de939a8620705faa02c3b5e WatchSource:0}: Error finding container 2edd20230c446ee60faa56f66a4fe132de929e332de939a8620705faa02c3b5e: Status 404 returned error can't find the container with id 2edd20230c446ee60faa56f66a4fe132de929e332de939a8620705faa02c3b5e Apr 24 14:27:07.919918 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:07.919853 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-jdhx4" event={"ID":"03107b23-78b0-454f-9952-be259de46a01","Type":"ContainerStarted","Data":"2edd20230c446ee60faa56f66a4fe132de929e332de939a8620705faa02c3b5e"} Apr 24 14:27:07.922695 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:07.921864 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-dc7c85968-65n67" event={"ID":"fbe79ce3-ee9a-4d46-a8e8-345e1e315824","Type":"ContainerStarted","Data":"163235da3f34f275a2b3863f9146f6bb36cc645e913e3e0547748e1b6977ed64"} Apr 24 14:27:07.922695 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:07.921902 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-dc7c85968-65n67" event={"ID":"fbe79ce3-ee9a-4d46-a8e8-345e1e315824","Type":"ContainerStarted","Data":"923b165c00f858a3e73726fa028df9851202a0f5cacd7f8b8010c205c459ef36"} Apr 24 14:27:07.924347 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:07.924299 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jnpj" event={"ID":"75b83f1b-1fca-4a30-867f-a76c5b6bfe4f","Type":"ContainerStarted","Data":"34161e85feabc5af2944c150c416f318fdf12c634cb0f13f8abc61b9b0145e5e"} Apr 24 14:27:07.927256 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:07.927234 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-752tj" 
event={"ID":"5b9edb70-aaea-4a5d-bd77-289fb7865065","Type":"ContainerStarted","Data":"034946aaba861686c0404435dd00c4a35f4c89d6d71a38a4cdd5f2e5ed83f30f"} Apr 24 14:27:07.944701 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:07.944396 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-dc7c85968-65n67" podStartSLOduration=32.944379675 podStartE2EDuration="32.944379675s" podCreationTimestamp="2026-04-24 14:26:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:27:07.943104123 +0000 UTC m=+164.073081695" watchObservedRunningTime="2026-04-24 14:27:07.944379675 +0000 UTC m=+164.074357235" Apr 24 14:27:08.571464 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:08.571432 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-dc7c85968-65n67" Apr 24 14:27:08.574717 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:08.574687 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-dc7c85968-65n67" Apr 24 14:27:08.931838 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:08.931799 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-752tj" event={"ID":"5b9edb70-aaea-4a5d-bd77-289fb7865065","Type":"ContainerStarted","Data":"4c00901920eae9dc90d86459f496dc858d1974d2797757e09ecc02466306413d"} Apr 24 14:27:08.932300 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:08.932008 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-752tj" Apr 24 14:27:08.933267 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:08.933238 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-jdhx4" 
event={"ID":"03107b23-78b0-454f-9952-be259de46a01","Type":"ContainerStarted","Data":"b6f0767116a629afeaf6c49aab1201290e88b38904429605e081a623f94ec914"} Apr 24 14:27:08.933583 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:08.933557 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-dc7c85968-65n67" Apr 24 14:27:08.934873 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:08.934853 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-dc7c85968-65n67" Apr 24 14:27:08.962531 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:08.962482 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-752tj" podStartSLOduration=129.8262812 podStartE2EDuration="2m10.962466785s" podCreationTimestamp="2026-04-24 14:24:58 +0000 UTC" firstStartedPulling="2026-04-24 14:27:06.55924474 +0000 UTC m=+162.689222282" lastFinishedPulling="2026-04-24 14:27:07.69543031 +0000 UTC m=+163.825407867" observedRunningTime="2026-04-24 14:27:08.96120947 +0000 UTC m=+165.091187030" watchObservedRunningTime="2026-04-24 14:27:08.962466785 +0000 UTC m=+165.092444543" Apr 24 14:27:08.985866 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:08.985822 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-jdhx4" podStartSLOduration=33.113370523 podStartE2EDuration="33.985802631s" podCreationTimestamp="2026-04-24 14:26:35 +0000 UTC" firstStartedPulling="2026-04-24 14:27:07.857531608 +0000 UTC m=+163.987509153" lastFinishedPulling="2026-04-24 14:27:08.729963709 +0000 UTC m=+164.859941261" observedRunningTime="2026-04-24 14:27:08.985255754 +0000 UTC m=+165.115233314" watchObservedRunningTime="2026-04-24 14:27:08.985802631 +0000 UTC m=+165.115780212" Apr 24 14:27:09.937193 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:09.937105 2571 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jnpj" event={"ID":"75b83f1b-1fca-4a30-867f-a76c5b6bfe4f","Type":"ContainerStarted","Data":"263fad8026b96711f87b5b3a43a9a66d429bad981e340c222e42c9b2b0f148c9"} Apr 24 14:27:09.956521 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:09.956466 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jnpj" podStartSLOduration=33.109292022 podStartE2EDuration="34.956451376s" podCreationTimestamp="2026-04-24 14:26:35 +0000 UTC" firstStartedPulling="2026-04-24 14:27:07.838443282 +0000 UTC m=+163.968420820" lastFinishedPulling="2026-04-24 14:27:09.685602624 +0000 UTC m=+165.815580174" observedRunningTime="2026-04-24 14:27:09.955149139 +0000 UTC m=+166.085126700" watchObservedRunningTime="2026-04-24 14:27:09.956451376 +0000 UTC m=+166.086428936" Apr 24 14:27:13.456810 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:13.456784 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" Apr 24 14:27:13.459136 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:13.459119 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-tpm4v\"" Apr 24 14:27:13.467989 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:13.467969 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" Apr 24 14:27:13.588741 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:13.588707 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-55fdbcc56d-lrqtj"] Apr 24 14:27:13.591864 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:27:13.591834 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fbd6e7d_aa77_4475_bd4c_8a0d6edc0a00.slice/crio-a7d3711cd996ba3b6ed557b8f67c61cdf2b14103e42bf0ccfc3730cfe1aab0ae WatchSource:0}: Error finding container a7d3711cd996ba3b6ed557b8f67c61cdf2b14103e42bf0ccfc3730cfe1aab0ae: Status 404 returned error can't find the container with id a7d3711cd996ba3b6ed557b8f67c61cdf2b14103e42bf0ccfc3730cfe1aab0ae Apr 24 14:27:13.947207 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:13.947166 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" event={"ID":"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00","Type":"ContainerStarted","Data":"53933d2b06bfabdba52abb4ab04778904a0ff9b27843acaf7997c92ae90a8cbb"} Apr 24 14:27:13.947207 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:13.947208 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" event={"ID":"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00","Type":"ContainerStarted","Data":"a7d3711cd996ba3b6ed557b8f67c61cdf2b14103e42bf0ccfc3730cfe1aab0ae"} Apr 24 14:27:13.947424 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:13.947241 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" Apr 24 14:27:13.967400 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:13.967351 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" 
podStartSLOduration=149.96733641 podStartE2EDuration="2m29.96733641s" podCreationTimestamp="2026-04-24 14:24:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:27:13.966728448 +0000 UTC m=+170.096706021" watchObservedRunningTime="2026-04-24 14:27:13.96733641 +0000 UTC m=+170.097313969" Apr 24 14:27:15.456256 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:15.456206 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-k2t2r" Apr 24 14:27:15.456256 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:15.456238 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fsnj5" Apr 24 14:27:15.458615 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:15.458599 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xfc9t\"" Apr 24 14:27:15.467321 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:15.467304 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-k2t2r" Apr 24 14:27:15.578305 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:15.578277 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-k2t2r"] Apr 24 14:27:15.581426 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:27:15.581399 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21ba7795_411f_46a3_93c5_fedef51a27ea.slice/crio-e852a5d5bb2083e2916cf373a4290df7a11a543f65eaf9f15bf264a89f21cda5 WatchSource:0}: Error finding container e852a5d5bb2083e2916cf373a4290df7a11a543f65eaf9f15bf264a89f21cda5: Status 404 returned error can't find the container with id e852a5d5bb2083e2916cf373a4290df7a11a543f65eaf9f15bf264a89f21cda5 Apr 24 14:27:15.955724 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:15.955688 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-k2t2r" event={"ID":"21ba7795-411f-46a3-93c5-fedef51a27ea","Type":"ContainerStarted","Data":"e852a5d5bb2083e2916cf373a4290df7a11a543f65eaf9f15bf264a89f21cda5"} Apr 24 14:27:17.457172 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.457141 2571 scope.go:117] "RemoveContainer" containerID="6732be76bbedea52241d1a2fc9f587759fe8fee20386133423760bdba6f73099" Apr 24 14:27:17.457628 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:27:17.457322 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-mk446_openshift-console-operator(83b693cc-250b-45e7-b205-baf7f0feff6b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-mk446" podUID="83b693cc-250b-45e7-b205-baf7f0feff6b" Apr 24 14:27:17.652099 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.652057 2571 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-9g2xl"] Apr 24 14:27:17.654709 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.654692 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9g2xl" Apr 24 14:27:17.657153 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.657129 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 24 14:27:17.657285 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.657153 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 24 14:27:17.657765 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.657749 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-gxtgz\"" Apr 24 14:27:17.657874 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.657856 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 14:27:17.671923 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.671896 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-9g2xl"] Apr 24 14:27:17.677226 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.677201 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-bp2rg"] Apr 24 14:27:17.679967 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.679944 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-bp2rg" Apr 24 14:27:17.681944 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.681924 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-6r7wf\"" Apr 24 14:27:17.682431 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.682406 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 24 14:27:17.682511 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.682472 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 24 14:27:17.682770 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.682754 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 24 14:27:17.689077 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.689058 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-v9p7d"] Apr 24 14:27:17.691138 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.691119 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-v9p7d" Apr 24 14:27:17.692018 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.691974 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-bp2rg"] Apr 24 14:27:17.692968 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.692946 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 14:27:17.693571 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.693544 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 14:27:17.693942 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.693922 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 14:27:17.694075 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.693927 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-6zzcr\"" Apr 24 14:27:17.742033 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.741947 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9dmx\" (UniqueName: \"kubernetes.io/projected/b0493229-a011-4a74-9ccf-6fde5588cde8-kube-api-access-s9dmx\") pod \"openshift-state-metrics-9d44df66c-9g2xl\" (UID: \"b0493229-a011-4a74-9ccf-6fde5588cde8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9g2xl" Apr 24 14:27:17.742033 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.742004 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b0493229-a011-4a74-9ccf-6fde5588cde8-openshift-state-metrics-kube-rbac-proxy-config\") pod 
\"openshift-state-metrics-9d44df66c-9g2xl\" (UID: \"b0493229-a011-4a74-9ccf-6fde5588cde8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9g2xl" Apr 24 14:27:17.742033 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.742033 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b0493229-a011-4a74-9ccf-6fde5588cde8-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-9g2xl\" (UID: \"b0493229-a011-4a74-9ccf-6fde5588cde8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9g2xl" Apr 24 14:27:17.742276 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.742053 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b0493229-a011-4a74-9ccf-6fde5588cde8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-9g2xl\" (UID: \"b0493229-a011-4a74-9ccf-6fde5588cde8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9g2xl" Apr 24 14:27:17.843225 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.843195 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2acef0d4-dc55-4e0c-8027-950b1d7e2fb1-sys\") pod \"node-exporter-v9p7d\" (UID: \"2acef0d4-dc55-4e0c-8027-950b1d7e2fb1\") " pod="openshift-monitoring/node-exporter-v9p7d" Apr 24 14:27:17.843398 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.843236 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2acef0d4-dc55-4e0c-8027-950b1d7e2fb1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-v9p7d\" (UID: \"2acef0d4-dc55-4e0c-8027-950b1d7e2fb1\") " pod="openshift-monitoring/node-exporter-v9p7d" Apr 24 14:27:17.843398 
ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.843283 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/76bc4d18-59d9-46a2-97c6-4b157dd5bd77-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-bp2rg\" (UID: \"76bc4d18-59d9-46a2-97c6-4b157dd5bd77\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bp2rg" Apr 24 14:27:17.843398 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.843309 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/76bc4d18-59d9-46a2-97c6-4b157dd5bd77-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-bp2rg\" (UID: \"76bc4d18-59d9-46a2-97c6-4b157dd5bd77\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bp2rg" Apr 24 14:27:17.843398 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.843331 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2acef0d4-dc55-4e0c-8027-950b1d7e2fb1-node-exporter-wtmp\") pod \"node-exporter-v9p7d\" (UID: \"2acef0d4-dc55-4e0c-8027-950b1d7e2fb1\") " pod="openshift-monitoring/node-exporter-v9p7d" Apr 24 14:27:17.843398 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.843349 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q79ls\" (UniqueName: \"kubernetes.io/projected/76bc4d18-59d9-46a2-97c6-4b157dd5bd77-kube-api-access-q79ls\") pod \"kube-state-metrics-69db897b98-bp2rg\" (UID: \"76bc4d18-59d9-46a2-97c6-4b157dd5bd77\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bp2rg" Apr 24 14:27:17.843653 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.843402 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2acef0d4-dc55-4e0c-8027-950b1d7e2fb1-root\") pod \"node-exporter-v9p7d\" (UID: \"2acef0d4-dc55-4e0c-8027-950b1d7e2fb1\") " pod="openshift-monitoring/node-exporter-v9p7d" Apr 24 14:27:17.843653 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.843431 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/76bc4d18-59d9-46a2-97c6-4b157dd5bd77-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-bp2rg\" (UID: \"76bc4d18-59d9-46a2-97c6-4b157dd5bd77\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bp2rg" Apr 24 14:27:17.843653 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.843471 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/76bc4d18-59d9-46a2-97c6-4b157dd5bd77-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-bp2rg\" (UID: \"76bc4d18-59d9-46a2-97c6-4b157dd5bd77\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bp2rg" Apr 24 14:27:17.843653 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.843511 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2acef0d4-dc55-4e0c-8027-950b1d7e2fb1-node-exporter-accelerators-collector-config\") pod \"node-exporter-v9p7d\" (UID: \"2acef0d4-dc55-4e0c-8027-950b1d7e2fb1\") " pod="openshift-monitoring/node-exporter-v9p7d" Apr 24 14:27:17.843653 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.843563 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9dmx\" (UniqueName: \"kubernetes.io/projected/b0493229-a011-4a74-9ccf-6fde5588cde8-kube-api-access-s9dmx\") pod 
\"openshift-state-metrics-9d44df66c-9g2xl\" (UID: \"b0493229-a011-4a74-9ccf-6fde5588cde8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9g2xl" Apr 24 14:27:17.843653 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.843608 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2acef0d4-dc55-4e0c-8027-950b1d7e2fb1-metrics-client-ca\") pod \"node-exporter-v9p7d\" (UID: \"2acef0d4-dc55-4e0c-8027-950b1d7e2fb1\") " pod="openshift-monitoring/node-exporter-v9p7d" Apr 24 14:27:17.843653 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.843633 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2acef0d4-dc55-4e0c-8027-950b1d7e2fb1-node-exporter-textfile\") pod \"node-exporter-v9p7d\" (UID: \"2acef0d4-dc55-4e0c-8027-950b1d7e2fb1\") " pod="openshift-monitoring/node-exporter-v9p7d" Apr 24 14:27:17.844006 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.843659 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/76bc4d18-59d9-46a2-97c6-4b157dd5bd77-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-bp2rg\" (UID: \"76bc4d18-59d9-46a2-97c6-4b157dd5bd77\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bp2rg" Apr 24 14:27:17.844006 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.843708 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2acef0d4-dc55-4e0c-8027-950b1d7e2fb1-node-exporter-tls\") pod \"node-exporter-v9p7d\" (UID: \"2acef0d4-dc55-4e0c-8027-950b1d7e2fb1\") " pod="openshift-monitoring/node-exporter-v9p7d" Apr 24 14:27:17.844006 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.843762 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b0493229-a011-4a74-9ccf-6fde5588cde8-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-9g2xl\" (UID: \"b0493229-a011-4a74-9ccf-6fde5588cde8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9g2xl" Apr 24 14:27:17.844006 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.843790 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b0493229-a011-4a74-9ccf-6fde5588cde8-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-9g2xl\" (UID: \"b0493229-a011-4a74-9ccf-6fde5588cde8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9g2xl" Apr 24 14:27:17.844006 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.843818 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b0493229-a011-4a74-9ccf-6fde5588cde8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-9g2xl\" (UID: \"b0493229-a011-4a74-9ccf-6fde5588cde8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9g2xl" Apr 24 14:27:17.844006 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.843847 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-278p6\" (UniqueName: \"kubernetes.io/projected/2acef0d4-dc55-4e0c-8027-950b1d7e2fb1-kube-api-access-278p6\") pod \"node-exporter-v9p7d\" (UID: \"2acef0d4-dc55-4e0c-8027-950b1d7e2fb1\") " pod="openshift-monitoring/node-exporter-v9p7d" Apr 24 14:27:17.844423 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.844403 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/b0493229-a011-4a74-9ccf-6fde5588cde8-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-9g2xl\" (UID: \"b0493229-a011-4a74-9ccf-6fde5588cde8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9g2xl" Apr 24 14:27:17.846235 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.846213 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b0493229-a011-4a74-9ccf-6fde5588cde8-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-9g2xl\" (UID: \"b0493229-a011-4a74-9ccf-6fde5588cde8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9g2xl" Apr 24 14:27:17.846364 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.846285 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b0493229-a011-4a74-9ccf-6fde5588cde8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-9g2xl\" (UID: \"b0493229-a011-4a74-9ccf-6fde5588cde8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9g2xl" Apr 24 14:27:17.852022 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.851980 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9dmx\" (UniqueName: \"kubernetes.io/projected/b0493229-a011-4a74-9ccf-6fde5588cde8-kube-api-access-s9dmx\") pod \"openshift-state-metrics-9d44df66c-9g2xl\" (UID: \"b0493229-a011-4a74-9ccf-6fde5588cde8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9g2xl" Apr 24 14:27:17.944236 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.944201 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2acef0d4-dc55-4e0c-8027-950b1d7e2fb1-node-exporter-tls\") pod \"node-exporter-v9p7d\" (UID: \"2acef0d4-dc55-4e0c-8027-950b1d7e2fb1\") " 
pod="openshift-monitoring/node-exporter-v9p7d" Apr 24 14:27:17.944401 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.944255 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-278p6\" (UniqueName: \"kubernetes.io/projected/2acef0d4-dc55-4e0c-8027-950b1d7e2fb1-kube-api-access-278p6\") pod \"node-exporter-v9p7d\" (UID: \"2acef0d4-dc55-4e0c-8027-950b1d7e2fb1\") " pod="openshift-monitoring/node-exporter-v9p7d" Apr 24 14:27:17.944401 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.944291 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2acef0d4-dc55-4e0c-8027-950b1d7e2fb1-sys\") pod \"node-exporter-v9p7d\" (UID: \"2acef0d4-dc55-4e0c-8027-950b1d7e2fb1\") " pod="openshift-monitoring/node-exporter-v9p7d" Apr 24 14:27:17.944401 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.944316 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2acef0d4-dc55-4e0c-8027-950b1d7e2fb1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-v9p7d\" (UID: \"2acef0d4-dc55-4e0c-8027-950b1d7e2fb1\") " pod="openshift-monitoring/node-exporter-v9p7d" Apr 24 14:27:17.944401 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:27:17.944339 2571 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 24 14:27:17.944613 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.944403 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2acef0d4-dc55-4e0c-8027-950b1d7e2fb1-sys\") pod \"node-exporter-v9p7d\" (UID: \"2acef0d4-dc55-4e0c-8027-950b1d7e2fb1\") " pod="openshift-monitoring/node-exporter-v9p7d" Apr 24 14:27:17.944613 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:27:17.944409 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/2acef0d4-dc55-4e0c-8027-950b1d7e2fb1-node-exporter-tls podName:2acef0d4-dc55-4e0c-8027-950b1d7e2fb1 nodeName:}" failed. No retries permitted until 2026-04-24 14:27:18.444387312 +0000 UTC m=+174.574364867 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/2acef0d4-dc55-4e0c-8027-950b1d7e2fb1-node-exporter-tls") pod "node-exporter-v9p7d" (UID: "2acef0d4-dc55-4e0c-8027-950b1d7e2fb1") : secret "node-exporter-tls" not found Apr 24 14:27:17.944613 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.944341 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/76bc4d18-59d9-46a2-97c6-4b157dd5bd77-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-bp2rg\" (UID: \"76bc4d18-59d9-46a2-97c6-4b157dd5bd77\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bp2rg" Apr 24 14:27:17.944613 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.944485 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/76bc4d18-59d9-46a2-97c6-4b157dd5bd77-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-bp2rg\" (UID: \"76bc4d18-59d9-46a2-97c6-4b157dd5bd77\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bp2rg" Apr 24 14:27:17.944613 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.944516 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2acef0d4-dc55-4e0c-8027-950b1d7e2fb1-node-exporter-wtmp\") pod \"node-exporter-v9p7d\" (UID: \"2acef0d4-dc55-4e0c-8027-950b1d7e2fb1\") " pod="openshift-monitoring/node-exporter-v9p7d" Apr 24 14:27:17.944864 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.944643 2571 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2acef0d4-dc55-4e0c-8027-950b1d7e2fb1-node-exporter-wtmp\") pod \"node-exporter-v9p7d\" (UID: \"2acef0d4-dc55-4e0c-8027-950b1d7e2fb1\") " pod="openshift-monitoring/node-exporter-v9p7d" Apr 24 14:27:17.944864 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.944677 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q79ls\" (UniqueName: \"kubernetes.io/projected/76bc4d18-59d9-46a2-97c6-4b157dd5bd77-kube-api-access-q79ls\") pod \"kube-state-metrics-69db897b98-bp2rg\" (UID: \"76bc4d18-59d9-46a2-97c6-4b157dd5bd77\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bp2rg" Apr 24 14:27:17.944864 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.944713 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2acef0d4-dc55-4e0c-8027-950b1d7e2fb1-root\") pod \"node-exporter-v9p7d\" (UID: \"2acef0d4-dc55-4e0c-8027-950b1d7e2fb1\") " pod="openshift-monitoring/node-exporter-v9p7d" Apr 24 14:27:17.944864 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.944739 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/76bc4d18-59d9-46a2-97c6-4b157dd5bd77-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-bp2rg\" (UID: \"76bc4d18-59d9-46a2-97c6-4b157dd5bd77\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bp2rg" Apr 24 14:27:17.944864 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.944782 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/76bc4d18-59d9-46a2-97c6-4b157dd5bd77-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-bp2rg\" (UID: \"76bc4d18-59d9-46a2-97c6-4b157dd5bd77\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-bp2rg" Apr 24 14:27:17.944864 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.944786 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2acef0d4-dc55-4e0c-8027-950b1d7e2fb1-root\") pod \"node-exporter-v9p7d\" (UID: \"2acef0d4-dc55-4e0c-8027-950b1d7e2fb1\") " pod="openshift-monitoring/node-exporter-v9p7d" Apr 24 14:27:17.944864 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.944833 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2acef0d4-dc55-4e0c-8027-950b1d7e2fb1-node-exporter-accelerators-collector-config\") pod \"node-exporter-v9p7d\" (UID: \"2acef0d4-dc55-4e0c-8027-950b1d7e2fb1\") " pod="openshift-monitoring/node-exporter-v9p7d" Apr 24 14:27:17.945220 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.944891 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2acef0d4-dc55-4e0c-8027-950b1d7e2fb1-metrics-client-ca\") pod \"node-exporter-v9p7d\" (UID: \"2acef0d4-dc55-4e0c-8027-950b1d7e2fb1\") " pod="openshift-monitoring/node-exporter-v9p7d" Apr 24 14:27:17.945220 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.944919 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2acef0d4-dc55-4e0c-8027-950b1d7e2fb1-node-exporter-textfile\") pod \"node-exporter-v9p7d\" (UID: \"2acef0d4-dc55-4e0c-8027-950b1d7e2fb1\") " pod="openshift-monitoring/node-exporter-v9p7d" Apr 24 14:27:17.945220 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.944945 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/76bc4d18-59d9-46a2-97c6-4b157dd5bd77-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-bp2rg\" (UID: \"76bc4d18-59d9-46a2-97c6-4b157dd5bd77\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bp2rg" Apr 24 14:27:17.945220 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:27:17.945075 2571 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 24 14:27:17.945220 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:27:17.945122 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76bc4d18-59d9-46a2-97c6-4b157dd5bd77-kube-state-metrics-tls podName:76bc4d18-59d9-46a2-97c6-4b157dd5bd77 nodeName:}" failed. No retries permitted until 2026-04-24 14:27:18.445105895 +0000 UTC m=+174.575083440 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/76bc4d18-59d9-46a2-97c6-4b157dd5bd77-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-bp2rg" (UID: "76bc4d18-59d9-46a2-97c6-4b157dd5bd77") : secret "kube-state-metrics-tls" not found Apr 24 14:27:17.945220 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.945159 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/76bc4d18-59d9-46a2-97c6-4b157dd5bd77-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-bp2rg\" (UID: \"76bc4d18-59d9-46a2-97c6-4b157dd5bd77\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bp2rg" Apr 24 14:27:17.945220 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.945194 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/76bc4d18-59d9-46a2-97c6-4b157dd5bd77-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-bp2rg\" (UID: \"76bc4d18-59d9-46a2-97c6-4b157dd5bd77\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-bp2rg" Apr 24 14:27:17.945504 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.945384 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2acef0d4-dc55-4e0c-8027-950b1d7e2fb1-node-exporter-textfile\") pod \"node-exporter-v9p7d\" (UID: \"2acef0d4-dc55-4e0c-8027-950b1d7e2fb1\") " pod="openshift-monitoring/node-exporter-v9p7d" Apr 24 14:27:17.945504 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.945428 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/76bc4d18-59d9-46a2-97c6-4b157dd5bd77-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-bp2rg\" (UID: \"76bc4d18-59d9-46a2-97c6-4b157dd5bd77\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bp2rg" Apr 24 14:27:17.945649 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.945627 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2acef0d4-dc55-4e0c-8027-950b1d7e2fb1-node-exporter-accelerators-collector-config\") pod \"node-exporter-v9p7d\" (UID: \"2acef0d4-dc55-4e0c-8027-950b1d7e2fb1\") " pod="openshift-monitoring/node-exporter-v9p7d" Apr 24 14:27:17.945708 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.945692 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2acef0d4-dc55-4e0c-8027-950b1d7e2fb1-metrics-client-ca\") pod \"node-exporter-v9p7d\" (UID: \"2acef0d4-dc55-4e0c-8027-950b1d7e2fb1\") " pod="openshift-monitoring/node-exporter-v9p7d" Apr 24 14:27:17.947280 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.947258 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2acef0d4-dc55-4e0c-8027-950b1d7e2fb1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-v9p7d\" (UID: \"2acef0d4-dc55-4e0c-8027-950b1d7e2fb1\") " pod="openshift-monitoring/node-exporter-v9p7d" Apr 24 14:27:17.947369 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.947292 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/76bc4d18-59d9-46a2-97c6-4b157dd5bd77-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-bp2rg\" (UID: \"76bc4d18-59d9-46a2-97c6-4b157dd5bd77\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bp2rg" Apr 24 14:27:17.956557 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.956510 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-278p6\" (UniqueName: \"kubernetes.io/projected/2acef0d4-dc55-4e0c-8027-950b1d7e2fb1-kube-api-access-278p6\") pod \"node-exporter-v9p7d\" (UID: \"2acef0d4-dc55-4e0c-8027-950b1d7e2fb1\") " pod="openshift-monitoring/node-exporter-v9p7d" Apr 24 14:27:17.956735 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.956710 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q79ls\" (UniqueName: \"kubernetes.io/projected/76bc4d18-59d9-46a2-97c6-4b157dd5bd77-kube-api-access-q79ls\") pod \"kube-state-metrics-69db897b98-bp2rg\" (UID: \"76bc4d18-59d9-46a2-97c6-4b157dd5bd77\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bp2rg" Apr 24 14:27:17.963906 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:17.963883 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-k2t2r" event={"ID":"21ba7795-411f-46a3-93c5-fedef51a27ea","Type":"ContainerStarted","Data":"4d38481de200db09c75437e4fb3359ada0e2432dfbdbdba194e1c99eb4b438e0"} Apr 24 14:27:17.965250 ip-10-0-137-95 kubenswrapper[2571]: 
I0424 14:27:17.965231 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9g2xl" Apr 24 14:27:18.085105 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.084955 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-k2t2r" podStartSLOduration=138.667242593 podStartE2EDuration="2m20.084932483s" podCreationTimestamp="2026-04-24 14:24:58 +0000 UTC" firstStartedPulling="2026-04-24 14:27:15.583368231 +0000 UTC m=+171.713345769" lastFinishedPulling="2026-04-24 14:27:17.001058117 +0000 UTC m=+173.131035659" observedRunningTime="2026-04-24 14:27:17.985241206 +0000 UTC m=+174.115218766" watchObservedRunningTime="2026-04-24 14:27:18.084932483 +0000 UTC m=+174.214910043" Apr 24 14:27:18.085615 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.085458 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-9g2xl"] Apr 24 14:27:18.089300 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:27:18.089261 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0493229_a011_4a74_9ccf_6fde5588cde8.slice/crio-4fbf4dc296afcf9e7cb3de0a95d9c671b2f8b5d351ef64fb382c44bce0da8254 WatchSource:0}: Error finding container 4fbf4dc296afcf9e7cb3de0a95d9c671b2f8b5d351ef64fb382c44bce0da8254: Status 404 returned error can't find the container with id 4fbf4dc296afcf9e7cb3de0a95d9c671b2f8b5d351ef64fb382c44bce0da8254 Apr 24 14:27:18.449295 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.449261 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2acef0d4-dc55-4e0c-8027-950b1d7e2fb1-node-exporter-tls\") pod \"node-exporter-v9p7d\" (UID: \"2acef0d4-dc55-4e0c-8027-950b1d7e2fb1\") " pod="openshift-monitoring/node-exporter-v9p7d" Apr 24 
14:27:18.449458 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.449359 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/76bc4d18-59d9-46a2-97c6-4b157dd5bd77-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-bp2rg\" (UID: \"76bc4d18-59d9-46a2-97c6-4b157dd5bd77\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bp2rg" Apr 24 14:27:18.451707 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.451678 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2acef0d4-dc55-4e0c-8027-950b1d7e2fb1-node-exporter-tls\") pod \"node-exporter-v9p7d\" (UID: \"2acef0d4-dc55-4e0c-8027-950b1d7e2fb1\") " pod="openshift-monitoring/node-exporter-v9p7d" Apr 24 14:27:18.451839 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.451756 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/76bc4d18-59d9-46a2-97c6-4b157dd5bd77-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-bp2rg\" (UID: \"76bc4d18-59d9-46a2-97c6-4b157dd5bd77\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bp2rg" Apr 24 14:27:18.590313 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.590286 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-bp2rg" Apr 24 14:27:18.602117 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.602084 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-v9p7d" Apr 24 14:27:18.612723 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:27:18.612691 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2acef0d4_dc55_4e0c_8027_950b1d7e2fb1.slice/crio-1eb9c6da9b99e96491755206f265615efa61b244baa643d4c03abe93898021bf WatchSource:0}: Error finding container 1eb9c6da9b99e96491755206f265615efa61b244baa643d4c03abe93898021bf: Status 404 returned error can't find the container with id 1eb9c6da9b99e96491755206f265615efa61b244baa643d4c03abe93898021bf Apr 24 14:27:18.713728 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.713611 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-bp2rg"] Apr 24 14:27:18.716677 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:27:18.716640 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76bc4d18_59d9_46a2_97c6_4b157dd5bd77.slice/crio-c99035a81474dd82057ed91fddcd22ace3e067a917e1f32b15f12d9a74176aaa WatchSource:0}: Error finding container c99035a81474dd82057ed91fddcd22ace3e067a917e1f32b15f12d9a74176aaa: Status 404 returned error can't find the container with id c99035a81474dd82057ed91fddcd22ace3e067a917e1f32b15f12d9a74176aaa Apr 24 14:27:18.758764 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.756743 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 14:27:18.760058 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.760016 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:27:18.762470 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.762312 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 24 14:27:18.762470 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.762367 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 24 14:27:18.762627 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.762595 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 24 14:27:18.762770 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.762312 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 24 14:27:18.762906 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.762871 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 24 14:27:18.762967 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.762918 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 24 14:27:18.763137 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.763083 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 24 14:27:18.763196 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.763146 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-lcd95\"" Apr 24 14:27:18.763244 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.763145 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 24 14:27:18.763244 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.763087 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 24 14:27:18.777461 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.777438 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 14:27:18.852366 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.852336 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:27:18.852582 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.852391 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:27:18.852582 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.852420 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-config-volume\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:27:18.852582 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.852446 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" 
(UniqueName: \"kubernetes.io/projected/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-tls-assets\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:27:18.852582 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.852477 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:27:18.852582 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.852525 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:27:18.852801 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.852615 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:27:18.852801 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.852654 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 
14:27:18.852801 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.852681 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-config-out\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:27:18.852801 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.852708 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltf29\" (UniqueName: \"kubernetes.io/projected/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-kube-api-access-ltf29\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:27:18.852801 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.852778 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-web-config\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:27:18.852801 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.852800 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:27:18.853131 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.852823 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:27:18.939500 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.939470 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-752tj" Apr 24 14:27:18.953522 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.953494 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:27:18.953658 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.953545 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:27:18.953658 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.953586 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-config-volume\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:27:18.953658 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.953614 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-tls-assets\") pod \"alertmanager-main-0\" (UID: 
\"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:27:18.953658 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.953646 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:27:18.953890 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.953671 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:27:18.953890 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.953732 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:27:18.953890 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:27:18.953881 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-alertmanager-trusted-ca-bundle podName:dfa4cdab-b9bf-4984-b99d-6aca2dc55292 nodeName:}" failed. No retries permitted until 2026-04-24 14:27:19.453858352 +0000 UTC m=+175.583835906 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "dfa4cdab-b9bf-4984-b99d-6aca2dc55292") : configmap references non-existent config key: ca-bundle.crt
Apr 24 14:27:18.954190 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.954162 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:18.954293 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.954215 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-config-out\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:18.954293 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.954242 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ltf29\" (UniqueName: \"kubernetes.io/projected/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-kube-api-access-ltf29\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:18.954293 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.954289 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-web-config\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:18.954468 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.954315 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:18.954468 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.954342 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:18.954468 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.954352 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:18.954919 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.954862 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:18.954919 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:27:18.954912 2571 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found
Apr 24 14:27:18.955091 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:27:18.954969 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-secret-alertmanager-main-tls podName:dfa4cdab-b9bf-4984-b99d-6aca2dc55292 nodeName:}" failed. No retries permitted until 2026-04-24 14:27:19.454951227 +0000 UTC m=+175.584928770 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "dfa4cdab-b9bf-4984-b99d-6aca2dc55292") : secret "alertmanager-main-tls" not found
Apr 24 14:27:18.956423 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.956398 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:18.956680 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.956599 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-tls-assets\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:18.958013 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.957944 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:18.958470 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.958408 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:18.958615 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.958596 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-config-out\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:18.958780 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.958723 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-config-volume\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:18.959138 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.959116 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-web-config\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:18.959415 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.959389 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:18.968906 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.968812 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v9p7d" event={"ID":"2acef0d4-dc55-4e0c-8027-950b1d7e2fb1","Type":"ContainerStarted","Data":"1eb9c6da9b99e96491755206f265615efa61b244baa643d4c03abe93898021bf"}
Apr 24 14:27:18.970533 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.970494 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltf29\" (UniqueName: \"kubernetes.io/projected/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-kube-api-access-ltf29\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:18.971023 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.970976 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-bp2rg" event={"ID":"76bc4d18-59d9-46a2-97c6-4b157dd5bd77","Type":"ContainerStarted","Data":"c99035a81474dd82057ed91fddcd22ace3e067a917e1f32b15f12d9a74176aaa"}
Apr 24 14:27:18.974175 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.974106 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9g2xl" event={"ID":"b0493229-a011-4a74-9ccf-6fde5588cde8","Type":"ContainerStarted","Data":"5e7a640a01e01df5f3ea147cbeda43e79c66f3a3e294adbb3cf3c10e04e52ae5"}
Apr 24 14:27:18.974175 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.974141 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9g2xl" event={"ID":"b0493229-a011-4a74-9ccf-6fde5588cde8","Type":"ContainerStarted","Data":"2cfeecba9f8397f9c2663db33dbaadaf95489d07edd3bab9a162bb2de557dd09"}
Apr 24 14:27:18.974175 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:18.974156 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9g2xl" event={"ID":"b0493229-a011-4a74-9ccf-6fde5588cde8","Type":"ContainerStarted","Data":"4fbf4dc296afcf9e7cb3de0a95d9c671b2f8b5d351ef64fb382c44bce0da8254"}
Apr 24 14:27:19.458375 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:19.458295 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:19.458375 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:19.458375 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:19.459326 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:19.459270 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:19.462064 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:19.462015 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:19.671961 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:19.671934 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:19.979350 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:19.979303 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9g2xl" event={"ID":"b0493229-a011-4a74-9ccf-6fde5588cde8","Type":"ContainerStarted","Data":"09bf7ad5d7f4a3839eb72c129952034b66421c224d193af56e5c2de38f930a80"}
Apr 24 14:27:19.980962 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:19.980919 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v9p7d" event={"ID":"2acef0d4-dc55-4e0c-8027-950b1d7e2fb1","Type":"ContainerStarted","Data":"04919f774817b7a5e06d10a19fce8a1ec008079b3ce2d727c3c4e94aa0945ecf"}
Apr 24 14:27:19.996869 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:19.996722 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-9g2xl" podStartSLOduration=2.052041497 podStartE2EDuration="2.996707311s" podCreationTimestamp="2026-04-24 14:27:17 +0000 UTC" firstStartedPulling="2026-04-24 14:27:18.226447118 +0000 UTC m=+174.356424660" lastFinishedPulling="2026-04-24 14:27:19.171112937 +0000 UTC m=+175.301090474" observedRunningTime="2026-04-24 14:27:19.996558865 +0000 UTC m=+176.126536426" watchObservedRunningTime="2026-04-24 14:27:19.996707311 +0000 UTC m=+176.126684872"
Apr 24 14:27:20.124940 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:20.124912 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 14:27:20.132451 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:27:20.132421 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfa4cdab_b9bf_4984_b99d_6aca2dc55292.slice/crio-eb1e4b0b0dc07a8de2d6252bc280ad37b3ccbdb06fa3316a54b3b0db8425ee53 WatchSource:0}: Error finding container eb1e4b0b0dc07a8de2d6252bc280ad37b3ccbdb06fa3316a54b3b0db8425ee53: Status 404 returned error can't find the container with id eb1e4b0b0dc07a8de2d6252bc280ad37b3ccbdb06fa3316a54b3b0db8425ee53
Apr 24 14:27:20.987277 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:20.987227 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dfa4cdab-b9bf-4984-b99d-6aca2dc55292","Type":"ContainerStarted","Data":"eb1e4b0b0dc07a8de2d6252bc280ad37b3ccbdb06fa3316a54b3b0db8425ee53"}
Apr 24 14:27:20.989663 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:20.989631 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-bp2rg" event={"ID":"76bc4d18-59d9-46a2-97c6-4b157dd5bd77","Type":"ContainerStarted","Data":"4d0c1fe96b3b8fdddd7eb1c7ed6243b669a6feed1bd5b39f5177b32d7487d71b"}
Apr 24 14:27:20.989775 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:20.989668 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-bp2rg" event={"ID":"76bc4d18-59d9-46a2-97c6-4b157dd5bd77","Type":"ContainerStarted","Data":"4fd0faf2d3526b79713fdecb3dad6433476b818efff8d39075b6884c7ac116f0"}
Apr 24 14:27:20.989775 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:20.989679 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-bp2rg" event={"ID":"76bc4d18-59d9-46a2-97c6-4b157dd5bd77","Type":"ContainerStarted","Data":"c1cf79a29e5fe8abd58cac36cdbe73e6dd886e4b625d2b465bfa6a453c1bb23e"}
Apr 24 14:27:20.991627 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:20.991601 2571 generic.go:358] "Generic (PLEG): container finished" podID="2acef0d4-dc55-4e0c-8027-950b1d7e2fb1" containerID="04919f774817b7a5e06d10a19fce8a1ec008079b3ce2d727c3c4e94aa0945ecf" exitCode=0
Apr 24 14:27:20.991716 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:20.991659 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v9p7d" event={"ID":"2acef0d4-dc55-4e0c-8027-950b1d7e2fb1","Type":"ContainerDied","Data":"04919f774817b7a5e06d10a19fce8a1ec008079b3ce2d727c3c4e94aa0945ecf"}
Apr 24 14:27:21.017433 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:21.017379 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-bp2rg" podStartSLOduration=2.711903275 podStartE2EDuration="4.01736102s" podCreationTimestamp="2026-04-24 14:27:17 +0000 UTC" firstStartedPulling="2026-04-24 14:27:18.71934563 +0000 UTC m=+174.849323174" lastFinishedPulling="2026-04-24 14:27:20.024803372 +0000 UTC m=+176.154780919" observedRunningTime="2026-04-24 14:27:21.016373238 +0000 UTC m=+177.146350799" watchObservedRunningTime="2026-04-24 14:27:21.01736102 +0000 UTC m=+177.147338580"
Apr 24 14:27:21.996477 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:21.996442 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v9p7d" event={"ID":"2acef0d4-dc55-4e0c-8027-950b1d7e2fb1","Type":"ContainerStarted","Data":"0bee08e96e2f933f2e82038ee7f7cdda2328be50a430222a0409af677b9dfdf2"}
Apr 24 14:27:21.996477 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:21.996477 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v9p7d" event={"ID":"2acef0d4-dc55-4e0c-8027-950b1d7e2fb1","Type":"ContainerStarted","Data":"bf825f3951a28eb8f481da619770861252843ce308c1daf2b8685bbbf0f08e20"}
Apr 24 14:27:21.997679 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:21.997656 2571 generic.go:358] "Generic (PLEG): container finished" podID="dfa4cdab-b9bf-4984-b99d-6aca2dc55292" containerID="a0969595f195037d0a6e7afce7b63687325dd6342c2ecd4760aa28248d514315" exitCode=0
Apr 24 14:27:21.997768 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:21.997743 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dfa4cdab-b9bf-4984-b99d-6aca2dc55292","Type":"ContainerDied","Data":"a0969595f195037d0a6e7afce7b63687325dd6342c2ecd4760aa28248d514315"}
Apr 24 14:27:22.017966 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:22.017906 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-v9p7d" podStartSLOduration=4.032616087 podStartE2EDuration="5.017894255s" podCreationTimestamp="2026-04-24 14:27:17 +0000 UTC" firstStartedPulling="2026-04-24 14:27:18.614855474 +0000 UTC m=+174.744833013" lastFinishedPulling="2026-04-24 14:27:19.600133633 +0000 UTC m=+175.730111181" observedRunningTime="2026-04-24 14:27:22.017181842 +0000 UTC m=+178.147159403" watchObservedRunningTime="2026-04-24 14:27:22.017894255 +0000 UTC m=+178.147871814"
Apr 24 14:27:24.006403 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:24.006321 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dfa4cdab-b9bf-4984-b99d-6aca2dc55292","Type":"ContainerStarted","Data":"47b90b223607467342b53bf0d5458d852c06b81e71c3def862915ce2fc4836f3"}
Apr 24 14:27:24.006403 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:24.006358 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dfa4cdab-b9bf-4984-b99d-6aca2dc55292","Type":"ContainerStarted","Data":"f835458bf8a52fa90ae1eb11e92f9a8d6cf760a54fc107a08e47f86a273228fd"}
Apr 24 14:27:24.006403 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:24.006367 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dfa4cdab-b9bf-4984-b99d-6aca2dc55292","Type":"ContainerStarted","Data":"3ef0433ea7ee86f8a32757a8e2a839fb11a592629b07f8478588a9949d88e068"}
Apr 24 14:27:24.006403 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:24.006377 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dfa4cdab-b9bf-4984-b99d-6aca2dc55292","Type":"ContainerStarted","Data":"906891c8ea3bf3d3d6e5e126edb8625f100ab5c6cd1225149cc307591690fd72"}
Apr 24 14:27:24.006403 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:24.006384 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dfa4cdab-b9bf-4984-b99d-6aca2dc55292","Type":"ContainerStarted","Data":"fc0db63e36592a473ae51fefb4d1dc0c743f025411d3858c91157d7c11bb14eb"}
Apr 24 14:27:24.902548 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:24.902517 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-55fdbcc56d-lrqtj"]
Apr 24 14:27:25.011815 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:25.011780 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dfa4cdab-b9bf-4984-b99d-6aca2dc55292","Type":"ContainerStarted","Data":"e61b782d10492e5085915bc92bb1ad599146b9a57a0b2d5bf11c820dd9928100"}
Apr 24 14:27:25.040155 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:25.040108 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.765727306 podStartE2EDuration="7.040094328s" podCreationTimestamp="2026-04-24 14:27:18 +0000 UTC" firstStartedPulling="2026-04-24 14:27:20.134293586 +0000 UTC m=+176.264271124" lastFinishedPulling="2026-04-24 14:27:24.408660605 +0000 UTC m=+180.538638146" observedRunningTime="2026-04-24 14:27:25.038666678 +0000 UTC m=+181.168644240" watchObservedRunningTime="2026-04-24 14:27:25.040094328 +0000 UTC m=+181.170071888"
Apr 24 14:27:28.460337 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:28.460298 2571 scope.go:117] "RemoveContainer" containerID="6732be76bbedea52241d1a2fc9f587759fe8fee20386133423760bdba6f73099"
Apr 24 14:27:29.024824 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:29.024796 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mk446_83b693cc-250b-45e7-b205-baf7f0feff6b/console-operator/2.log"
Apr 24 14:27:29.024999 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:29.024862 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-mk446" event={"ID":"83b693cc-250b-45e7-b205-baf7f0feff6b","Type":"ContainerStarted","Data":"e57ba534c406bc97e53cbf4ff6087c3540e59139482767c89d77562d68ba53fc"}
Apr 24 14:27:29.025180 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:29.025157 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-mk446"
Apr 24 14:27:29.033668 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:29.033642 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-mk446"
Apr 24 14:27:29.051724 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:29.051669 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-mk446" podStartSLOduration=51.650378847 podStartE2EDuration="54.051650291s" podCreationTimestamp="2026-04-24 14:26:35 +0000 UTC" firstStartedPulling="2026-04-24 14:26:35.844165324 +0000 UTC m=+131.974142862" lastFinishedPulling="2026-04-24 14:26:38.245436754 +0000 UTC m=+134.375414306" observedRunningTime="2026-04-24 14:27:29.050073418 +0000 UTC m=+185.180051047" watchObservedRunningTime="2026-04-24 14:27:29.051650291 +0000 UTC m=+185.181627853"
Apr 24 14:27:29.195380 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:29.195349 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-zc4wq"]
Apr 24 14:27:29.197674 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:29.197657 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-zc4wq"
Apr 24 14:27:29.199604 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:29.199580 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 24 14:27:29.199745 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:29.199725 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 24 14:27:29.200113 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:29.200099 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-j5hgr\""
Apr 24 14:27:29.207526 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:29.207505 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-zc4wq"]
Apr 24 14:27:29.344633 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:29.344555 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvd7g\" (UniqueName: \"kubernetes.io/projected/7e9f030d-42db-425f-8dec-4073b006cce9-kube-api-access-bvd7g\") pod \"downloads-6bcc868b7-zc4wq\" (UID: \"7e9f030d-42db-425f-8dec-4073b006cce9\") " pod="openshift-console/downloads-6bcc868b7-zc4wq"
Apr 24 14:27:29.445450 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:29.445421 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bvd7g\" (UniqueName: \"kubernetes.io/projected/7e9f030d-42db-425f-8dec-4073b006cce9-kube-api-access-bvd7g\") pod \"downloads-6bcc868b7-zc4wq\" (UID: \"7e9f030d-42db-425f-8dec-4073b006cce9\") " pod="openshift-console/downloads-6bcc868b7-zc4wq"
Apr 24 14:27:29.453619 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:29.453588 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvd7g\" (UniqueName: \"kubernetes.io/projected/7e9f030d-42db-425f-8dec-4073b006cce9-kube-api-access-bvd7g\") pod \"downloads-6bcc868b7-zc4wq\" (UID: \"7e9f030d-42db-425f-8dec-4073b006cce9\") " pod="openshift-console/downloads-6bcc868b7-zc4wq"
Apr 24 14:27:29.507002 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:29.506960 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-zc4wq"
Apr 24 14:27:29.630112 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:29.630081 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-zc4wq"]
Apr 24 14:27:29.632896 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:27:29.632867 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e9f030d_42db_425f_8dec_4073b006cce9.slice/crio-bfc085e5ad8f51f7ef128e49043eba2ea81d1e0ca43907d11e8670c98a5cafb9 WatchSource:0}: Error finding container bfc085e5ad8f51f7ef128e49043eba2ea81d1e0ca43907d11e8670c98a5cafb9: Status 404 returned error can't find the container with id bfc085e5ad8f51f7ef128e49043eba2ea81d1e0ca43907d11e8670c98a5cafb9
Apr 24 14:27:30.029277 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:30.029194 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-zc4wq" event={"ID":"7e9f030d-42db-425f-8dec-4073b006cce9","Type":"ContainerStarted","Data":"bfc085e5ad8f51f7ef128e49043eba2ea81d1e0ca43907d11e8670c98a5cafb9"}
Apr 24 14:27:34.910669 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:34.910635 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj"
Apr 24 14:27:45.077187 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:45.077143 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-zc4wq" event={"ID":"7e9f030d-42db-425f-8dec-4073b006cce9","Type":"ContainerStarted","Data":"d04211dbe5bc44e9eaccd08746c37f2c04c22280d897a509ddaf184726fac518"}
Apr 24 14:27:45.077597 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:45.077314 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-zc4wq"
Apr 24 14:27:45.079106 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:45.079072 2571 patch_prober.go:28] interesting pod/downloads-6bcc868b7-zc4wq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.132.0.21:8080/\": dial tcp 10.132.0.21:8080: connect: connection refused" start-of-body=
Apr 24 14:27:45.079228 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:45.079127 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-6bcc868b7-zc4wq" podUID="7e9f030d-42db-425f-8dec-4073b006cce9" containerName="download-server" probeResult="failure" output="Get \"http://10.132.0.21:8080/\": dial tcp 10.132.0.21:8080: connect: connection refused"
Apr 24 14:27:45.109022 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:45.108954 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-zc4wq" podStartSLOduration=0.849788233 podStartE2EDuration="16.108940195s" podCreationTimestamp="2026-04-24 14:27:29 +0000 UTC" firstStartedPulling="2026-04-24 14:27:29.634813171 +0000 UTC m=+185.764790713" lastFinishedPulling="2026-04-24 14:27:44.893965126 +0000 UTC m=+201.023942675" observedRunningTime="2026-04-24 14:27:45.107514529 +0000 UTC m=+201.237492090" watchObservedRunningTime="2026-04-24 14:27:45.108940195 +0000 UTC m=+201.238917823"
Apr 24 14:27:46.094168 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:46.094134 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-zc4wq"
Apr 24 14:27:49.303038 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:49.303000 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7574bf65c9-q6cvt"]
Apr 24 14:27:49.331788 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:49.331759 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7574bf65c9-q6cvt"]
Apr 24 14:27:49.331951 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:49.331902 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7574bf65c9-q6cvt"
Apr 24 14:27:49.334246 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:49.334214 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 24 14:27:49.334246 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:49.334226 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 24 14:27:49.334428 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:49.334256 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 24 14:27:49.334758 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:49.334735 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 24 14:27:49.334915 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:49.334759 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-gw62q\""
Apr 24 14:27:49.335176 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:49.335153 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 24 14:27:49.338876 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:49.338856 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 24 14:27:49.431294 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:49.431262 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e172b560-3655-4d67-86da-7599eb980870-console-oauth-config\") pod \"console-7574bf65c9-q6cvt\" (UID: \"e172b560-3655-4d67-86da-7599eb980870\") " pod="openshift-console/console-7574bf65c9-q6cvt"
Apr 24 14:27:49.431468 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:49.431313 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e172b560-3655-4d67-86da-7599eb980870-trusted-ca-bundle\") pod \"console-7574bf65c9-q6cvt\" (UID: \"e172b560-3655-4d67-86da-7599eb980870\") " pod="openshift-console/console-7574bf65c9-q6cvt"
Apr 24 14:27:49.431468 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:49.431419 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e172b560-3655-4d67-86da-7599eb980870-service-ca\") pod \"console-7574bf65c9-q6cvt\" (UID: \"e172b560-3655-4d67-86da-7599eb980870\") " pod="openshift-console/console-7574bf65c9-q6cvt"
Apr 24 14:27:49.431468 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:49.431449 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnffc\" (UniqueName: \"kubernetes.io/projected/e172b560-3655-4d67-86da-7599eb980870-kube-api-access-vnffc\") pod \"console-7574bf65c9-q6cvt\" (UID: \"e172b560-3655-4d67-86da-7599eb980870\") " pod="openshift-console/console-7574bf65c9-q6cvt"
Apr 24 14:27:49.431633 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:49.431552 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e172b560-3655-4d67-86da-7599eb980870-console-serving-cert\") pod \"console-7574bf65c9-q6cvt\" (UID: \"e172b560-3655-4d67-86da-7599eb980870\") " pod="openshift-console/console-7574bf65c9-q6cvt"
Apr 24 14:27:49.431633 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:49.431616 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e172b560-3655-4d67-86da-7599eb980870-oauth-serving-cert\") pod \"console-7574bf65c9-q6cvt\" (UID: \"e172b560-3655-4d67-86da-7599eb980870\") " pod="openshift-console/console-7574bf65c9-q6cvt"
Apr 24 14:27:49.431731 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:49.431648 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e172b560-3655-4d67-86da-7599eb980870-console-config\") pod \"console-7574bf65c9-q6cvt\" (UID: \"e172b560-3655-4d67-86da-7599eb980870\") " pod="openshift-console/console-7574bf65c9-q6cvt"
Apr 24 14:27:49.532771 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:49.532734 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e172b560-3655-4d67-86da-7599eb980870-trusted-ca-bundle\") pod \"console-7574bf65c9-q6cvt\" (UID: \"e172b560-3655-4d67-86da-7599eb980870\") " pod="openshift-console/console-7574bf65c9-q6cvt"
Apr 24 14:27:49.532952 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:49.532804 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e172b560-3655-4d67-86da-7599eb980870-service-ca\") pod \"console-7574bf65c9-q6cvt\" (UID: \"e172b560-3655-4d67-86da-7599eb980870\") " pod="openshift-console/console-7574bf65c9-q6cvt"
Apr 24 14:27:49.532952 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:49.532827 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vnffc\" (UniqueName: \"kubernetes.io/projected/e172b560-3655-4d67-86da-7599eb980870-kube-api-access-vnffc\") pod \"console-7574bf65c9-q6cvt\" (UID: \"e172b560-3655-4d67-86da-7599eb980870\") " pod="openshift-console/console-7574bf65c9-q6cvt"
Apr 24 14:27:49.532952 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:49.532873 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e172b560-3655-4d67-86da-7599eb980870-console-serving-cert\") pod \"console-7574bf65c9-q6cvt\" (UID: \"e172b560-3655-4d67-86da-7599eb980870\") " pod="openshift-console/console-7574bf65c9-q6cvt"
Apr 24 14:27:49.532952 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:49.532923 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e172b560-3655-4d67-86da-7599eb980870-oauth-serving-cert\") pod \"console-7574bf65c9-q6cvt\" (UID: \"e172b560-3655-4d67-86da-7599eb980870\") " pod="openshift-console/console-7574bf65c9-q6cvt"
Apr 24 14:27:49.532952 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:49.532949 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e172b560-3655-4d67-86da-7599eb980870-console-config\") pod \"console-7574bf65c9-q6cvt\" (UID: \"e172b560-3655-4d67-86da-7599eb980870\") " pod="openshift-console/console-7574bf65c9-q6cvt"
Apr 24 14:27:49.533244 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:49.533011 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e172b560-3655-4d67-86da-7599eb980870-console-oauth-config\") pod \"console-7574bf65c9-q6cvt\" (UID: \"e172b560-3655-4d67-86da-7599eb980870\") " pod="openshift-console/console-7574bf65c9-q6cvt"
Apr 24 14:27:49.533666 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:49.533635 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e172b560-3655-4d67-86da-7599eb980870-service-ca\") pod \"console-7574bf65c9-q6cvt\" (UID: \"e172b560-3655-4d67-86da-7599eb980870\") " pod="openshift-console/console-7574bf65c9-q6cvt"
Apr 24 14:27:49.533809 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:49.533701 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e172b560-3655-4d67-86da-7599eb980870-trusted-ca-bundle\") pod \"console-7574bf65c9-q6cvt\" (UID: \"e172b560-3655-4d67-86da-7599eb980870\") " pod="openshift-console/console-7574bf65c9-q6cvt"
Apr 24 14:27:49.533911 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:49.533879 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e172b560-3655-4d67-86da-7599eb980870-oauth-serving-cert\") pod \"console-7574bf65c9-q6cvt\" (UID: \"e172b560-3655-4d67-86da-7599eb980870\") " pod="openshift-console/console-7574bf65c9-q6cvt"
Apr 24 14:27:49.533974 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:49.533879 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e172b560-3655-4d67-86da-7599eb980870-console-config\") pod \"console-7574bf65c9-q6cvt\" (UID: \"e172b560-3655-4d67-86da-7599eb980870\") " pod="openshift-console/console-7574bf65c9-q6cvt"
Apr 24 14:27:49.535755 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:49.535736 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e172b560-3655-4d67-86da-7599eb980870-console-oauth-config\") pod \"console-7574bf65c9-q6cvt\" (UID: \"e172b560-3655-4d67-86da-7599eb980870\") " pod="openshift-console/console-7574bf65c9-q6cvt"
Apr 24
14:27:49.535849 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:49.535806 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e172b560-3655-4d67-86da-7599eb980870-console-serving-cert\") pod \"console-7574bf65c9-q6cvt\" (UID: \"e172b560-3655-4d67-86da-7599eb980870\") " pod="openshift-console/console-7574bf65c9-q6cvt" Apr 24 14:27:49.546237 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:49.546213 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnffc\" (UniqueName: \"kubernetes.io/projected/e172b560-3655-4d67-86da-7599eb980870-kube-api-access-vnffc\") pod \"console-7574bf65c9-q6cvt\" (UID: \"e172b560-3655-4d67-86da-7599eb980870\") " pod="openshift-console/console-7574bf65c9-q6cvt" Apr 24 14:27:49.644237 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:49.644198 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7574bf65c9-q6cvt" Apr 24 14:27:49.782367 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:49.782324 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7574bf65c9-q6cvt"] Apr 24 14:27:49.786575 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:27:49.786545 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode172b560_3655_4d67_86da_7599eb980870.slice/crio-2c733f10f6464d260e5a23c31f48bbb1203b164c891a74fa1e74ae52c11a1ecc WatchSource:0}: Error finding container 2c733f10f6464d260e5a23c31f48bbb1203b164c891a74fa1e74ae52c11a1ecc: Status 404 returned error can't find the container with id 2c733f10f6464d260e5a23c31f48bbb1203b164c891a74fa1e74ae52c11a1ecc Apr 24 14:27:49.921357 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:49.921270 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" 
podUID="0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00" containerName="registry" containerID="cri-o://53933d2b06bfabdba52abb4ab04778904a0ff9b27843acaf7997c92ae90a8cbb" gracePeriod=30 Apr 24 14:27:50.095942 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:50.095893 2571 generic.go:358] "Generic (PLEG): container finished" podID="0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00" containerID="53933d2b06bfabdba52abb4ab04778904a0ff9b27843acaf7997c92ae90a8cbb" exitCode=0 Apr 24 14:27:50.096143 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:50.096013 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" event={"ID":"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00","Type":"ContainerDied","Data":"53933d2b06bfabdba52abb4ab04778904a0ff9b27843acaf7997c92ae90a8cbb"} Apr 24 14:27:50.097254 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:50.097228 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7574bf65c9-q6cvt" event={"ID":"e172b560-3655-4d67-86da-7599eb980870","Type":"ContainerStarted","Data":"2c733f10f6464d260e5a23c31f48bbb1203b164c891a74fa1e74ae52c11a1ecc"} Apr 24 14:27:50.182020 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:50.181968 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" Apr 24 14:27:50.241341 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:50.241306 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-installation-pull-secrets\") pod \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\" (UID: \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " Apr 24 14:27:50.241504 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:50.241361 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-image-registry-private-configuration\") pod \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\" (UID: \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " Apr 24 14:27:50.241504 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:50.241393 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-registry-certificates\") pod \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\" (UID: \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " Apr 24 14:27:50.241504 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:50.241409 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-trusted-ca\") pod \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\" (UID: \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " Apr 24 14:27:50.241504 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:50.241452 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-ca-trust-extracted\") pod \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\" (UID: 
\"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " Apr 24 14:27:50.241719 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:50.241535 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zckcc\" (UniqueName: \"kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-kube-api-access-zckcc\") pod \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\" (UID: \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " Apr 24 14:27:50.241719 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:50.241583 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-registry-tls\") pod \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\" (UID: \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " Apr 24 14:27:50.241719 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:50.241619 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-bound-sa-token\") pod \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\" (UID: \"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00\") " Apr 24 14:27:50.241901 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:50.241874 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00" (UID: "0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:27:50.241952 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:50.241939 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00" (UID: "0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:27:50.244528 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:50.244476 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00" (UID: "0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:27:50.244801 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:50.244756 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00" (UID: "0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:27:50.244801 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:50.244770 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-kube-api-access-zckcc" (OuterVolumeSpecName: "kube-api-access-zckcc") pod "0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00" (UID: "0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00"). InnerVolumeSpecName "kube-api-access-zckcc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:27:50.245005 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:50.244774 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00" (UID: "0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:27:50.245005 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:50.244834 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00" (UID: "0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:27:50.252368 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:50.252341 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00" (UID: "0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:27:50.342884 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:50.342849 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zckcc\" (UniqueName: \"kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-kube-api-access-zckcc\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\"" Apr 24 14:27:50.342884 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:50.342884 2571 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-registry-tls\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\"" Apr 24 14:27:50.343388 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:50.342901 2571 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-bound-sa-token\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\"" Apr 24 14:27:50.343388 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:50.342913 2571 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-installation-pull-secrets\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\"" Apr 24 14:27:50.343388 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:50.342922 2571 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-image-registry-private-configuration\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\"" Apr 24 14:27:50.343388 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:50.342932 2571 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-registry-certificates\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\"" 
Apr 24 14:27:50.343388 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:50.342943 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-trusted-ca\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\"" Apr 24 14:27:50.343388 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:50.342957 2571 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00-ca-trust-extracted\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\"" Apr 24 14:27:51.102687 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:51.102650 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" event={"ID":"0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00","Type":"ContainerDied","Data":"a7d3711cd996ba3b6ed557b8f67c61cdf2b14103e42bf0ccfc3730cfe1aab0ae"} Apr 24 14:27:51.102877 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:51.102705 2571 scope.go:117] "RemoveContainer" containerID="53933d2b06bfabdba52abb4ab04778904a0ff9b27843acaf7997c92ae90a8cbb" Apr 24 14:27:51.102877 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:51.102788 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-55fdbcc56d-lrqtj" Apr 24 14:27:51.122545 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:51.122516 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-55fdbcc56d-lrqtj"] Apr 24 14:27:51.127481 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:51.127435 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-55fdbcc56d-lrqtj"] Apr 24 14:27:52.462487 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:52.462446 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00" path="/var/lib/kubelet/pods/0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00/volumes" Apr 24 14:27:54.118750 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:54.118711 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7574bf65c9-q6cvt" event={"ID":"e172b560-3655-4d67-86da-7599eb980870","Type":"ContainerStarted","Data":"02fcfdaf0512f3e58c485d6de04420c1562ed7624236e1eabfcbdb51cd1cb3ae"} Apr 24 14:27:54.136769 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:54.136712 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7574bf65c9-q6cvt" podStartSLOduration=1.5415197 podStartE2EDuration="5.136694013s" podCreationTimestamp="2026-04-24 14:27:49 +0000 UTC" firstStartedPulling="2026-04-24 14:27:49.78886266 +0000 UTC m=+205.918840198" lastFinishedPulling="2026-04-24 14:27:53.384036956 +0000 UTC m=+209.514014511" observedRunningTime="2026-04-24 14:27:54.136314167 +0000 UTC m=+210.266291730" watchObservedRunningTime="2026-04-24 14:27:54.136694013 +0000 UTC m=+210.266671576" Apr 24 14:27:59.645032 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:59.644980 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7574bf65c9-q6cvt" Apr 24 14:27:59.645032 ip-10-0-137-95 kubenswrapper[2571]: 
I0424 14:27:59.645042 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7574bf65c9-q6cvt" Apr 24 14:27:59.649779 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:27:59.649758 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7574bf65c9-q6cvt" Apr 24 14:28:00.139543 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:00.139509 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7574bf65c9-q6cvt" Apr 24 14:28:15.180743 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:15.180712 2571 generic.go:358] "Generic (PLEG): container finished" podID="6f64bec9-dd77-4222-8388-3b584743cfa7" containerID="7413cb5e872e6f952ae18cb1a1d968d7113dee641ea2cf29e3132ae8bdc8ad79" exitCode=0 Apr 24 14:28:15.181178 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:15.180788 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-rkjfd" event={"ID":"6f64bec9-dd77-4222-8388-3b584743cfa7","Type":"ContainerDied","Data":"7413cb5e872e6f952ae18cb1a1d968d7113dee641ea2cf29e3132ae8bdc8ad79"} Apr 24 14:28:15.181178 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:15.181131 2571 scope.go:117] "RemoveContainer" containerID="7413cb5e872e6f952ae18cb1a1d968d7113dee641ea2cf29e3132ae8bdc8ad79" Apr 24 14:28:16.185498 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:16.185460 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-rkjfd" event={"ID":"6f64bec9-dd77-4222-8388-3b584743cfa7","Type":"ContainerStarted","Data":"34dc3c35a41e70658d736191695c83f2746cfc291e62df2072668b9db5bf92de"} Apr 24 14:28:36.227329 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:36.227233 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a022a0ca-5e80-43a6-8ee0-69dcf197d1a8-metrics-certs\") pod 
\"network-metrics-daemon-fsnj5\" (UID: \"a022a0ca-5e80-43a6-8ee0-69dcf197d1a8\") " pod="openshift-multus/network-metrics-daemon-fsnj5" Apr 24 14:28:36.229596 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:36.229569 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a022a0ca-5e80-43a6-8ee0-69dcf197d1a8-metrics-certs\") pod \"network-metrics-daemon-fsnj5\" (UID: \"a022a0ca-5e80-43a6-8ee0-69dcf197d1a8\") " pod="openshift-multus/network-metrics-daemon-fsnj5" Apr 24 14:28:36.458804 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:36.458775 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-crqvq\"" Apr 24 14:28:36.467329 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:36.467311 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fsnj5" Apr 24 14:28:36.585496 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:36.585465 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fsnj5"] Apr 24 14:28:36.588368 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:28:36.588337 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda022a0ca_5e80_43a6_8ee0_69dcf197d1a8.slice/crio-d75603423d58bfd3dfdbe9ad7b6bfeaf71da0e05551f7d56f72ca12e6a5e12a4 WatchSource:0}: Error finding container d75603423d58bfd3dfdbe9ad7b6bfeaf71da0e05551f7d56f72ca12e6a5e12a4: Status 404 returned error can't find the container with id d75603423d58bfd3dfdbe9ad7b6bfeaf71da0e05551f7d56f72ca12e6a5e12a4 Apr 24 14:28:37.267614 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:37.267570 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fsnj5" 
event={"ID":"a022a0ca-5e80-43a6-8ee0-69dcf197d1a8","Type":"ContainerStarted","Data":"d75603423d58bfd3dfdbe9ad7b6bfeaf71da0e05551f7d56f72ca12e6a5e12a4"} Apr 24 14:28:37.970540 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:37.969999 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 14:28:37.970734 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:37.970662 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="dfa4cdab-b9bf-4984-b99d-6aca2dc55292" containerName="alertmanager" containerID="cri-o://fc0db63e36592a473ae51fefb4d1dc0c743f025411d3858c91157d7c11bb14eb" gracePeriod=120 Apr 24 14:28:37.970808 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:37.970765 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="dfa4cdab-b9bf-4984-b99d-6aca2dc55292" containerName="prom-label-proxy" containerID="cri-o://e61b782d10492e5085915bc92bb1ad599146b9a57a0b2d5bf11c820dd9928100" gracePeriod=120 Apr 24 14:28:37.970808 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:37.970790 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="dfa4cdab-b9bf-4984-b99d-6aca2dc55292" containerName="kube-rbac-proxy" containerID="cri-o://f835458bf8a52fa90ae1eb11e92f9a8d6cf760a54fc107a08e47f86a273228fd" gracePeriod=120 Apr 24 14:28:37.970808 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:37.970768 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="dfa4cdab-b9bf-4984-b99d-6aca2dc55292" containerName="kube-rbac-proxy-web" containerID="cri-o://3ef0433ea7ee86f8a32757a8e2a839fb11a592629b07f8478588a9949d88e068" gracePeriod=120 Apr 24 14:28:37.970968 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:37.970760 2571 kuberuntime_container.go:864] "Killing 
container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="dfa4cdab-b9bf-4984-b99d-6aca2dc55292" containerName="kube-rbac-proxy-metric" containerID="cri-o://47b90b223607467342b53bf0d5458d852c06b81e71c3def862915ce2fc4836f3" gracePeriod=120 Apr 24 14:28:37.970968 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:37.970796 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="dfa4cdab-b9bf-4984-b99d-6aca2dc55292" containerName="config-reloader" containerID="cri-o://906891c8ea3bf3d3d6e5e126edb8625f100ab5c6cd1225149cc307591690fd72" gracePeriod=120 Apr 24 14:28:38.279027 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:38.278928 2571 generic.go:358] "Generic (PLEG): container finished" podID="dfa4cdab-b9bf-4984-b99d-6aca2dc55292" containerID="e61b782d10492e5085915bc92bb1ad599146b9a57a0b2d5bf11c820dd9928100" exitCode=0 Apr 24 14:28:38.279027 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:38.278952 2571 generic.go:358] "Generic (PLEG): container finished" podID="dfa4cdab-b9bf-4984-b99d-6aca2dc55292" containerID="f835458bf8a52fa90ae1eb11e92f9a8d6cf760a54fc107a08e47f86a273228fd" exitCode=0 Apr 24 14:28:38.279027 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:38.278958 2571 generic.go:358] "Generic (PLEG): container finished" podID="dfa4cdab-b9bf-4984-b99d-6aca2dc55292" containerID="906891c8ea3bf3d3d6e5e126edb8625f100ab5c6cd1225149cc307591690fd72" exitCode=0 Apr 24 14:28:38.279027 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:38.278963 2571 generic.go:358] "Generic (PLEG): container finished" podID="dfa4cdab-b9bf-4984-b99d-6aca2dc55292" containerID="fc0db63e36592a473ae51fefb4d1dc0c743f025411d3858c91157d7c11bb14eb" exitCode=0 Apr 24 14:28:38.279027 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:38.279012 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"dfa4cdab-b9bf-4984-b99d-6aca2dc55292","Type":"ContainerDied","Data":"e61b782d10492e5085915bc92bb1ad599146b9a57a0b2d5bf11c820dd9928100"} Apr 24 14:28:38.279549 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:38.279043 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dfa4cdab-b9bf-4984-b99d-6aca2dc55292","Type":"ContainerDied","Data":"f835458bf8a52fa90ae1eb11e92f9a8d6cf760a54fc107a08e47f86a273228fd"} Apr 24 14:28:38.279549 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:38.279055 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dfa4cdab-b9bf-4984-b99d-6aca2dc55292","Type":"ContainerDied","Data":"906891c8ea3bf3d3d6e5e126edb8625f100ab5c6cd1225149cc307591690fd72"} Apr 24 14:28:38.279549 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:38.279065 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dfa4cdab-b9bf-4984-b99d-6aca2dc55292","Type":"ContainerDied","Data":"fc0db63e36592a473ae51fefb4d1dc0c743f025411d3858c91157d7c11bb14eb"} Apr 24 14:28:38.280498 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:38.280474 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fsnj5" event={"ID":"a022a0ca-5e80-43a6-8ee0-69dcf197d1a8","Type":"ContainerStarted","Data":"82728968660e166515bea90550525ee0a7e5fe94c5cf3cd120a258e0bc87f4b6"} Apr 24 14:28:38.280584 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:38.280507 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fsnj5" event={"ID":"a022a0ca-5e80-43a6-8ee0-69dcf197d1a8","Type":"ContainerStarted","Data":"4ca6a55fd5799219efb3cdb2d8b761b7c7ebfc356015b0130d35f84c2060cac8"} Apr 24 14:28:38.295007 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:38.294956 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-fsnj5" podStartSLOduration=253.372825922 podStartE2EDuration="4m14.29494441s" podCreationTimestamp="2026-04-24 14:24:24 +0000 UTC" firstStartedPulling="2026-04-24 14:28:36.590140494 +0000 UTC m=+252.720118039" lastFinishedPulling="2026-04-24 14:28:37.512258986 +0000 UTC m=+253.642236527" observedRunningTime="2026-04-24 14:28:38.294922339 +0000 UTC m=+254.424899913" watchObservedRunningTime="2026-04-24 14:28:38.29494441 +0000 UTC m=+254.424921970" Apr 24 14:28:39.227122 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.227098 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:28:39.252879 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.252851 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltf29\" (UniqueName: \"kubernetes.io/projected/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-kube-api-access-ltf29\") pod \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " Apr 24 14:28:39.253045 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.252887 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-secret-alertmanager-kube-rbac-proxy-metric\") pod \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " Apr 24 14:28:39.253045 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.252908 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-config-volume\") pod \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " Apr 24 14:28:39.253045 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.252927 2571 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-metrics-client-ca\") pod \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " Apr 24 14:28:39.253045 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.252962 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-alertmanager-trusted-ca-bundle\") pod \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " Apr 24 14:28:39.253045 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.253019 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-secret-alertmanager-kube-rbac-proxy\") pod \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " Apr 24 14:28:39.253045 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.253042 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-config-out\") pod \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " Apr 24 14:28:39.254113 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.253477 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-web-config\") pod \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " Apr 24 14:28:39.254113 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.253540 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-secret-alertmanager-main-tls\") pod \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " Apr 24 14:28:39.254113 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.253590 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-secret-alertmanager-kube-rbac-proxy-web\") pod \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " Apr 24 14:28:39.254113 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.253629 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-tls-assets\") pod \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " Apr 24 14:28:39.254113 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.253630 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "dfa4cdab-b9bf-4984-b99d-6aca2dc55292" (UID: "dfa4cdab-b9bf-4984-b99d-6aca2dc55292"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:28:39.254113 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.253654 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-cluster-tls-config\") pod \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " Apr 24 14:28:39.254113 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.253687 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-alertmanager-main-db\") pod \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\" (UID: \"dfa4cdab-b9bf-4984-b99d-6aca2dc55292\") " Apr 24 14:28:39.254113 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.253936 2571 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\"" Apr 24 14:28:39.254113 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.254097 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "dfa4cdab-b9bf-4984-b99d-6aca2dc55292" (UID: "dfa4cdab-b9bf-4984-b99d-6aca2dc55292"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:28:39.254831 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.254271 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "dfa4cdab-b9bf-4984-b99d-6aca2dc55292" (UID: "dfa4cdab-b9bf-4984-b99d-6aca2dc55292"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:28:39.258464 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.258407 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-config-out" (OuterVolumeSpecName: "config-out") pod "dfa4cdab-b9bf-4984-b99d-6aca2dc55292" (UID: "dfa4cdab-b9bf-4984-b99d-6aca2dc55292"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:28:39.258578 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.258527 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-config-volume" (OuterVolumeSpecName: "config-volume") pod "dfa4cdab-b9bf-4984-b99d-6aca2dc55292" (UID: "dfa4cdab-b9bf-4984-b99d-6aca2dc55292"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:28:39.258704 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.258650 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-kube-api-access-ltf29" (OuterVolumeSpecName: "kube-api-access-ltf29") pod "dfa4cdab-b9bf-4984-b99d-6aca2dc55292" (UID: "dfa4cdab-b9bf-4984-b99d-6aca2dc55292"). InnerVolumeSpecName "kube-api-access-ltf29". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:28:39.258704 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.258674 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "dfa4cdab-b9bf-4984-b99d-6aca2dc55292" (UID: "dfa4cdab-b9bf-4984-b99d-6aca2dc55292"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:28:39.258848 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.258819 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "dfa4cdab-b9bf-4984-b99d-6aca2dc55292" (UID: "dfa4cdab-b9bf-4984-b99d-6aca2dc55292"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:28:39.258929 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.258845 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "dfa4cdab-b9bf-4984-b99d-6aca2dc55292" (UID: "dfa4cdab-b9bf-4984-b99d-6aca2dc55292"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:28:39.260874 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.260231 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "dfa4cdab-b9bf-4984-b99d-6aca2dc55292" (UID: "dfa4cdab-b9bf-4984-b99d-6aca2dc55292"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:28:39.260874 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.260829 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "dfa4cdab-b9bf-4984-b99d-6aca2dc55292" (UID: "dfa4cdab-b9bf-4984-b99d-6aca2dc55292"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:28:39.264972 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.264948 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "dfa4cdab-b9bf-4984-b99d-6aca2dc55292" (UID: "dfa4cdab-b9bf-4984-b99d-6aca2dc55292"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:28:39.272964 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.272915 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-web-config" (OuterVolumeSpecName: "web-config") pod "dfa4cdab-b9bf-4984-b99d-6aca2dc55292" (UID: "dfa4cdab-b9bf-4984-b99d-6aca2dc55292"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:28:39.286584 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.286561 2571 generic.go:358] "Generic (PLEG): container finished" podID="dfa4cdab-b9bf-4984-b99d-6aca2dc55292" containerID="47b90b223607467342b53bf0d5458d852c06b81e71c3def862915ce2fc4836f3" exitCode=0 Apr 24 14:28:39.286584 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.286583 2571 generic.go:358] "Generic (PLEG): container finished" podID="dfa4cdab-b9bf-4984-b99d-6aca2dc55292" containerID="3ef0433ea7ee86f8a32757a8e2a839fb11a592629b07f8478588a9949d88e068" exitCode=0 Apr 24 14:28:39.286937 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.286642 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dfa4cdab-b9bf-4984-b99d-6aca2dc55292","Type":"ContainerDied","Data":"47b90b223607467342b53bf0d5458d852c06b81e71c3def862915ce2fc4836f3"} Apr 24 14:28:39.286937 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.286679 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:28:39.286937 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.286685 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dfa4cdab-b9bf-4984-b99d-6aca2dc55292","Type":"ContainerDied","Data":"3ef0433ea7ee86f8a32757a8e2a839fb11a592629b07f8478588a9949d88e068"} Apr 24 14:28:39.286937 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.286702 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dfa4cdab-b9bf-4984-b99d-6aca2dc55292","Type":"ContainerDied","Data":"eb1e4b0b0dc07a8de2d6252bc280ad37b3ccbdb06fa3316a54b3b0db8425ee53"} Apr 24 14:28:39.286937 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.286720 2571 scope.go:117] "RemoveContainer" containerID="e61b782d10492e5085915bc92bb1ad599146b9a57a0b2d5bf11c820dd9928100" Apr 24 14:28:39.293884 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.293869 2571 scope.go:117] "RemoveContainer" containerID="47b90b223607467342b53bf0d5458d852c06b81e71c3def862915ce2fc4836f3" Apr 24 14:28:39.301187 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.301170 2571 scope.go:117] "RemoveContainer" containerID="f835458bf8a52fa90ae1eb11e92f9a8d6cf760a54fc107a08e47f86a273228fd" Apr 24 14:28:39.307688 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.307671 2571 scope.go:117] "RemoveContainer" containerID="3ef0433ea7ee86f8a32757a8e2a839fb11a592629b07f8478588a9949d88e068" Apr 24 14:28:39.313852 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.313835 2571 scope.go:117] "RemoveContainer" containerID="906891c8ea3bf3d3d6e5e126edb8625f100ab5c6cd1225149cc307591690fd72" Apr 24 14:28:39.320627 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.320603 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 14:28:39.320940 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.320927 2571 
scope.go:117] "RemoveContainer" containerID="fc0db63e36592a473ae51fefb4d1dc0c743f025411d3858c91157d7c11bb14eb" Apr 24 14:28:39.325213 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.325193 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 14:28:39.329196 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.328746 2571 scope.go:117] "RemoveContainer" containerID="a0969595f195037d0a6e7afce7b63687325dd6342c2ecd4760aa28248d514315" Apr 24 14:28:39.336293 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.336276 2571 scope.go:117] "RemoveContainer" containerID="e61b782d10492e5085915bc92bb1ad599146b9a57a0b2d5bf11c820dd9928100" Apr 24 14:28:39.336535 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:28:39.336517 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e61b782d10492e5085915bc92bb1ad599146b9a57a0b2d5bf11c820dd9928100\": container with ID starting with e61b782d10492e5085915bc92bb1ad599146b9a57a0b2d5bf11c820dd9928100 not found: ID does not exist" containerID="e61b782d10492e5085915bc92bb1ad599146b9a57a0b2d5bf11c820dd9928100" Apr 24 14:28:39.336593 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.336542 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e61b782d10492e5085915bc92bb1ad599146b9a57a0b2d5bf11c820dd9928100"} err="failed to get container status \"e61b782d10492e5085915bc92bb1ad599146b9a57a0b2d5bf11c820dd9928100\": rpc error: code = NotFound desc = could not find container \"e61b782d10492e5085915bc92bb1ad599146b9a57a0b2d5bf11c820dd9928100\": container with ID starting with e61b782d10492e5085915bc92bb1ad599146b9a57a0b2d5bf11c820dd9928100 not found: ID does not exist" Apr 24 14:28:39.336593 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.336574 2571 scope.go:117] "RemoveContainer" containerID="47b90b223607467342b53bf0d5458d852c06b81e71c3def862915ce2fc4836f3" Apr 24 
14:28:39.336797 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:28:39.336779 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47b90b223607467342b53bf0d5458d852c06b81e71c3def862915ce2fc4836f3\": container with ID starting with 47b90b223607467342b53bf0d5458d852c06b81e71c3def862915ce2fc4836f3 not found: ID does not exist" containerID="47b90b223607467342b53bf0d5458d852c06b81e71c3def862915ce2fc4836f3" Apr 24 14:28:39.336841 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.336806 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47b90b223607467342b53bf0d5458d852c06b81e71c3def862915ce2fc4836f3"} err="failed to get container status \"47b90b223607467342b53bf0d5458d852c06b81e71c3def862915ce2fc4836f3\": rpc error: code = NotFound desc = could not find container \"47b90b223607467342b53bf0d5458d852c06b81e71c3def862915ce2fc4836f3\": container with ID starting with 47b90b223607467342b53bf0d5458d852c06b81e71c3def862915ce2fc4836f3 not found: ID does not exist" Apr 24 14:28:39.336841 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.336826 2571 scope.go:117] "RemoveContainer" containerID="f835458bf8a52fa90ae1eb11e92f9a8d6cf760a54fc107a08e47f86a273228fd" Apr 24 14:28:39.337048 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:28:39.337031 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f835458bf8a52fa90ae1eb11e92f9a8d6cf760a54fc107a08e47f86a273228fd\": container with ID starting with f835458bf8a52fa90ae1eb11e92f9a8d6cf760a54fc107a08e47f86a273228fd not found: ID does not exist" containerID="f835458bf8a52fa90ae1eb11e92f9a8d6cf760a54fc107a08e47f86a273228fd" Apr 24 14:28:39.337102 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.337052 2571 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f835458bf8a52fa90ae1eb11e92f9a8d6cf760a54fc107a08e47f86a273228fd"} err="failed to get container status \"f835458bf8a52fa90ae1eb11e92f9a8d6cf760a54fc107a08e47f86a273228fd\": rpc error: code = NotFound desc = could not find container \"f835458bf8a52fa90ae1eb11e92f9a8d6cf760a54fc107a08e47f86a273228fd\": container with ID starting with f835458bf8a52fa90ae1eb11e92f9a8d6cf760a54fc107a08e47f86a273228fd not found: ID does not exist" Apr 24 14:28:39.337102 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.337065 2571 scope.go:117] "RemoveContainer" containerID="3ef0433ea7ee86f8a32757a8e2a839fb11a592629b07f8478588a9949d88e068" Apr 24 14:28:39.337241 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:28:39.337225 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ef0433ea7ee86f8a32757a8e2a839fb11a592629b07f8478588a9949d88e068\": container with ID starting with 3ef0433ea7ee86f8a32757a8e2a839fb11a592629b07f8478588a9949d88e068 not found: ID does not exist" containerID="3ef0433ea7ee86f8a32757a8e2a839fb11a592629b07f8478588a9949d88e068" Apr 24 14:28:39.337284 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.337244 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ef0433ea7ee86f8a32757a8e2a839fb11a592629b07f8478588a9949d88e068"} err="failed to get container status \"3ef0433ea7ee86f8a32757a8e2a839fb11a592629b07f8478588a9949d88e068\": rpc error: code = NotFound desc = could not find container \"3ef0433ea7ee86f8a32757a8e2a839fb11a592629b07f8478588a9949d88e068\": container with ID starting with 3ef0433ea7ee86f8a32757a8e2a839fb11a592629b07f8478588a9949d88e068 not found: ID does not exist" Apr 24 14:28:39.337284 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.337259 2571 scope.go:117] "RemoveContainer" containerID="906891c8ea3bf3d3d6e5e126edb8625f100ab5c6cd1225149cc307591690fd72" Apr 24 14:28:39.337486 ip-10-0-137-95 
kubenswrapper[2571]: E0424 14:28:39.337468 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"906891c8ea3bf3d3d6e5e126edb8625f100ab5c6cd1225149cc307591690fd72\": container with ID starting with 906891c8ea3bf3d3d6e5e126edb8625f100ab5c6cd1225149cc307591690fd72 not found: ID does not exist" containerID="906891c8ea3bf3d3d6e5e126edb8625f100ab5c6cd1225149cc307591690fd72" Apr 24 14:28:39.337530 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.337492 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"906891c8ea3bf3d3d6e5e126edb8625f100ab5c6cd1225149cc307591690fd72"} err="failed to get container status \"906891c8ea3bf3d3d6e5e126edb8625f100ab5c6cd1225149cc307591690fd72\": rpc error: code = NotFound desc = could not find container \"906891c8ea3bf3d3d6e5e126edb8625f100ab5c6cd1225149cc307591690fd72\": container with ID starting with 906891c8ea3bf3d3d6e5e126edb8625f100ab5c6cd1225149cc307591690fd72 not found: ID does not exist" Apr 24 14:28:39.337530 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.337506 2571 scope.go:117] "RemoveContainer" containerID="fc0db63e36592a473ae51fefb4d1dc0c743f025411d3858c91157d7c11bb14eb" Apr 24 14:28:39.337732 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:28:39.337713 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc0db63e36592a473ae51fefb4d1dc0c743f025411d3858c91157d7c11bb14eb\": container with ID starting with fc0db63e36592a473ae51fefb4d1dc0c743f025411d3858c91157d7c11bb14eb not found: ID does not exist" containerID="fc0db63e36592a473ae51fefb4d1dc0c743f025411d3858c91157d7c11bb14eb" Apr 24 14:28:39.337772 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.337735 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc0db63e36592a473ae51fefb4d1dc0c743f025411d3858c91157d7c11bb14eb"} err="failed to 
get container status \"fc0db63e36592a473ae51fefb4d1dc0c743f025411d3858c91157d7c11bb14eb\": rpc error: code = NotFound desc = could not find container \"fc0db63e36592a473ae51fefb4d1dc0c743f025411d3858c91157d7c11bb14eb\": container with ID starting with fc0db63e36592a473ae51fefb4d1dc0c743f025411d3858c91157d7c11bb14eb not found: ID does not exist" Apr 24 14:28:39.337772 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.337747 2571 scope.go:117] "RemoveContainer" containerID="a0969595f195037d0a6e7afce7b63687325dd6342c2ecd4760aa28248d514315" Apr 24 14:28:39.337959 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:28:39.337937 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0969595f195037d0a6e7afce7b63687325dd6342c2ecd4760aa28248d514315\": container with ID starting with a0969595f195037d0a6e7afce7b63687325dd6342c2ecd4760aa28248d514315 not found: ID does not exist" containerID="a0969595f195037d0a6e7afce7b63687325dd6342c2ecd4760aa28248d514315" Apr 24 14:28:39.338136 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.337964 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0969595f195037d0a6e7afce7b63687325dd6342c2ecd4760aa28248d514315"} err="failed to get container status \"a0969595f195037d0a6e7afce7b63687325dd6342c2ecd4760aa28248d514315\": rpc error: code = NotFound desc = could not find container \"a0969595f195037d0a6e7afce7b63687325dd6342c2ecd4760aa28248d514315\": container with ID starting with a0969595f195037d0a6e7afce7b63687325dd6342c2ecd4760aa28248d514315 not found: ID does not exist" Apr 24 14:28:39.338136 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.338007 2571 scope.go:117] "RemoveContainer" containerID="e61b782d10492e5085915bc92bb1ad599146b9a57a0b2d5bf11c820dd9928100" Apr 24 14:28:39.338247 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.338191 2571 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e61b782d10492e5085915bc92bb1ad599146b9a57a0b2d5bf11c820dd9928100"} err="failed to get container status \"e61b782d10492e5085915bc92bb1ad599146b9a57a0b2d5bf11c820dd9928100\": rpc error: code = NotFound desc = could not find container \"e61b782d10492e5085915bc92bb1ad599146b9a57a0b2d5bf11c820dd9928100\": container with ID starting with e61b782d10492e5085915bc92bb1ad599146b9a57a0b2d5bf11c820dd9928100 not found: ID does not exist" Apr 24 14:28:39.338247 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.338208 2571 scope.go:117] "RemoveContainer" containerID="47b90b223607467342b53bf0d5458d852c06b81e71c3def862915ce2fc4836f3" Apr 24 14:28:39.338410 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.338391 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47b90b223607467342b53bf0d5458d852c06b81e71c3def862915ce2fc4836f3"} err="failed to get container status \"47b90b223607467342b53bf0d5458d852c06b81e71c3def862915ce2fc4836f3\": rpc error: code = NotFound desc = could not find container \"47b90b223607467342b53bf0d5458d852c06b81e71c3def862915ce2fc4836f3\": container with ID starting with 47b90b223607467342b53bf0d5458d852c06b81e71c3def862915ce2fc4836f3 not found: ID does not exist" Apr 24 14:28:39.338480 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.338413 2571 scope.go:117] "RemoveContainer" containerID="f835458bf8a52fa90ae1eb11e92f9a8d6cf760a54fc107a08e47f86a273228fd" Apr 24 14:28:39.338626 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.338610 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f835458bf8a52fa90ae1eb11e92f9a8d6cf760a54fc107a08e47f86a273228fd"} err="failed to get container status \"f835458bf8a52fa90ae1eb11e92f9a8d6cf760a54fc107a08e47f86a273228fd\": rpc error: code = NotFound desc = could not find container \"f835458bf8a52fa90ae1eb11e92f9a8d6cf760a54fc107a08e47f86a273228fd\": container with ID starting with 
f835458bf8a52fa90ae1eb11e92f9a8d6cf760a54fc107a08e47f86a273228fd not found: ID does not exist" Apr 24 14:28:39.338677 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.338626 2571 scope.go:117] "RemoveContainer" containerID="3ef0433ea7ee86f8a32757a8e2a839fb11a592629b07f8478588a9949d88e068" Apr 24 14:28:39.338814 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.338797 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ef0433ea7ee86f8a32757a8e2a839fb11a592629b07f8478588a9949d88e068"} err="failed to get container status \"3ef0433ea7ee86f8a32757a8e2a839fb11a592629b07f8478588a9949d88e068\": rpc error: code = NotFound desc = could not find container \"3ef0433ea7ee86f8a32757a8e2a839fb11a592629b07f8478588a9949d88e068\": container with ID starting with 3ef0433ea7ee86f8a32757a8e2a839fb11a592629b07f8478588a9949d88e068 not found: ID does not exist" Apr 24 14:28:39.338814 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.338814 2571 scope.go:117] "RemoveContainer" containerID="906891c8ea3bf3d3d6e5e126edb8625f100ab5c6cd1225149cc307591690fd72" Apr 24 14:28:39.339036 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.339018 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"906891c8ea3bf3d3d6e5e126edb8625f100ab5c6cd1225149cc307591690fd72"} err="failed to get container status \"906891c8ea3bf3d3d6e5e126edb8625f100ab5c6cd1225149cc307591690fd72\": rpc error: code = NotFound desc = could not find container \"906891c8ea3bf3d3d6e5e126edb8625f100ab5c6cd1225149cc307591690fd72\": container with ID starting with 906891c8ea3bf3d3d6e5e126edb8625f100ab5c6cd1225149cc307591690fd72 not found: ID does not exist" Apr 24 14:28:39.339109 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.339038 2571 scope.go:117] "RemoveContainer" containerID="fc0db63e36592a473ae51fefb4d1dc0c743f025411d3858c91157d7c11bb14eb" Apr 24 14:28:39.339254 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.339236 2571 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc0db63e36592a473ae51fefb4d1dc0c743f025411d3858c91157d7c11bb14eb"} err="failed to get container status \"fc0db63e36592a473ae51fefb4d1dc0c743f025411d3858c91157d7c11bb14eb\": rpc error: code = NotFound desc = could not find container \"fc0db63e36592a473ae51fefb4d1dc0c743f025411d3858c91157d7c11bb14eb\": container with ID starting with fc0db63e36592a473ae51fefb4d1dc0c743f025411d3858c91157d7c11bb14eb not found: ID does not exist" Apr 24 14:28:39.339294 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.339255 2571 scope.go:117] "RemoveContainer" containerID="a0969595f195037d0a6e7afce7b63687325dd6342c2ecd4760aa28248d514315" Apr 24 14:28:39.339459 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.339442 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0969595f195037d0a6e7afce7b63687325dd6342c2ecd4760aa28248d514315"} err="failed to get container status \"a0969595f195037d0a6e7afce7b63687325dd6342c2ecd4760aa28248d514315\": rpc error: code = NotFound desc = could not find container \"a0969595f195037d0a6e7afce7b63687325dd6342c2ecd4760aa28248d514315\": container with ID starting with a0969595f195037d0a6e7afce7b63687325dd6342c2ecd4760aa28248d514315 not found: ID does not exist" Apr 24 14:28:39.354844 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.354775 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\"" Apr 24 14:28:39.355020 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.354970 2571 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-config-out\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\"" Apr 24 
14:28:39.355020 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.355016 2571 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-web-config\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\"" Apr 24 14:28:39.355175 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.355033 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-secret-alertmanager-main-tls\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\"" Apr 24 14:28:39.355175 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.355048 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\"" Apr 24 14:28:39.355175 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.355062 2571 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-tls-assets\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\"" Apr 24 14:28:39.355175 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.355075 2571 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-cluster-tls-config\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\"" Apr 24 14:28:39.355175 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.355088 2571 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-alertmanager-main-db\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\"" Apr 24 14:28:39.355175 ip-10-0-137-95 kubenswrapper[2571]: I0424 
14:28:39.355103 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ltf29\" (UniqueName: \"kubernetes.io/projected/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-kube-api-access-ltf29\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\""
Apr 24 14:28:39.355175 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.355120 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\""
Apr 24 14:28:39.355175 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.355135 2571 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-config-volume\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\""
Apr 24 14:28:39.355175 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.355150 2571 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dfa4cdab-b9bf-4984-b99d-6aca2dc55292-metrics-client-ca\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\""
Apr 24 14:28:39.356109 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.356090 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 14:28:39.356367 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.356356 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dfa4cdab-b9bf-4984-b99d-6aca2dc55292" containerName="config-reloader"
Apr 24 14:28:39.356426 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.356369 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa4cdab-b9bf-4984-b99d-6aca2dc55292" containerName="config-reloader"
Apr 24 14:28:39.356426 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.356380 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dfa4cdab-b9bf-4984-b99d-6aca2dc55292" containerName="kube-rbac-proxy-metric"
Apr 24 14:28:39.356426 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.356386 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa4cdab-b9bf-4984-b99d-6aca2dc55292" containerName="kube-rbac-proxy-metric"
Apr 24 14:28:39.356426 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.356394 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dfa4cdab-b9bf-4984-b99d-6aca2dc55292" containerName="init-config-reloader"
Apr 24 14:28:39.356426 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.356399 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa4cdab-b9bf-4984-b99d-6aca2dc55292" containerName="init-config-reloader"
Apr 24 14:28:39.356426 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.356410 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dfa4cdab-b9bf-4984-b99d-6aca2dc55292" containerName="kube-rbac-proxy-web"
Apr 24 14:28:39.356426 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.356416 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa4cdab-b9bf-4984-b99d-6aca2dc55292" containerName="kube-rbac-proxy-web"
Apr 24 14:28:39.356426 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.356425 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dfa4cdab-b9bf-4984-b99d-6aca2dc55292" containerName="kube-rbac-proxy"
Apr 24 14:28:39.356426 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.356431 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa4cdab-b9bf-4984-b99d-6aca2dc55292" containerName="kube-rbac-proxy"
Apr 24 14:28:39.356830 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.356439 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00" containerName="registry"
Apr 24 14:28:39.356830 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.356444 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00" containerName="registry"
Apr 24 14:28:39.356830 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.356453 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dfa4cdab-b9bf-4984-b99d-6aca2dc55292" containerName="alertmanager"
Apr 24 14:28:39.356830 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.356458 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa4cdab-b9bf-4984-b99d-6aca2dc55292" containerName="alertmanager"
Apr 24 14:28:39.356830 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.356463 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dfa4cdab-b9bf-4984-b99d-6aca2dc55292" containerName="prom-label-proxy"
Apr 24 14:28:39.356830 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.356469 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa4cdab-b9bf-4984-b99d-6aca2dc55292" containerName="prom-label-proxy"
Apr 24 14:28:39.356830 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.356509 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="dfa4cdab-b9bf-4984-b99d-6aca2dc55292" containerName="kube-rbac-proxy"
Apr 24 14:28:39.356830 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.356517 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="dfa4cdab-b9bf-4984-b99d-6aca2dc55292" containerName="config-reloader"
Apr 24 14:28:39.356830 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.356525 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="0fbd6e7d-aa77-4475-bd4c-8a0d6edc0a00" containerName="registry"
Apr 24 14:28:39.356830 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.356531 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="dfa4cdab-b9bf-4984-b99d-6aca2dc55292" containerName="kube-rbac-proxy-metric"
Apr 24 14:28:39.356830 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.356537 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="dfa4cdab-b9bf-4984-b99d-6aca2dc55292" containerName="prom-label-proxy"
Apr 24 14:28:39.356830 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.356543 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="dfa4cdab-b9bf-4984-b99d-6aca2dc55292" containerName="alertmanager"
Apr 24 14:28:39.356830 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.356551 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="dfa4cdab-b9bf-4984-b99d-6aca2dc55292" containerName="kube-rbac-proxy-web"
Apr 24 14:28:39.364505 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.364465 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.366755 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.366736 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 24 14:28:39.366857 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.366754 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 24 14:28:39.366857 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.366736 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 24 14:28:39.366967 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.366938 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-lcd95\""
Apr 24 14:28:39.366967 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.366959 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 24 14:28:39.367095 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.367010 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 24 14:28:39.367131 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.367107 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 24 14:28:39.367350 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.367336 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 24 14:28:39.367647 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.367630 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 24 14:28:39.372649 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.372630 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 24 14:28:39.398607 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.398570 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 14:28:39.456117 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.456087 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9a8c48b3-0448-4b17-b8a8-5e51dff99526-web-config\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.456117 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.456118 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf6dc\" (UniqueName: \"kubernetes.io/projected/9a8c48b3-0448-4b17-b8a8-5e51dff99526-kube-api-access-tf6dc\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.456306 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.456138 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9a8c48b3-0448-4b17-b8a8-5e51dff99526-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.456306 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.456188 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/9a8c48b3-0448-4b17-b8a8-5e51dff99526-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.456306 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.456214 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9a8c48b3-0448-4b17-b8a8-5e51dff99526-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.456306 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.456242 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9a8c48b3-0448-4b17-b8a8-5e51dff99526-tls-assets\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.456306 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.456260 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9a8c48b3-0448-4b17-b8a8-5e51dff99526-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.456306 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.456277 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9a8c48b3-0448-4b17-b8a8-5e51dff99526-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.456306 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.456294 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a8c48b3-0448-4b17-b8a8-5e51dff99526-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.456534 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.456312 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9a8c48b3-0448-4b17-b8a8-5e51dff99526-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.456534 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.456357 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9a8c48b3-0448-4b17-b8a8-5e51dff99526-config-volume\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.456534 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.456388 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9a8c48b3-0448-4b17-b8a8-5e51dff99526-config-out\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.456534 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.456415 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9a8c48b3-0448-4b17-b8a8-5e51dff99526-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.557470 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.557372 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9a8c48b3-0448-4b17-b8a8-5e51dff99526-tls-assets\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.557470 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.557422 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9a8c48b3-0448-4b17-b8a8-5e51dff99526-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.557470 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.557452 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9a8c48b3-0448-4b17-b8a8-5e51dff99526-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.557741 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.557483 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a8c48b3-0448-4b17-b8a8-5e51dff99526-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.557741 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.557511 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9a8c48b3-0448-4b17-b8a8-5e51dff99526-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.557741 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.557550 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9a8c48b3-0448-4b17-b8a8-5e51dff99526-config-volume\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.557741 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.557684 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9a8c48b3-0448-4b17-b8a8-5e51dff99526-config-out\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.557741 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.557729 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9a8c48b3-0448-4b17-b8a8-5e51dff99526-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.557972 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.557770 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9a8c48b3-0448-4b17-b8a8-5e51dff99526-web-config\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.557972 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.557797 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tf6dc\" (UniqueName: \"kubernetes.io/projected/9a8c48b3-0448-4b17-b8a8-5e51dff99526-kube-api-access-tf6dc\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.557972 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.557834 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9a8c48b3-0448-4b17-b8a8-5e51dff99526-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.557972 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.557869 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/9a8c48b3-0448-4b17-b8a8-5e51dff99526-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.557972 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.557916 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9a8c48b3-0448-4b17-b8a8-5e51dff99526-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.558369 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.558347 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9a8c48b3-0448-4b17-b8a8-5e51dff99526-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.558485 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.558459 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a8c48b3-0448-4b17-b8a8-5e51dff99526-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.559183 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.558712 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9a8c48b3-0448-4b17-b8a8-5e51dff99526-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.560732 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.560677 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9a8c48b3-0448-4b17-b8a8-5e51dff99526-config-out\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.560732 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.560690 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9a8c48b3-0448-4b17-b8a8-5e51dff99526-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.560868 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.560783 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9a8c48b3-0448-4b17-b8a8-5e51dff99526-config-volume\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.560868 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.560857 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9a8c48b3-0448-4b17-b8a8-5e51dff99526-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.561165 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.561144 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9a8c48b3-0448-4b17-b8a8-5e51dff99526-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.561252 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.561232 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9a8c48b3-0448-4b17-b8a8-5e51dff99526-tls-assets\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.561292 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.561240 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9a8c48b3-0448-4b17-b8a8-5e51dff99526-web-config\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.561454 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.561439 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9a8c48b3-0448-4b17-b8a8-5e51dff99526-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.562421 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.562403 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/9a8c48b3-0448-4b17-b8a8-5e51dff99526-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.567612 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.567595 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf6dc\" (UniqueName: \"kubernetes.io/projected/9a8c48b3-0448-4b17-b8a8-5e51dff99526-kube-api-access-tf6dc\") pod \"alertmanager-main-0\" (UID: \"9a8c48b3-0448-4b17-b8a8-5e51dff99526\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.674352 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.674324 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:28:39.794622 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:39.794593 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 14:28:39.797642 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:28:39.797609 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a8c48b3_0448_4b17_b8a8_5e51dff99526.slice/crio-d5469ee2ebf0497771cf516d9ea7a65f94e73f08e193de6b04a9bc237d73a280 WatchSource:0}: Error finding container d5469ee2ebf0497771cf516d9ea7a65f94e73f08e193de6b04a9bc237d73a280: Status 404 returned error can't find the container with id d5469ee2ebf0497771cf516d9ea7a65f94e73f08e193de6b04a9bc237d73a280
Apr 24 14:28:40.290823 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:40.290790 2571 generic.go:358] "Generic (PLEG): container finished" podID="9a8c48b3-0448-4b17-b8a8-5e51dff99526" containerID="966b3a15a7ab62fb4c989416b1b835bbf3aae76426efc0f84a4c1b3446171131" exitCode=0
Apr 24 14:28:40.291269 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:40.290885 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9a8c48b3-0448-4b17-b8a8-5e51dff99526","Type":"ContainerDied","Data":"966b3a15a7ab62fb4c989416b1b835bbf3aae76426efc0f84a4c1b3446171131"}
Apr 24 14:28:40.291269 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:40.290924 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9a8c48b3-0448-4b17-b8a8-5e51dff99526","Type":"ContainerStarted","Data":"d5469ee2ebf0497771cf516d9ea7a65f94e73f08e193de6b04a9bc237d73a280"}
Apr 24 14:28:40.462634 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:40.462541 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfa4cdab-b9bf-4984-b99d-6aca2dc55292" path="/var/lib/kubelet/pods/dfa4cdab-b9bf-4984-b99d-6aca2dc55292/volumes"
Apr 24 14:28:41.297065 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:41.297028 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9a8c48b3-0448-4b17-b8a8-5e51dff99526","Type":"ContainerStarted","Data":"565564318d1c25ad390c6564c2747008290f4226b86db61221311588b43a124a"}
Apr 24 14:28:41.297065 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:41.297063 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9a8c48b3-0448-4b17-b8a8-5e51dff99526","Type":"ContainerStarted","Data":"834308ab92716906fbd3f0e5e06de7f8df10942bc20832f08e9d4f766aa25996"}
Apr 24 14:28:41.297065 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:41.297073 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9a8c48b3-0448-4b17-b8a8-5e51dff99526","Type":"ContainerStarted","Data":"3454ed56a0be44709af8b7b81cab704954b94e922508e628324115f8317f29dc"}
Apr 24 14:28:41.297490 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:41.297081 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9a8c48b3-0448-4b17-b8a8-5e51dff99526","Type":"ContainerStarted","Data":"22d74f08824874b8ef13d619d75ec57eee962ca80abeac8bde57e3e9ae52be27"}
Apr 24 14:28:41.297490 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:41.297090 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9a8c48b3-0448-4b17-b8a8-5e51dff99526","Type":"ContainerStarted","Data":"2f502b9609f10a1b2fa12d82741ee5a99d2aaa3ee4a648fc6c62a727e8394d3e"}
Apr 24 14:28:41.297490 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:41.297098 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9a8c48b3-0448-4b17-b8a8-5e51dff99526","Type":"ContainerStarted","Data":"7698a88a318133d183c63ba14edf389c164e02369a347d0e415b7ce533afef6f"}
Apr 24 14:28:41.324825 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:41.324774 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.3247583880000002 podStartE2EDuration="2.324758388s" podCreationTimestamp="2026-04-24 14:28:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:28:41.322832956 +0000 UTC m=+257.452810516" watchObservedRunningTime="2026-04-24 14:28:41.324758388 +0000 UTC m=+257.454735947"
Apr 24 14:28:42.014834 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:42.014803 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-846d597597-fhql6"]
Apr 24 14:28:42.018427 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:42.018406 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-846d597597-fhql6"
Apr 24 14:28:42.020405 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:42.020371 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 24 14:28:42.020570 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:42.020549 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 24 14:28:42.020700 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:42.020677 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-shbjn\""
Apr 24 14:28:42.020791 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:42.020711 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 24 14:28:42.020791 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:42.020717 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 24 14:28:42.020864 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:42.020685 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 24 14:28:42.026341 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:42.026312 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 24 14:28:42.032390 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:42.032361 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-846d597597-fhql6"]
Apr 24 14:28:42.081532 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:42.081489 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d92925ff-e3a0-4e32-85e8-5e09c60d3530-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-846d597597-fhql6\" (UID: \"d92925ff-e3a0-4e32-85e8-5e09c60d3530\") " pod="openshift-monitoring/telemeter-client-846d597597-fhql6"
Apr 24 14:28:42.081532 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:42.081541 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6g84\" (UniqueName: \"kubernetes.io/projected/d92925ff-e3a0-4e32-85e8-5e09c60d3530-kube-api-access-w6g84\") pod \"telemeter-client-846d597597-fhql6\" (UID: \"d92925ff-e3a0-4e32-85e8-5e09c60d3530\") " pod="openshift-monitoring/telemeter-client-846d597597-fhql6"
Apr 24 14:28:42.081776 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:42.081564 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d92925ff-e3a0-4e32-85e8-5e09c60d3530-secret-telemeter-client\") pod \"telemeter-client-846d597597-fhql6\" (UID: \"d92925ff-e3a0-4e32-85e8-5e09c60d3530\") " pod="openshift-monitoring/telemeter-client-846d597597-fhql6"
Apr 24 14:28:42.081776 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:42.081638 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d92925ff-e3a0-4e32-85e8-5e09c60d3530-telemeter-trusted-ca-bundle\") pod \"telemeter-client-846d597597-fhql6\" (UID: \"d92925ff-e3a0-4e32-85e8-5e09c60d3530\") " pod="openshift-monitoring/telemeter-client-846d597597-fhql6"
Apr 24 14:28:42.081776 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:42.081687 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d92925ff-e3a0-4e32-85e8-5e09c60d3530-serving-certs-ca-bundle\") pod \"telemeter-client-846d597597-fhql6\" (UID: \"d92925ff-e3a0-4e32-85e8-5e09c60d3530\") " pod="openshift-monitoring/telemeter-client-846d597597-fhql6"
Apr 24 14:28:42.081776 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:42.081712 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d92925ff-e3a0-4e32-85e8-5e09c60d3530-federate-client-tls\") pod \"telemeter-client-846d597597-fhql6\" (UID: \"d92925ff-e3a0-4e32-85e8-5e09c60d3530\") " pod="openshift-monitoring/telemeter-client-846d597597-fhql6"
Apr 24 14:28:42.081776 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:42.081769 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d92925ff-e3a0-4e32-85e8-5e09c60d3530-telemeter-client-tls\") pod \"telemeter-client-846d597597-fhql6\" (UID: \"d92925ff-e3a0-4e32-85e8-5e09c60d3530\") " pod="openshift-monitoring/telemeter-client-846d597597-fhql6"
Apr 24 14:28:42.081975 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:42.081799 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d92925ff-e3a0-4e32-85e8-5e09c60d3530-metrics-client-ca\") pod \"telemeter-client-846d597597-fhql6\" (UID: \"d92925ff-e3a0-4e32-85e8-5e09c60d3530\") " pod="openshift-monitoring/telemeter-client-846d597597-fhql6"
Apr 24 14:28:42.182969 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:42.182938 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d92925ff-e3a0-4e32-85e8-5e09c60d3530-telemeter-client-tls\") pod \"telemeter-client-846d597597-fhql6\" (UID: \"d92925ff-e3a0-4e32-85e8-5e09c60d3530\") " pod="openshift-monitoring/telemeter-client-846d597597-fhql6"
Apr 24 14:28:42.182969 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:42.182973 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d92925ff-e3a0-4e32-85e8-5e09c60d3530-metrics-client-ca\") pod \"telemeter-client-846d597597-fhql6\" (UID: \"d92925ff-e3a0-4e32-85e8-5e09c60d3530\") " pod="openshift-monitoring/telemeter-client-846d597597-fhql6"
Apr 24 14:28:42.183190 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:42.183044 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d92925ff-e3a0-4e32-85e8-5e09c60d3530-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-846d597597-fhql6\" (UID: \"d92925ff-e3a0-4e32-85e8-5e09c60d3530\") " pod="openshift-monitoring/telemeter-client-846d597597-fhql6"
Apr 24 14:28:42.183190 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:42.183065 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w6g84\" (UniqueName: \"kubernetes.io/projected/d92925ff-e3a0-4e32-85e8-5e09c60d3530-kube-api-access-w6g84\") pod \"telemeter-client-846d597597-fhql6\" (UID: \"d92925ff-e3a0-4e32-85e8-5e09c60d3530\") " pod="openshift-monitoring/telemeter-client-846d597597-fhql6"
Apr 24 14:28:42.183190 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:42.183094 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d92925ff-e3a0-4e32-85e8-5e09c60d3530-secret-telemeter-client\") pod \"telemeter-client-846d597597-fhql6\" (UID: \"d92925ff-e3a0-4e32-85e8-5e09c60d3530\") " pod="openshift-monitoring/telemeter-client-846d597597-fhql6"
Apr 24 14:28:42.183190 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:42.183152 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d92925ff-e3a0-4e32-85e8-5e09c60d3530-telemeter-trusted-ca-bundle\") pod \"telemeter-client-846d597597-fhql6\" (UID: \"d92925ff-e3a0-4e32-85e8-5e09c60d3530\") " pod="openshift-monitoring/telemeter-client-846d597597-fhql6"
Apr 24 14:28:42.183190 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:42.183176 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d92925ff-e3a0-4e32-85e8-5e09c60d3530-serving-certs-ca-bundle\") pod \"telemeter-client-846d597597-fhql6\" (UID: \"d92925ff-e3a0-4e32-85e8-5e09c60d3530\") " pod="openshift-monitoring/telemeter-client-846d597597-fhql6"
Apr 24 14:28:42.183435 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:42.183195 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d92925ff-e3a0-4e32-85e8-5e09c60d3530-federate-client-tls\") pod \"telemeter-client-846d597597-fhql6\" (UID: \"d92925ff-e3a0-4e32-85e8-5e09c60d3530\") " pod="openshift-monitoring/telemeter-client-846d597597-fhql6"
Apr 24 14:28:42.183936 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:42.183906 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d92925ff-e3a0-4e32-85e8-5e09c60d3530-serving-certs-ca-bundle\") pod \"telemeter-client-846d597597-fhql6\" (UID: \"d92925ff-e3a0-4e32-85e8-5e09c60d3530\") " pod="openshift-monitoring/telemeter-client-846d597597-fhql6"
Apr 24 14:28:42.183936 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:42.183929 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d92925ff-e3a0-4e32-85e8-5e09c60d3530-metrics-client-ca\") pod \"telemeter-client-846d597597-fhql6\" (UID:
\"d92925ff-e3a0-4e32-85e8-5e09c60d3530\") " pod="openshift-monitoring/telemeter-client-846d597597-fhql6" Apr 24 14:28:42.184292 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:42.184268 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d92925ff-e3a0-4e32-85e8-5e09c60d3530-telemeter-trusted-ca-bundle\") pod \"telemeter-client-846d597597-fhql6\" (UID: \"d92925ff-e3a0-4e32-85e8-5e09c60d3530\") " pod="openshift-monitoring/telemeter-client-846d597597-fhql6" Apr 24 14:28:42.185712 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:42.185689 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d92925ff-e3a0-4e32-85e8-5e09c60d3530-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-846d597597-fhql6\" (UID: \"d92925ff-e3a0-4e32-85e8-5e09c60d3530\") " pod="openshift-monitoring/telemeter-client-846d597597-fhql6" Apr 24 14:28:42.185815 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:42.185789 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d92925ff-e3a0-4e32-85e8-5e09c60d3530-secret-telemeter-client\") pod \"telemeter-client-846d597597-fhql6\" (UID: \"d92925ff-e3a0-4e32-85e8-5e09c60d3530\") " pod="openshift-monitoring/telemeter-client-846d597597-fhql6" Apr 24 14:28:42.185945 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:42.185929 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d92925ff-e3a0-4e32-85e8-5e09c60d3530-federate-client-tls\") pod \"telemeter-client-846d597597-fhql6\" (UID: \"d92925ff-e3a0-4e32-85e8-5e09c60d3530\") " pod="openshift-monitoring/telemeter-client-846d597597-fhql6" Apr 24 14:28:42.186168 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:42.186150 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d92925ff-e3a0-4e32-85e8-5e09c60d3530-telemeter-client-tls\") pod \"telemeter-client-846d597597-fhql6\" (UID: \"d92925ff-e3a0-4e32-85e8-5e09c60d3530\") " pod="openshift-monitoring/telemeter-client-846d597597-fhql6" Apr 24 14:28:42.194210 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:42.194187 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6g84\" (UniqueName: \"kubernetes.io/projected/d92925ff-e3a0-4e32-85e8-5e09c60d3530-kube-api-access-w6g84\") pod \"telemeter-client-846d597597-fhql6\" (UID: \"d92925ff-e3a0-4e32-85e8-5e09c60d3530\") " pod="openshift-monitoring/telemeter-client-846d597597-fhql6" Apr 24 14:28:42.328836 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:42.328755 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-846d597597-fhql6" Apr 24 14:28:42.461013 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:42.460972 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-846d597597-fhql6"] Apr 24 14:28:42.462262 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:28:42.462235 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd92925ff_e3a0_4e32_85e8_5e09c60d3530.slice/crio-ce2845687b7613b5e974abbcd7cfc1f184a1560721387b46e7771f9198093abb WatchSource:0}: Error finding container ce2845687b7613b5e974abbcd7cfc1f184a1560721387b46e7771f9198093abb: Status 404 returned error can't find the container with id ce2845687b7613b5e974abbcd7cfc1f184a1560721387b46e7771f9198093abb Apr 24 14:28:43.304299 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:43.304255 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-846d597597-fhql6" 
event={"ID":"d92925ff-e3a0-4e32-85e8-5e09c60d3530","Type":"ContainerStarted","Data":"ce2845687b7613b5e974abbcd7cfc1f184a1560721387b46e7771f9198093abb"} Apr 24 14:28:44.308577 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:44.308496 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-846d597597-fhql6" event={"ID":"d92925ff-e3a0-4e32-85e8-5e09c60d3530","Type":"ContainerStarted","Data":"591e984eb7f41ae0839d9c884c246c6612dd6a2c26b8e8eef82e2376f00e95ce"} Apr 24 14:28:44.308577 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:44.308531 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-846d597597-fhql6" event={"ID":"d92925ff-e3a0-4e32-85e8-5e09c60d3530","Type":"ContainerStarted","Data":"11924c25da93864717c9c9bbc4e04bcb04da4cf01745023e88099a89328ea5d2"} Apr 24 14:28:44.308577 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:44.308541 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-846d597597-fhql6" event={"ID":"d92925ff-e3a0-4e32-85e8-5e09c60d3530","Type":"ContainerStarted","Data":"322ea74c8fb2d1c50b466b9d2bce9744ca387f8aad6e654603e95712e036d436"} Apr 24 14:28:44.334316 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:44.334267 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-846d597597-fhql6" podStartSLOduration=1.876123703 podStartE2EDuration="3.334254485s" podCreationTimestamp="2026-04-24 14:28:41 +0000 UTC" firstStartedPulling="2026-04-24 14:28:42.463919222 +0000 UTC m=+258.593896764" lastFinishedPulling="2026-04-24 14:28:43.922050009 +0000 UTC m=+260.052027546" observedRunningTime="2026-04-24 14:28:44.333309166 +0000 UTC m=+260.463286726" watchObservedRunningTime="2026-04-24 14:28:44.334254485 +0000 UTC m=+260.464232044" Apr 24 14:28:45.095743 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:45.095703 2571 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-545ffd58-s276g"] Apr 24 14:28:45.099541 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:45.099520 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-545ffd58-s276g" Apr 24 14:28:45.113976 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:45.113943 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-545ffd58-s276g"] Apr 24 14:28:45.213757 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:45.213718 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/93f5f5a2-e7b3-4599-9890-d7ae21212b78-console-serving-cert\") pod \"console-545ffd58-s276g\" (UID: \"93f5f5a2-e7b3-4599-9890-d7ae21212b78\") " pod="openshift-console/console-545ffd58-s276g" Apr 24 14:28:45.213945 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:45.213819 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/93f5f5a2-e7b3-4599-9890-d7ae21212b78-service-ca\") pod \"console-545ffd58-s276g\" (UID: \"93f5f5a2-e7b3-4599-9890-d7ae21212b78\") " pod="openshift-console/console-545ffd58-s276g" Apr 24 14:28:45.213945 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:45.213842 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/93f5f5a2-e7b3-4599-9890-d7ae21212b78-console-config\") pod \"console-545ffd58-s276g\" (UID: \"93f5f5a2-e7b3-4599-9890-d7ae21212b78\") " pod="openshift-console/console-545ffd58-s276g" Apr 24 14:28:45.213945 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:45.213863 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/93f5f5a2-e7b3-4599-9890-d7ae21212b78-oauth-serving-cert\") pod \"console-545ffd58-s276g\" (UID: \"93f5f5a2-e7b3-4599-9890-d7ae21212b78\") " pod="openshift-console/console-545ffd58-s276g" Apr 24 14:28:45.214075 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:45.213941 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/93f5f5a2-e7b3-4599-9890-d7ae21212b78-console-oauth-config\") pod \"console-545ffd58-s276g\" (UID: \"93f5f5a2-e7b3-4599-9890-d7ae21212b78\") " pod="openshift-console/console-545ffd58-s276g" Apr 24 14:28:45.214075 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:45.213969 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93f5f5a2-e7b3-4599-9890-d7ae21212b78-trusted-ca-bundle\") pod \"console-545ffd58-s276g\" (UID: \"93f5f5a2-e7b3-4599-9890-d7ae21212b78\") " pod="openshift-console/console-545ffd58-s276g" Apr 24 14:28:45.214075 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:45.214008 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb7vk\" (UniqueName: \"kubernetes.io/projected/93f5f5a2-e7b3-4599-9890-d7ae21212b78-kube-api-access-qb7vk\") pod \"console-545ffd58-s276g\" (UID: \"93f5f5a2-e7b3-4599-9890-d7ae21212b78\") " pod="openshift-console/console-545ffd58-s276g" Apr 24 14:28:45.314763 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:45.314731 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/93f5f5a2-e7b3-4599-9890-d7ae21212b78-console-serving-cert\") pod \"console-545ffd58-s276g\" (UID: \"93f5f5a2-e7b3-4599-9890-d7ae21212b78\") " pod="openshift-console/console-545ffd58-s276g" Apr 24 14:28:45.315222 ip-10-0-137-95 kubenswrapper[2571]: I0424 
14:28:45.314782 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/93f5f5a2-e7b3-4599-9890-d7ae21212b78-service-ca\") pod \"console-545ffd58-s276g\" (UID: \"93f5f5a2-e7b3-4599-9890-d7ae21212b78\") " pod="openshift-console/console-545ffd58-s276g" Apr 24 14:28:45.315222 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:45.314799 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/93f5f5a2-e7b3-4599-9890-d7ae21212b78-console-config\") pod \"console-545ffd58-s276g\" (UID: \"93f5f5a2-e7b3-4599-9890-d7ae21212b78\") " pod="openshift-console/console-545ffd58-s276g" Apr 24 14:28:45.315222 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:45.314815 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/93f5f5a2-e7b3-4599-9890-d7ae21212b78-oauth-serving-cert\") pod \"console-545ffd58-s276g\" (UID: \"93f5f5a2-e7b3-4599-9890-d7ae21212b78\") " pod="openshift-console/console-545ffd58-s276g" Apr 24 14:28:45.315222 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:45.314841 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/93f5f5a2-e7b3-4599-9890-d7ae21212b78-console-oauth-config\") pod \"console-545ffd58-s276g\" (UID: \"93f5f5a2-e7b3-4599-9890-d7ae21212b78\") " pod="openshift-console/console-545ffd58-s276g" Apr 24 14:28:45.315222 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:45.314855 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93f5f5a2-e7b3-4599-9890-d7ae21212b78-trusted-ca-bundle\") pod \"console-545ffd58-s276g\" (UID: \"93f5f5a2-e7b3-4599-9890-d7ae21212b78\") " pod="openshift-console/console-545ffd58-s276g" Apr 24 14:28:45.315222 
ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:45.314873 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qb7vk\" (UniqueName: \"kubernetes.io/projected/93f5f5a2-e7b3-4599-9890-d7ae21212b78-kube-api-access-qb7vk\") pod \"console-545ffd58-s276g\" (UID: \"93f5f5a2-e7b3-4599-9890-d7ae21212b78\") " pod="openshift-console/console-545ffd58-s276g" Apr 24 14:28:45.315719 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:45.315689 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/93f5f5a2-e7b3-4599-9890-d7ae21212b78-oauth-serving-cert\") pod \"console-545ffd58-s276g\" (UID: \"93f5f5a2-e7b3-4599-9890-d7ae21212b78\") " pod="openshift-console/console-545ffd58-s276g" Apr 24 14:28:45.315719 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:45.315709 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/93f5f5a2-e7b3-4599-9890-d7ae21212b78-console-config\") pod \"console-545ffd58-s276g\" (UID: \"93f5f5a2-e7b3-4599-9890-d7ae21212b78\") " pod="openshift-console/console-545ffd58-s276g" Apr 24 14:28:45.315860 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:45.315844 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93f5f5a2-e7b3-4599-9890-d7ae21212b78-trusted-ca-bundle\") pod \"console-545ffd58-s276g\" (UID: \"93f5f5a2-e7b3-4599-9890-d7ae21212b78\") " pod="openshift-console/console-545ffd58-s276g" Apr 24 14:28:45.317313 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:45.317288 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/93f5f5a2-e7b3-4599-9890-d7ae21212b78-service-ca\") pod \"console-545ffd58-s276g\" (UID: \"93f5f5a2-e7b3-4599-9890-d7ae21212b78\") " pod="openshift-console/console-545ffd58-s276g" Apr 24 
14:28:45.317514 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:45.317496 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/93f5f5a2-e7b3-4599-9890-d7ae21212b78-console-oauth-config\") pod \"console-545ffd58-s276g\" (UID: \"93f5f5a2-e7b3-4599-9890-d7ae21212b78\") " pod="openshift-console/console-545ffd58-s276g" Apr 24 14:28:45.317634 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:45.317620 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/93f5f5a2-e7b3-4599-9890-d7ae21212b78-console-serving-cert\") pod \"console-545ffd58-s276g\" (UID: \"93f5f5a2-e7b3-4599-9890-d7ae21212b78\") " pod="openshift-console/console-545ffd58-s276g" Apr 24 14:28:45.324904 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:45.324878 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb7vk\" (UniqueName: \"kubernetes.io/projected/93f5f5a2-e7b3-4599-9890-d7ae21212b78-kube-api-access-qb7vk\") pod \"console-545ffd58-s276g\" (UID: \"93f5f5a2-e7b3-4599-9890-d7ae21212b78\") " pod="openshift-console/console-545ffd58-s276g" Apr 24 14:28:45.409798 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:45.409750 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-545ffd58-s276g" Apr 24 14:28:45.532232 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:45.532209 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-545ffd58-s276g"] Apr 24 14:28:45.534435 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:28:45.534405 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93f5f5a2_e7b3_4599_9890_d7ae21212b78.slice/crio-b7c67ef261381647e01fdf6cb5dcc8f26f46ec1ec9bb7e161f92eb52365781a6 WatchSource:0}: Error finding container b7c67ef261381647e01fdf6cb5dcc8f26f46ec1ec9bb7e161f92eb52365781a6: Status 404 returned error can't find the container with id b7c67ef261381647e01fdf6cb5dcc8f26f46ec1ec9bb7e161f92eb52365781a6 Apr 24 14:28:46.315562 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:46.315525 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-545ffd58-s276g" event={"ID":"93f5f5a2-e7b3-4599-9890-d7ae21212b78","Type":"ContainerStarted","Data":"712a724037f6956c1e3102ffdabec4b2cc933ab1a6632b738250a651b128bbe9"} Apr 24 14:28:46.315562 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:46.315565 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-545ffd58-s276g" event={"ID":"93f5f5a2-e7b3-4599-9890-d7ae21212b78","Type":"ContainerStarted","Data":"b7c67ef261381647e01fdf6cb5dcc8f26f46ec1ec9bb7e161f92eb52365781a6"} Apr 24 14:28:46.333832 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:46.333788 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-545ffd58-s276g" podStartSLOduration=1.333770954 podStartE2EDuration="1.333770954s" podCreationTimestamp="2026-04-24 14:28:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:28:46.332566587 +0000 UTC m=+262.462544160" 
watchObservedRunningTime="2026-04-24 14:28:46.333770954 +0000 UTC m=+262.463748515" Apr 24 14:28:55.410101 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:55.410053 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-545ffd58-s276g" Apr 24 14:28:55.410101 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:55.410109 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-545ffd58-s276g" Apr 24 14:28:55.414815 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:55.414788 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-545ffd58-s276g" Apr 24 14:28:56.349294 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:56.349263 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-545ffd58-s276g" Apr 24 14:28:56.395242 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:28:56.395209 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7574bf65c9-q6cvt"] Apr 24 14:29:21.417212 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:21.417151 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7574bf65c9-q6cvt" podUID="e172b560-3655-4d67-86da-7599eb980870" containerName="console" containerID="cri-o://02fcfdaf0512f3e58c485d6de04420c1562ed7624236e1eabfcbdb51cd1cb3ae" gracePeriod=15 Apr 24 14:29:21.662510 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:21.662486 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7574bf65c9-q6cvt_e172b560-3655-4d67-86da-7599eb980870/console/0.log" Apr 24 14:29:21.662623 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:21.662549 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7574bf65c9-q6cvt" Apr 24 14:29:21.722605 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:21.722537 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e172b560-3655-4d67-86da-7599eb980870-service-ca\") pod \"e172b560-3655-4d67-86da-7599eb980870\" (UID: \"e172b560-3655-4d67-86da-7599eb980870\") " Apr 24 14:29:21.722605 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:21.722568 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e172b560-3655-4d67-86da-7599eb980870-console-serving-cert\") pod \"e172b560-3655-4d67-86da-7599eb980870\" (UID: \"e172b560-3655-4d67-86da-7599eb980870\") " Apr 24 14:29:21.722605 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:21.722587 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e172b560-3655-4d67-86da-7599eb980870-trusted-ca-bundle\") pod \"e172b560-3655-4d67-86da-7599eb980870\" (UID: \"e172b560-3655-4d67-86da-7599eb980870\") " Apr 24 14:29:21.722605 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:21.722606 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnffc\" (UniqueName: \"kubernetes.io/projected/e172b560-3655-4d67-86da-7599eb980870-kube-api-access-vnffc\") pod \"e172b560-3655-4d67-86da-7599eb980870\" (UID: \"e172b560-3655-4d67-86da-7599eb980870\") " Apr 24 14:29:21.722907 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:21.722630 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e172b560-3655-4d67-86da-7599eb980870-console-oauth-config\") pod \"e172b560-3655-4d67-86da-7599eb980870\" (UID: \"e172b560-3655-4d67-86da-7599eb980870\") " Apr 24 14:29:21.722907 
ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:21.722658 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e172b560-3655-4d67-86da-7599eb980870-oauth-serving-cert\") pod \"e172b560-3655-4d67-86da-7599eb980870\" (UID: \"e172b560-3655-4d67-86da-7599eb980870\") " Apr 24 14:29:21.722907 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:21.722711 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e172b560-3655-4d67-86da-7599eb980870-console-config\") pod \"e172b560-3655-4d67-86da-7599eb980870\" (UID: \"e172b560-3655-4d67-86da-7599eb980870\") " Apr 24 14:29:21.723098 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:21.723012 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e172b560-3655-4d67-86da-7599eb980870-service-ca" (OuterVolumeSpecName: "service-ca") pod "e172b560-3655-4d67-86da-7599eb980870" (UID: "e172b560-3655-4d67-86da-7599eb980870"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:29:21.723165 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:21.723138 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e172b560-3655-4d67-86da-7599eb980870-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e172b560-3655-4d67-86da-7599eb980870" (UID: "e172b560-3655-4d67-86da-7599eb980870"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:29:21.723255 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:21.723232 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e172b560-3655-4d67-86da-7599eb980870-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e172b560-3655-4d67-86da-7599eb980870" (UID: "e172b560-3655-4d67-86da-7599eb980870"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:29:21.723325 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:21.723227 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e172b560-3655-4d67-86da-7599eb980870-console-config" (OuterVolumeSpecName: "console-config") pod "e172b560-3655-4d67-86da-7599eb980870" (UID: "e172b560-3655-4d67-86da-7599eb980870"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:29:21.724842 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:21.724808 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e172b560-3655-4d67-86da-7599eb980870-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e172b560-3655-4d67-86da-7599eb980870" (UID: "e172b560-3655-4d67-86da-7599eb980870"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:29:21.724939 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:21.724847 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e172b560-3655-4d67-86da-7599eb980870-kube-api-access-vnffc" (OuterVolumeSpecName: "kube-api-access-vnffc") pod "e172b560-3655-4d67-86da-7599eb980870" (UID: "e172b560-3655-4d67-86da-7599eb980870"). InnerVolumeSpecName "kube-api-access-vnffc". 
PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 14:29:21.724939 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:21.724862 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e172b560-3655-4d67-86da-7599eb980870-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e172b560-3655-4d67-86da-7599eb980870" (UID: "e172b560-3655-4d67-86da-7599eb980870"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 14:29:21.824081 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:21.824052 2571 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e172b560-3655-4d67-86da-7599eb980870-service-ca\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\""
Apr 24 14:29:21.824081 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:21.824078 2571 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e172b560-3655-4d67-86da-7599eb980870-console-serving-cert\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\""
Apr 24 14:29:21.824081 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:21.824088 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e172b560-3655-4d67-86da-7599eb980870-trusted-ca-bundle\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\""
Apr 24 14:29:21.824282 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:21.824098 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vnffc\" (UniqueName: \"kubernetes.io/projected/e172b560-3655-4d67-86da-7599eb980870-kube-api-access-vnffc\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\""
Apr 24 14:29:21.824282 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:21.824107 2571 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e172b560-3655-4d67-86da-7599eb980870-console-oauth-config\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\""
Apr 24 14:29:21.824282 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:21.824116 2571 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e172b560-3655-4d67-86da-7599eb980870-oauth-serving-cert\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\""
Apr 24 14:29:21.824282 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:21.824125 2571 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e172b560-3655-4d67-86da-7599eb980870-console-config\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\""
Apr 24 14:29:22.422841 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:22.422811 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7574bf65c9-q6cvt_e172b560-3655-4d67-86da-7599eb980870/console/0.log"
Apr 24 14:29:22.423314 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:22.422851 2571 generic.go:358] "Generic (PLEG): container finished" podID="e172b560-3655-4d67-86da-7599eb980870" containerID="02fcfdaf0512f3e58c485d6de04420c1562ed7624236e1eabfcbdb51cd1cb3ae" exitCode=2
Apr 24 14:29:22.423314 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:22.422888 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7574bf65c9-q6cvt" event={"ID":"e172b560-3655-4d67-86da-7599eb980870","Type":"ContainerDied","Data":"02fcfdaf0512f3e58c485d6de04420c1562ed7624236e1eabfcbdb51cd1cb3ae"}
Apr 24 14:29:22.423314 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:22.422924 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7574bf65c9-q6cvt"
Apr 24 14:29:22.423314 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:22.422937 2571 scope.go:117] "RemoveContainer" containerID="02fcfdaf0512f3e58c485d6de04420c1562ed7624236e1eabfcbdb51cd1cb3ae"
Apr 24 14:29:22.423314 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:22.422926 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7574bf65c9-q6cvt" event={"ID":"e172b560-3655-4d67-86da-7599eb980870","Type":"ContainerDied","Data":"2c733f10f6464d260e5a23c31f48bbb1203b164c891a74fa1e74ae52c11a1ecc"}
Apr 24 14:29:22.431681 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:22.431667 2571 scope.go:117] "RemoveContainer" containerID="02fcfdaf0512f3e58c485d6de04420c1562ed7624236e1eabfcbdb51cd1cb3ae"
Apr 24 14:29:22.431910 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:29:22.431892 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02fcfdaf0512f3e58c485d6de04420c1562ed7624236e1eabfcbdb51cd1cb3ae\": container with ID starting with 02fcfdaf0512f3e58c485d6de04420c1562ed7624236e1eabfcbdb51cd1cb3ae not found: ID does not exist" containerID="02fcfdaf0512f3e58c485d6de04420c1562ed7624236e1eabfcbdb51cd1cb3ae"
Apr 24 14:29:22.431952 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:22.431918 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02fcfdaf0512f3e58c485d6de04420c1562ed7624236e1eabfcbdb51cd1cb3ae"} err="failed to get container status \"02fcfdaf0512f3e58c485d6de04420c1562ed7624236e1eabfcbdb51cd1cb3ae\": rpc error: code = NotFound desc = could not find container \"02fcfdaf0512f3e58c485d6de04420c1562ed7624236e1eabfcbdb51cd1cb3ae\": container with ID starting with 02fcfdaf0512f3e58c485d6de04420c1562ed7624236e1eabfcbdb51cd1cb3ae not found: ID does not exist"
Apr 24 14:29:22.443120 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:22.443090 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7574bf65c9-q6cvt"]
Apr 24 14:29:22.447566 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:22.447536 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7574bf65c9-q6cvt"]
Apr 24 14:29:22.460767 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:22.460743 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e172b560-3655-4d67-86da-7599eb980870" path="/var/lib/kubelet/pods/e172b560-3655-4d67-86da-7599eb980870/volumes"
Apr 24 14:29:24.342956 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:24.342924 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mk446_83b693cc-250b-45e7-b205-baf7f0feff6b/console-operator/2.log"
Apr 24 14:29:24.343449 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:24.343258 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mk446_83b693cc-250b-45e7-b205-baf7f0feff6b/console-operator/2.log"
Apr 24 14:29:24.347692 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:24.347670 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7m9zt_a7b9926d-4f53-4532-8669-16af4fc30cfd/ovn-acl-logging/0.log"
Apr 24 14:29:24.348386 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:24.348364 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7m9zt_a7b9926d-4f53-4532-8669-16af4fc30cfd/ovn-acl-logging/0.log"
Apr 24 14:29:24.354772 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:24.354755 2571 kubelet.go:1628] "Image garbage collection succeeded"
Apr 24 14:29:59.394505 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:59.394470 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5f6b8c7668-fgp8q"]
Apr 24 14:29:59.397839 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:59.394826 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e172b560-3655-4d67-86da-7599eb980870" containerName="console"
Apr 24 14:29:59.397839 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:59.394838 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="e172b560-3655-4d67-86da-7599eb980870" containerName="console"
Apr 24 14:29:59.397839 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:59.394910 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="e172b560-3655-4d67-86da-7599eb980870" containerName="console"
Apr 24 14:29:59.398669 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:59.398652 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f6b8c7668-fgp8q"
Apr 24 14:29:59.412880 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:59.412854 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f6b8c7668-fgp8q"]
Apr 24 14:29:59.536745 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:59.536708 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/026e252b-ed46-4489-b5bb-876cda7f5870-console-serving-cert\") pod \"console-5f6b8c7668-fgp8q\" (UID: \"026e252b-ed46-4489-b5bb-876cda7f5870\") " pod="openshift-console/console-5f6b8c7668-fgp8q"
Apr 24 14:29:59.536915 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:59.536762 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/026e252b-ed46-4489-b5bb-876cda7f5870-console-config\") pod \"console-5f6b8c7668-fgp8q\" (UID: \"026e252b-ed46-4489-b5bb-876cda7f5870\") " pod="openshift-console/console-5f6b8c7668-fgp8q"
Apr 24 14:29:59.536915 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:59.536787 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/026e252b-ed46-4489-b5bb-876cda7f5870-oauth-serving-cert\") pod \"console-5f6b8c7668-fgp8q\" (UID: \"026e252b-ed46-4489-b5bb-876cda7f5870\") " pod="openshift-console/console-5f6b8c7668-fgp8q"
Apr 24 14:29:59.536915 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:59.536803 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfvsl\" (UniqueName: \"kubernetes.io/projected/026e252b-ed46-4489-b5bb-876cda7f5870-kube-api-access-nfvsl\") pod \"console-5f6b8c7668-fgp8q\" (UID: \"026e252b-ed46-4489-b5bb-876cda7f5870\") " pod="openshift-console/console-5f6b8c7668-fgp8q"
Apr 24 14:29:59.536915 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:59.536828 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/026e252b-ed46-4489-b5bb-876cda7f5870-console-oauth-config\") pod \"console-5f6b8c7668-fgp8q\" (UID: \"026e252b-ed46-4489-b5bb-876cda7f5870\") " pod="openshift-console/console-5f6b8c7668-fgp8q"
Apr 24 14:29:59.536915 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:59.536876 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/026e252b-ed46-4489-b5bb-876cda7f5870-service-ca\") pod \"console-5f6b8c7668-fgp8q\" (UID: \"026e252b-ed46-4489-b5bb-876cda7f5870\") " pod="openshift-console/console-5f6b8c7668-fgp8q"
Apr 24 14:29:59.537144 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:59.536927 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/026e252b-ed46-4489-b5bb-876cda7f5870-trusted-ca-bundle\") pod \"console-5f6b8c7668-fgp8q\" (UID: \"026e252b-ed46-4489-b5bb-876cda7f5870\") " pod="openshift-console/console-5f6b8c7668-fgp8q"
Apr 24 14:29:59.638285 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:59.638254 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/026e252b-ed46-4489-b5bb-876cda7f5870-trusted-ca-bundle\") pod \"console-5f6b8c7668-fgp8q\" (UID: \"026e252b-ed46-4489-b5bb-876cda7f5870\") " pod="openshift-console/console-5f6b8c7668-fgp8q"
Apr 24 14:29:59.638487 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:59.638305 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/026e252b-ed46-4489-b5bb-876cda7f5870-console-serving-cert\") pod \"console-5f6b8c7668-fgp8q\" (UID: \"026e252b-ed46-4489-b5bb-876cda7f5870\") " pod="openshift-console/console-5f6b8c7668-fgp8q"
Apr 24 14:29:59.638487 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:59.638341 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/026e252b-ed46-4489-b5bb-876cda7f5870-console-config\") pod \"console-5f6b8c7668-fgp8q\" (UID: \"026e252b-ed46-4489-b5bb-876cda7f5870\") " pod="openshift-console/console-5f6b8c7668-fgp8q"
Apr 24 14:29:59.638614 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:59.638492 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/026e252b-ed46-4489-b5bb-876cda7f5870-oauth-serving-cert\") pod \"console-5f6b8c7668-fgp8q\" (UID: \"026e252b-ed46-4489-b5bb-876cda7f5870\") " pod="openshift-console/console-5f6b8c7668-fgp8q"
Apr 24 14:29:59.638614 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:59.638532 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nfvsl\" (UniqueName: \"kubernetes.io/projected/026e252b-ed46-4489-b5bb-876cda7f5870-kube-api-access-nfvsl\") pod \"console-5f6b8c7668-fgp8q\" (UID: \"026e252b-ed46-4489-b5bb-876cda7f5870\") " pod="openshift-console/console-5f6b8c7668-fgp8q"
Apr 24 14:29:59.638715 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:59.638666 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/026e252b-ed46-4489-b5bb-876cda7f5870-console-oauth-config\") pod \"console-5f6b8c7668-fgp8q\" (UID: \"026e252b-ed46-4489-b5bb-876cda7f5870\") " pod="openshift-console/console-5f6b8c7668-fgp8q"
Apr 24 14:29:59.638715 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:59.638701 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/026e252b-ed46-4489-b5bb-876cda7f5870-service-ca\") pod \"console-5f6b8c7668-fgp8q\" (UID: \"026e252b-ed46-4489-b5bb-876cda7f5870\") " pod="openshift-console/console-5f6b8c7668-fgp8q"
Apr 24 14:29:59.639170 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:59.639143 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/026e252b-ed46-4489-b5bb-876cda7f5870-console-config\") pod \"console-5f6b8c7668-fgp8q\" (UID: \"026e252b-ed46-4489-b5bb-876cda7f5870\") " pod="openshift-console/console-5f6b8c7668-fgp8q"
Apr 24 14:29:59.639302 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:59.639150 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/026e252b-ed46-4489-b5bb-876cda7f5870-oauth-serving-cert\") pod \"console-5f6b8c7668-fgp8q\" (UID: \"026e252b-ed46-4489-b5bb-876cda7f5870\") " pod="openshift-console/console-5f6b8c7668-fgp8q"
Apr 24 14:29:59.639371 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:59.639313 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/026e252b-ed46-4489-b5bb-876cda7f5870-service-ca\") pod \"console-5f6b8c7668-fgp8q\" (UID: \"026e252b-ed46-4489-b5bb-876cda7f5870\") " pod="openshift-console/console-5f6b8c7668-fgp8q"
Apr 24 14:29:59.639459 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:59.639436 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/026e252b-ed46-4489-b5bb-876cda7f5870-trusted-ca-bundle\") pod \"console-5f6b8c7668-fgp8q\" (UID: \"026e252b-ed46-4489-b5bb-876cda7f5870\") " pod="openshift-console/console-5f6b8c7668-fgp8q"
Apr 24 14:29:59.641015 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:59.640964 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/026e252b-ed46-4489-b5bb-876cda7f5870-console-serving-cert\") pod \"console-5f6b8c7668-fgp8q\" (UID: \"026e252b-ed46-4489-b5bb-876cda7f5870\") " pod="openshift-console/console-5f6b8c7668-fgp8q"
Apr 24 14:29:59.641125 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:59.641081 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/026e252b-ed46-4489-b5bb-876cda7f5870-console-oauth-config\") pod \"console-5f6b8c7668-fgp8q\" (UID: \"026e252b-ed46-4489-b5bb-876cda7f5870\") " pod="openshift-console/console-5f6b8c7668-fgp8q"
Apr 24 14:29:59.647370 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:59.647314 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfvsl\" (UniqueName: \"kubernetes.io/projected/026e252b-ed46-4489-b5bb-876cda7f5870-kube-api-access-nfvsl\") pod \"console-5f6b8c7668-fgp8q\" (UID: \"026e252b-ed46-4489-b5bb-876cda7f5870\") " pod="openshift-console/console-5f6b8c7668-fgp8q"
Apr 24 14:29:59.709719 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:59.709689 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f6b8c7668-fgp8q"
Apr 24 14:29:59.829666 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:59.829638 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f6b8c7668-fgp8q"]
Apr 24 14:29:59.832105 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:29:59.832077 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod026e252b_ed46_4489_b5bb_876cda7f5870.slice/crio-dd1cd580242e3c4d8a2cb69f772ea558b5309243a901baed54de700ec2b915d4 WatchSource:0}: Error finding container dd1cd580242e3c4d8a2cb69f772ea558b5309243a901baed54de700ec2b915d4: Status 404 returned error can't find the container with id dd1cd580242e3c4d8a2cb69f772ea558b5309243a901baed54de700ec2b915d4
Apr 24 14:29:59.834039 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:29:59.834022 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 14:30:00.529588 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:00.529551 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f6b8c7668-fgp8q" event={"ID":"026e252b-ed46-4489-b5bb-876cda7f5870","Type":"ContainerStarted","Data":"5cb9c7258f4e8be0c7f04ac28d726905e593ecaf8042a338ced9a215a04a86c8"}
Apr 24 14:30:00.529588 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:00.529588 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f6b8c7668-fgp8q" event={"ID":"026e252b-ed46-4489-b5bb-876cda7f5870","Type":"ContainerStarted","Data":"dd1cd580242e3c4d8a2cb69f772ea558b5309243a901baed54de700ec2b915d4"}
Apr 24 14:30:00.548525 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:00.548477 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5f6b8c7668-fgp8q" podStartSLOduration=1.548462749 podStartE2EDuration="1.548462749s" podCreationTimestamp="2026-04-24 14:29:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:30:00.54566163 +0000 UTC m=+336.675639187" watchObservedRunningTime="2026-04-24 14:30:00.548462749 +0000 UTC m=+336.678440309"
Apr 24 14:30:09.710119 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:09.710034 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5f6b8c7668-fgp8q"
Apr 24 14:30:09.710119 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:09.710073 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5f6b8c7668-fgp8q"
Apr 24 14:30:09.715001 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:09.714970 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5f6b8c7668-fgp8q"
Apr 24 14:30:10.562503 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:10.562476 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5f6b8c7668-fgp8q"
Apr 24 14:30:10.626399 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:10.626370 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-545ffd58-s276g"]
Apr 24 14:30:35.648159 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:35.648101 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-545ffd58-s276g" podUID="93f5f5a2-e7b3-4599-9890-d7ae21212b78" containerName="console" containerID="cri-o://712a724037f6956c1e3102ffdabec4b2cc933ab1a6632b738250a651b128bbe9" gracePeriod=15
Apr 24 14:30:35.888894 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:35.888870 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-545ffd58-s276g_93f5f5a2-e7b3-4599-9890-d7ae21212b78/console/0.log"
Apr 24 14:30:35.889029 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:35.888932 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-545ffd58-s276g"
Apr 24 14:30:36.034737 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:36.034649 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93f5f5a2-e7b3-4599-9890-d7ae21212b78-trusted-ca-bundle\") pod \"93f5f5a2-e7b3-4599-9890-d7ae21212b78\" (UID: \"93f5f5a2-e7b3-4599-9890-d7ae21212b78\") "
Apr 24 14:30:36.034737 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:36.034706 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/93f5f5a2-e7b3-4599-9890-d7ae21212b78-oauth-serving-cert\") pod \"93f5f5a2-e7b3-4599-9890-d7ae21212b78\" (UID: \"93f5f5a2-e7b3-4599-9890-d7ae21212b78\") "
Apr 24 14:30:36.034737 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:36.034740 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qb7vk\" (UniqueName: \"kubernetes.io/projected/93f5f5a2-e7b3-4599-9890-d7ae21212b78-kube-api-access-qb7vk\") pod \"93f5f5a2-e7b3-4599-9890-d7ae21212b78\" (UID: \"93f5f5a2-e7b3-4599-9890-d7ae21212b78\") "
Apr 24 14:30:36.035025 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:36.034761 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/93f5f5a2-e7b3-4599-9890-d7ae21212b78-service-ca\") pod \"93f5f5a2-e7b3-4599-9890-d7ae21212b78\" (UID: \"93f5f5a2-e7b3-4599-9890-d7ae21212b78\") "
Apr 24 14:30:36.035025 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:36.034780 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/93f5f5a2-e7b3-4599-9890-d7ae21212b78-console-config\") pod \"93f5f5a2-e7b3-4599-9890-d7ae21212b78\" (UID: \"93f5f5a2-e7b3-4599-9890-d7ae21212b78\") "
Apr 24 14:30:36.035025 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:36.034802 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/93f5f5a2-e7b3-4599-9890-d7ae21212b78-console-serving-cert\") pod \"93f5f5a2-e7b3-4599-9890-d7ae21212b78\" (UID: \"93f5f5a2-e7b3-4599-9890-d7ae21212b78\") "
Apr 24 14:30:36.035025 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:36.034847 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/93f5f5a2-e7b3-4599-9890-d7ae21212b78-console-oauth-config\") pod \"93f5f5a2-e7b3-4599-9890-d7ae21212b78\" (UID: \"93f5f5a2-e7b3-4599-9890-d7ae21212b78\") "
Apr 24 14:30:36.035248 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:36.035223 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93f5f5a2-e7b3-4599-9890-d7ae21212b78-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "93f5f5a2-e7b3-4599-9890-d7ae21212b78" (UID: "93f5f5a2-e7b3-4599-9890-d7ae21212b78"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 14:30:36.035437 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:36.035293 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93f5f5a2-e7b3-4599-9890-d7ae21212b78-console-config" (OuterVolumeSpecName: "console-config") pod "93f5f5a2-e7b3-4599-9890-d7ae21212b78" (UID: "93f5f5a2-e7b3-4599-9890-d7ae21212b78"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 14:30:36.035437 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:36.035314 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93f5f5a2-e7b3-4599-9890-d7ae21212b78-service-ca" (OuterVolumeSpecName: "service-ca") pod "93f5f5a2-e7b3-4599-9890-d7ae21212b78" (UID: "93f5f5a2-e7b3-4599-9890-d7ae21212b78"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 14:30:36.035578 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:36.035451 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93f5f5a2-e7b3-4599-9890-d7ae21212b78-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "93f5f5a2-e7b3-4599-9890-d7ae21212b78" (UID: "93f5f5a2-e7b3-4599-9890-d7ae21212b78"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 14:30:36.036948 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:36.036927 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93f5f5a2-e7b3-4599-9890-d7ae21212b78-kube-api-access-qb7vk" (OuterVolumeSpecName: "kube-api-access-qb7vk") pod "93f5f5a2-e7b3-4599-9890-d7ae21212b78" (UID: "93f5f5a2-e7b3-4599-9890-d7ae21212b78"). InnerVolumeSpecName "kube-api-access-qb7vk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 14:30:36.036948 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:36.036933 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93f5f5a2-e7b3-4599-9890-d7ae21212b78-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "93f5f5a2-e7b3-4599-9890-d7ae21212b78" (UID: "93f5f5a2-e7b3-4599-9890-d7ae21212b78"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 14:30:36.037125 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:36.037103 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93f5f5a2-e7b3-4599-9890-d7ae21212b78-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "93f5f5a2-e7b3-4599-9890-d7ae21212b78" (UID: "93f5f5a2-e7b3-4599-9890-d7ae21212b78"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 14:30:36.135680 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:36.135642 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93f5f5a2-e7b3-4599-9890-d7ae21212b78-trusted-ca-bundle\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\""
Apr 24 14:30:36.135680 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:36.135674 2571 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/93f5f5a2-e7b3-4599-9890-d7ae21212b78-oauth-serving-cert\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\""
Apr 24 14:30:36.135680 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:36.135683 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qb7vk\" (UniqueName: \"kubernetes.io/projected/93f5f5a2-e7b3-4599-9890-d7ae21212b78-kube-api-access-qb7vk\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\""
Apr 24 14:30:36.135909 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:36.135695 2571 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/93f5f5a2-e7b3-4599-9890-d7ae21212b78-service-ca\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\""
Apr 24 14:30:36.135909 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:36.135704 2571 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/93f5f5a2-e7b3-4599-9890-d7ae21212b78-console-config\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\""
Apr 24 14:30:36.135909 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:36.135713 2571 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/93f5f5a2-e7b3-4599-9890-d7ae21212b78-console-serving-cert\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\""
Apr 24 14:30:36.135909 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:36.135721 2571 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/93f5f5a2-e7b3-4599-9890-d7ae21212b78-console-oauth-config\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\""
Apr 24 14:30:36.634888 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:36.634861 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-545ffd58-s276g_93f5f5a2-e7b3-4599-9890-d7ae21212b78/console/0.log"
Apr 24 14:30:36.635084 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:36.634902 2571 generic.go:358] "Generic (PLEG): container finished" podID="93f5f5a2-e7b3-4599-9890-d7ae21212b78" containerID="712a724037f6956c1e3102ffdabec4b2cc933ab1a6632b738250a651b128bbe9" exitCode=2
Apr 24 14:30:36.635084 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:36.634937 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-545ffd58-s276g" event={"ID":"93f5f5a2-e7b3-4599-9890-d7ae21212b78","Type":"ContainerDied","Data":"712a724037f6956c1e3102ffdabec4b2cc933ab1a6632b738250a651b128bbe9"}
Apr 24 14:30:36.635084 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:36.634979 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-545ffd58-s276g" event={"ID":"93f5f5a2-e7b3-4599-9890-d7ae21212b78","Type":"ContainerDied","Data":"b7c67ef261381647e01fdf6cb5dcc8f26f46ec1ec9bb7e161f92eb52365781a6"}
Apr 24 14:30:36.635084 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:36.634977 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-545ffd58-s276g"
Apr 24 14:30:36.635084 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:36.635012 2571 scope.go:117] "RemoveContainer" containerID="712a724037f6956c1e3102ffdabec4b2cc933ab1a6632b738250a651b128bbe9"
Apr 24 14:30:36.643100 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:36.643082 2571 scope.go:117] "RemoveContainer" containerID="712a724037f6956c1e3102ffdabec4b2cc933ab1a6632b738250a651b128bbe9"
Apr 24 14:30:36.643356 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:30:36.643336 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"712a724037f6956c1e3102ffdabec4b2cc933ab1a6632b738250a651b128bbe9\": container with ID starting with 712a724037f6956c1e3102ffdabec4b2cc933ab1a6632b738250a651b128bbe9 not found: ID does not exist" containerID="712a724037f6956c1e3102ffdabec4b2cc933ab1a6632b738250a651b128bbe9"
Apr 24 14:30:36.643406 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:36.643364 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"712a724037f6956c1e3102ffdabec4b2cc933ab1a6632b738250a651b128bbe9"} err="failed to get container status \"712a724037f6956c1e3102ffdabec4b2cc933ab1a6632b738250a651b128bbe9\": rpc error: code = NotFound desc = could not find container \"712a724037f6956c1e3102ffdabec4b2cc933ab1a6632b738250a651b128bbe9\": container with ID starting with 712a724037f6956c1e3102ffdabec4b2cc933ab1a6632b738250a651b128bbe9 not found: ID does not exist"
Apr 24 14:30:36.653332 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:36.653305 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-545ffd58-s276g"]
Apr 24 14:30:36.657827 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:36.657804 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-545ffd58-s276g"]
Apr 24 14:30:38.460925 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:30:38.460891 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93f5f5a2-e7b3-4599-9890-d7ae21212b78" path="/var/lib/kubelet/pods/93f5f5a2-e7b3-4599-9890-d7ae21212b78/volumes"
Apr 24 14:31:39.653161 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:39.653082 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxx62"]
Apr 24 14:31:39.653641 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:39.653516 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="93f5f5a2-e7b3-4599-9890-d7ae21212b78" containerName="console"
Apr 24 14:31:39.653641 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:39.653534 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="93f5f5a2-e7b3-4599-9890-d7ae21212b78" containerName="console"
Apr 24 14:31:39.653641 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:39.653641 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="93f5f5a2-e7b3-4599-9890-d7ae21212b78" containerName="console"
Apr 24 14:31:39.656764 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:39.656748 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxx62"
Apr 24 14:31:39.659297 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:39.659273 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 24 14:31:39.659465 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:39.659422 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-kpbrk\""
Apr 24 14:31:39.659579 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:39.659476 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 24 14:31:39.659579 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:39.659520 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 24 14:31:39.670952 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:39.670930 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxx62"]
Apr 24 14:31:39.779497 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:39.779463 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/a0227615-93dc-42ec-81d0-eca643fd9b96-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-bxx62\" (UID: \"a0227615-93dc-42ec-81d0-eca643fd9b96\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxx62"
Apr 24 14:31:39.779667 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:39.779527 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ghm5\" (UniqueName: \"kubernetes.io/projected/a0227615-93dc-42ec-81d0-eca643fd9b96-kube-api-access-4ghm5\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-bxx62\" (UID: \"a0227615-93dc-42ec-81d0-eca643fd9b96\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxx62"
Apr 24 14:31:39.879997 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:39.879964 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/a0227615-93dc-42ec-81d0-eca643fd9b96-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-bxx62\" (UID: \"a0227615-93dc-42ec-81d0-eca643fd9b96\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxx62"
Apr 24 14:31:39.880171 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:39.880074 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4ghm5\" (UniqueName: \"kubernetes.io/projected/a0227615-93dc-42ec-81d0-eca643fd9b96-kube-api-access-4ghm5\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-bxx62\" (UID: \"a0227615-93dc-42ec-81d0-eca643fd9b96\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxx62"
Apr 24 14:31:39.882242 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:39.882220 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/a0227615-93dc-42ec-81d0-eca643fd9b96-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-bxx62\" (UID: \"a0227615-93dc-42ec-81d0-eca643fd9b96\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxx62"
Apr 24 14:31:39.889364 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:39.889336 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ghm5\" (UniqueName: \"kubernetes.io/projected/a0227615-93dc-42ec-81d0-eca643fd9b96-kube-api-access-4ghm5\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-bxx62\" (UID: \"a0227615-93dc-42ec-81d0-eca643fd9b96\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxx62"
Apr 24 14:31:39.967703 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:39.967632 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxx62"
Apr 24 14:31:40.095364 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:40.095230 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxx62"]
Apr 24 14:31:40.097779 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:31:40.097754 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0227615_93dc_42ec_81d0_eca643fd9b96.slice/crio-ee9ad4cdf13db38df668de7b2758b048e2b03608bd972fa3d6195d99522a610f WatchSource:0}: Error finding container ee9ad4cdf13db38df668de7b2758b048e2b03608bd972fa3d6195d99522a610f: Status 404 returned error can't find the container with id ee9ad4cdf13db38df668de7b2758b048e2b03608bd972fa3d6195d99522a610f
Apr 24 14:31:40.824271 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:40.824215 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxx62" event={"ID":"a0227615-93dc-42ec-81d0-eca643fd9b96","Type":"ContainerStarted","Data":"ee9ad4cdf13db38df668de7b2758b048e2b03608bd972fa3d6195d99522a610f"}
Apr 24 14:31:43.842427 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:43.842379 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxx62" event={"ID":"a0227615-93dc-42ec-81d0-eca643fd9b96","Type":"ContainerStarted","Data":"26757f4b490c4fea63cf840ecd77450334e68e86b9d60442aab5d06fd24feb7e"}
Apr 24 14:31:43.842922 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:43.842457 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxx62"
Apr 24 14:31:43.864939 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:43.864886 2571
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxx62" podStartSLOduration=1.2978033070000001 podStartE2EDuration="4.86487121s" podCreationTimestamp="2026-04-24 14:31:39 +0000 UTC" firstStartedPulling="2026-04-24 14:31:40.099574691 +0000 UTC m=+436.229552232" lastFinishedPulling="2026-04-24 14:31:43.666642596 +0000 UTC m=+439.796620135" observedRunningTime="2026-04-24 14:31:43.862575299 +0000 UTC m=+439.992552859" watchObservedRunningTime="2026-04-24 14:31:43.86487121 +0000 UTC m=+439.994848770" Apr 24 14:31:44.233932 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:44.233898 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-j2x2n"] Apr 24 14:31:44.237423 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:44.237407 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-j2x2n" Apr 24 14:31:44.240603 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:44.240583 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 24 14:31:44.240875 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:44.240845 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 24 14:31:44.240974 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:44.240858 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-c4j4f\"" Apr 24 14:31:44.250233 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:44.250212 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-j2x2n"] Apr 24 14:31:44.419096 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:44.419058 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/projected/df92e785-e796-4f24-9249-80f89847be54-certificates\") pod \"keda-operator-ffbb595cb-j2x2n\" (UID: \"df92e785-e796-4f24-9249-80f89847be54\") " pod="openshift-keda/keda-operator-ffbb595cb-j2x2n" Apr 24 14:31:44.419264 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:44.419115 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/df92e785-e796-4f24-9249-80f89847be54-cabundle0\") pod \"keda-operator-ffbb595cb-j2x2n\" (UID: \"df92e785-e796-4f24-9249-80f89847be54\") " pod="openshift-keda/keda-operator-ffbb595cb-j2x2n" Apr 24 14:31:44.419264 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:44.419134 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d67vx\" (UniqueName: \"kubernetes.io/projected/df92e785-e796-4f24-9249-80f89847be54-kube-api-access-d67vx\") pod \"keda-operator-ffbb595cb-j2x2n\" (UID: \"df92e785-e796-4f24-9249-80f89847be54\") " pod="openshift-keda/keda-operator-ffbb595cb-j2x2n" Apr 24 14:31:44.519957 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:44.519879 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/df92e785-e796-4f24-9249-80f89847be54-certificates\") pod \"keda-operator-ffbb595cb-j2x2n\" (UID: \"df92e785-e796-4f24-9249-80f89847be54\") " pod="openshift-keda/keda-operator-ffbb595cb-j2x2n" Apr 24 14:31:44.519957 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:44.519941 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/df92e785-e796-4f24-9249-80f89847be54-cabundle0\") pod \"keda-operator-ffbb595cb-j2x2n\" (UID: \"df92e785-e796-4f24-9249-80f89847be54\") " pod="openshift-keda/keda-operator-ffbb595cb-j2x2n" Apr 24 14:31:44.520172 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:44.519961 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d67vx\" (UniqueName: \"kubernetes.io/projected/df92e785-e796-4f24-9249-80f89847be54-kube-api-access-d67vx\") pod \"keda-operator-ffbb595cb-j2x2n\" (UID: \"df92e785-e796-4f24-9249-80f89847be54\") " pod="openshift-keda/keda-operator-ffbb595cb-j2x2n" Apr 24 14:31:44.520172 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:31:44.520072 2571 secret.go:281] references non-existent secret key: ca.crt Apr 24 14:31:44.520172 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:31:44.520096 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 14:31:44.520172 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:31:44.520108 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-j2x2n: references non-existent secret key: ca.crt Apr 24 14:31:44.520172 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:31:44.520169 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/df92e785-e796-4f24-9249-80f89847be54-certificates podName:df92e785-e796-4f24-9249-80f89847be54 nodeName:}" failed. No retries permitted until 2026-04-24 14:31:45.020149578 +0000 UTC m=+441.150127118 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/df92e785-e796-4f24-9249-80f89847be54-certificates") pod "keda-operator-ffbb595cb-j2x2n" (UID: "df92e785-e796-4f24-9249-80f89847be54") : references non-existent secret key: ca.crt Apr 24 14:31:44.520567 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:44.520551 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/df92e785-e796-4f24-9249-80f89847be54-cabundle0\") pod \"keda-operator-ffbb595cb-j2x2n\" (UID: \"df92e785-e796-4f24-9249-80f89847be54\") " pod="openshift-keda/keda-operator-ffbb595cb-j2x2n" Apr 24 14:31:44.533005 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:44.532951 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d67vx\" (UniqueName: \"kubernetes.io/projected/df92e785-e796-4f24-9249-80f89847be54-kube-api-access-d67vx\") pod \"keda-operator-ffbb595cb-j2x2n\" (UID: \"df92e785-e796-4f24-9249-80f89847be54\") " pod="openshift-keda/keda-operator-ffbb595cb-j2x2n" Apr 24 14:31:45.024837 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:45.024801 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/df92e785-e796-4f24-9249-80f89847be54-certificates\") pod \"keda-operator-ffbb595cb-j2x2n\" (UID: \"df92e785-e796-4f24-9249-80f89847be54\") " pod="openshift-keda/keda-operator-ffbb595cb-j2x2n" Apr 24 14:31:45.025241 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:31:45.024978 2571 secret.go:281] references non-existent secret key: ca.crt Apr 24 14:31:45.025241 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:31:45.025024 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 14:31:45.025241 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:31:45.025038 2571 projected.go:194] Error preparing data for projected volume 
certificates for pod openshift-keda/keda-operator-ffbb595cb-j2x2n: references non-existent secret key: ca.crt Apr 24 14:31:45.025241 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:31:45.025114 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/df92e785-e796-4f24-9249-80f89847be54-certificates podName:df92e785-e796-4f24-9249-80f89847be54 nodeName:}" failed. No retries permitted until 2026-04-24 14:31:46.025091122 +0000 UTC m=+442.155068666 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/df92e785-e796-4f24-9249-80f89847be54-certificates") pod "keda-operator-ffbb595cb-j2x2n" (UID: "df92e785-e796-4f24-9249-80f89847be54") : references non-existent secret key: ca.crt Apr 24 14:31:46.034341 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:46.034274 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/df92e785-e796-4f24-9249-80f89847be54-certificates\") pod \"keda-operator-ffbb595cb-j2x2n\" (UID: \"df92e785-e796-4f24-9249-80f89847be54\") " pod="openshift-keda/keda-operator-ffbb595cb-j2x2n" Apr 24 14:31:46.036727 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:46.036701 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/df92e785-e796-4f24-9249-80f89847be54-certificates\") pod \"keda-operator-ffbb595cb-j2x2n\" (UID: \"df92e785-e796-4f24-9249-80f89847be54\") " pod="openshift-keda/keda-operator-ffbb595cb-j2x2n" Apr 24 14:31:46.047700 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:46.047671 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-j2x2n" Apr 24 14:31:46.167125 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:46.167096 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-j2x2n"] Apr 24 14:31:46.169892 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:31:46.169864 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf92e785_e796_4f24_9249_80f89847be54.slice/crio-38cfda8e2d2fa230e401949fb15a7eea922af978b9cbb33eb7b44453f984fcd8 WatchSource:0}: Error finding container 38cfda8e2d2fa230e401949fb15a7eea922af978b9cbb33eb7b44453f984fcd8: Status 404 returned error can't find the container with id 38cfda8e2d2fa230e401949fb15a7eea922af978b9cbb33eb7b44453f984fcd8 Apr 24 14:31:46.854827 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:46.854787 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-j2x2n" event={"ID":"df92e785-e796-4f24-9249-80f89847be54","Type":"ContainerStarted","Data":"38cfda8e2d2fa230e401949fb15a7eea922af978b9cbb33eb7b44453f984fcd8"} Apr 24 14:31:49.867175 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:49.867135 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-j2x2n" event={"ID":"df92e785-e796-4f24-9249-80f89847be54","Type":"ContainerStarted","Data":"2cb175eb64e26d063cd0ed03e5a6c0c15d0eb048ede0ffff787615a1dc1d1aab"} Apr 24 14:31:49.867638 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:49.867287 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-j2x2n" Apr 24 14:31:49.884571 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:31:49.884518 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-j2x2n" podStartSLOduration=2.737864749 podStartE2EDuration="5.884503414s" 
podCreationTimestamp="2026-04-24 14:31:44 +0000 UTC" firstStartedPulling="2026-04-24 14:31:46.171738957 +0000 UTC m=+442.301716501" lastFinishedPulling="2026-04-24 14:31:49.31837762 +0000 UTC m=+445.448355166" observedRunningTime="2026-04-24 14:31:49.883370089 +0000 UTC m=+446.013347650" watchObservedRunningTime="2026-04-24 14:31:49.884503414 +0000 UTC m=+446.014480973" Apr 24 14:32:04.848319 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:32:04.848287 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxx62" Apr 24 14:32:10.872174 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:32:10.872135 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-j2x2n" Apr 24 14:32:51.273155 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:32:51.273120 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-49r8b"] Apr 24 14:32:51.277148 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:32:51.277128 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-49r8b" Apr 24 14:32:51.280069 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:32:51.280039 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 14:32:51.280189 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:32:51.280075 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 14:32:51.280189 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:32:51.280143 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-bnfqk\"" Apr 24 14:32:51.280433 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:32:51.280418 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 24 14:32:51.284927 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:32:51.284530 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-49r8b"] Apr 24 14:32:51.368068 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:32:51.368035 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9541f4ff-d53d-4def-bbc2-e736c381a1dd-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-49r8b\" (UID: \"9541f4ff-d53d-4def-bbc2-e736c381a1dd\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-49r8b" Apr 24 14:32:51.368236 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:32:51.368077 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh7gp\" (UniqueName: \"kubernetes.io/projected/9541f4ff-d53d-4def-bbc2-e736c381a1dd-kube-api-access-xh7gp\") pod \"llmisvc-controller-manager-68cc5db7c4-49r8b\" (UID: \"9541f4ff-d53d-4def-bbc2-e736c381a1dd\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-49r8b" Apr 24 
14:32:51.469013 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:32:51.468964 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xh7gp\" (UniqueName: \"kubernetes.io/projected/9541f4ff-d53d-4def-bbc2-e736c381a1dd-kube-api-access-xh7gp\") pod \"llmisvc-controller-manager-68cc5db7c4-49r8b\" (UID: \"9541f4ff-d53d-4def-bbc2-e736c381a1dd\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-49r8b" Apr 24 14:32:51.469177 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:32:51.469116 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9541f4ff-d53d-4def-bbc2-e736c381a1dd-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-49r8b\" (UID: \"9541f4ff-d53d-4def-bbc2-e736c381a1dd\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-49r8b" Apr 24 14:32:51.471495 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:32:51.471477 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9541f4ff-d53d-4def-bbc2-e736c381a1dd-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-49r8b\" (UID: \"9541f4ff-d53d-4def-bbc2-e736c381a1dd\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-49r8b" Apr 24 14:32:51.478569 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:32:51.478542 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh7gp\" (UniqueName: \"kubernetes.io/projected/9541f4ff-d53d-4def-bbc2-e736c381a1dd-kube-api-access-xh7gp\") pod \"llmisvc-controller-manager-68cc5db7c4-49r8b\" (UID: \"9541f4ff-d53d-4def-bbc2-e736c381a1dd\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-49r8b" Apr 24 14:32:51.588554 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:32:51.588470 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-49r8b" Apr 24 14:32:51.710349 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:32:51.710323 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-49r8b"] Apr 24 14:32:51.712736 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:32:51.712693 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9541f4ff_d53d_4def_bbc2_e736c381a1dd.slice/crio-3b944cbb8071da761437c44bb9c2d6225be0231e66a0d9fb656f527571bbd391 WatchSource:0}: Error finding container 3b944cbb8071da761437c44bb9c2d6225be0231e66a0d9fb656f527571bbd391: Status 404 returned error can't find the container with id 3b944cbb8071da761437c44bb9c2d6225be0231e66a0d9fb656f527571bbd391 Apr 24 14:32:52.074646 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:32:52.074609 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-49r8b" event={"ID":"9541f4ff-d53d-4def-bbc2-e736c381a1dd","Type":"ContainerStarted","Data":"3b944cbb8071da761437c44bb9c2d6225be0231e66a0d9fb656f527571bbd391"} Apr 24 14:32:54.087879 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:32:54.087839 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-49r8b" event={"ID":"9541f4ff-d53d-4def-bbc2-e736c381a1dd","Type":"ContainerStarted","Data":"791802f1f7567e6087b59c777afed9b895bde96044592a1c062709f5af9ddf69"} Apr 24 14:32:54.088299 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:32:54.087967 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-49r8b" Apr 24 14:32:54.109177 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:32:54.109132 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-49r8b" podStartSLOduration=1.318407632 podStartE2EDuration="3.109118608s" 
podCreationTimestamp="2026-04-24 14:32:51 +0000 UTC" firstStartedPulling="2026-04-24 14:32:51.713935616 +0000 UTC m=+507.843913155" lastFinishedPulling="2026-04-24 14:32:53.504646593 +0000 UTC m=+509.634624131" observedRunningTime="2026-04-24 14:32:54.107402377 +0000 UTC m=+510.237379938" watchObservedRunningTime="2026-04-24 14:32:54.109118608 +0000 UTC m=+510.239096167" Apr 24 14:33:25.092667 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:33:25.092634 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-49r8b" Apr 24 14:33:59.700578 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:33:59.700540 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-2p7sn"] Apr 24 14:33:59.703962 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:33:59.703943 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-2p7sn" Apr 24 14:33:59.706273 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:33:59.706253 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-snf7p\"" Apr 24 14:33:59.706421 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:33:59.706259 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 24 14:33:59.715112 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:33:59.715082 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-2p7sn"] Apr 24 14:33:59.718107 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:33:59.718079 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf9k2\" (UniqueName: \"kubernetes.io/projected/4686071d-b58f-45b7-88ab-296a3c044797-kube-api-access-gf9k2\") pod \"model-serving-api-86f7b4b499-2p7sn\" (UID: \"4686071d-b58f-45b7-88ab-296a3c044797\") " 
pod="kserve/model-serving-api-86f7b4b499-2p7sn" Apr 24 14:33:59.718280 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:33:59.718262 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4686071d-b58f-45b7-88ab-296a3c044797-tls-certs\") pod \"model-serving-api-86f7b4b499-2p7sn\" (UID: \"4686071d-b58f-45b7-88ab-296a3c044797\") " pod="kserve/model-serving-api-86f7b4b499-2p7sn" Apr 24 14:33:59.719478 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:33:59.719459 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-bd8lf"] Apr 24 14:33:59.722621 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:33:59.722607 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-bd8lf" Apr 24 14:33:59.724825 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:33:59.724799 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 24 14:33:59.724920 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:33:59.724899 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-kgqst\"" Apr 24 14:33:59.735079 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:33:59.735058 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-bd8lf"] Apr 24 14:33:59.818920 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:33:59.818882 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4686071d-b58f-45b7-88ab-296a3c044797-tls-certs\") pod \"model-serving-api-86f7b4b499-2p7sn\" (UID: \"4686071d-b58f-45b7-88ab-296a3c044797\") " pod="kserve/model-serving-api-86f7b4b499-2p7sn" Apr 24 14:33:59.818920 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:33:59.818927 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gf9k2\" (UniqueName: \"kubernetes.io/projected/4686071d-b58f-45b7-88ab-296a3c044797-kube-api-access-gf9k2\") pod \"model-serving-api-86f7b4b499-2p7sn\" (UID: \"4686071d-b58f-45b7-88ab-296a3c044797\") " pod="kserve/model-serving-api-86f7b4b499-2p7sn" Apr 24 14:33:59.819158 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:33:59.819037 2571 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 24 14:33:59.819158 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:33:59.819046 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ac069a16-3258-4ed3-b927-268b5187b5be-cert\") pod \"odh-model-controller-696fc77849-bd8lf\" (UID: \"ac069a16-3258-4ed3-b927-268b5187b5be\") " pod="kserve/odh-model-controller-696fc77849-bd8lf" Apr 24 14:33:59.819158 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:33:59.819096 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4686071d-b58f-45b7-88ab-296a3c044797-tls-certs podName:4686071d-b58f-45b7-88ab-296a3c044797 nodeName:}" failed. No retries permitted until 2026-04-24 14:34:00.319078249 +0000 UTC m=+576.449055789 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/4686071d-b58f-45b7-88ab-296a3c044797-tls-certs") pod "model-serving-api-86f7b4b499-2p7sn" (UID: "4686071d-b58f-45b7-88ab-296a3c044797") : secret "model-serving-api-tls" not found Apr 24 14:33:59.819158 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:33:59.819139 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bntmj\" (UniqueName: \"kubernetes.io/projected/ac069a16-3258-4ed3-b927-268b5187b5be-kube-api-access-bntmj\") pod \"odh-model-controller-696fc77849-bd8lf\" (UID: \"ac069a16-3258-4ed3-b927-268b5187b5be\") " pod="kserve/odh-model-controller-696fc77849-bd8lf" Apr 24 14:33:59.828453 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:33:59.828432 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf9k2\" (UniqueName: \"kubernetes.io/projected/4686071d-b58f-45b7-88ab-296a3c044797-kube-api-access-gf9k2\") pod \"model-serving-api-86f7b4b499-2p7sn\" (UID: \"4686071d-b58f-45b7-88ab-296a3c044797\") " pod="kserve/model-serving-api-86f7b4b499-2p7sn" Apr 24 14:33:59.909144 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:33:59.909108 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5f6b8c7668-fgp8q"] Apr 24 14:33:59.919749 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:33:59.919720 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bntmj\" (UniqueName: \"kubernetes.io/projected/ac069a16-3258-4ed3-b927-268b5187b5be-kube-api-access-bntmj\") pod \"odh-model-controller-696fc77849-bd8lf\" (UID: \"ac069a16-3258-4ed3-b927-268b5187b5be\") " pod="kserve/odh-model-controller-696fc77849-bd8lf" Apr 24 14:33:59.919925 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:33:59.919856 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/ac069a16-3258-4ed3-b927-268b5187b5be-cert\") pod \"odh-model-controller-696fc77849-bd8lf\" (UID: \"ac069a16-3258-4ed3-b927-268b5187b5be\") " pod="kserve/odh-model-controller-696fc77849-bd8lf" Apr 24 14:33:59.920040 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:33:59.920020 2571 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 24 14:33:59.920112 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:33:59.920101 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac069a16-3258-4ed3-b927-268b5187b5be-cert podName:ac069a16-3258-4ed3-b927-268b5187b5be nodeName:}" failed. No retries permitted until 2026-04-24 14:34:00.420080432 +0000 UTC m=+576.550057986 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ac069a16-3258-4ed3-b927-268b5187b5be-cert") pod "odh-model-controller-696fc77849-bd8lf" (UID: "ac069a16-3258-4ed3-b927-268b5187b5be") : secret "odh-model-controller-webhook-cert" not found Apr 24 14:33:59.932711 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:33:59.932683 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bntmj\" (UniqueName: \"kubernetes.io/projected/ac069a16-3258-4ed3-b927-268b5187b5be-kube-api-access-bntmj\") pod \"odh-model-controller-696fc77849-bd8lf\" (UID: \"ac069a16-3258-4ed3-b927-268b5187b5be\") " pod="kserve/odh-model-controller-696fc77849-bd8lf" Apr 24 14:34:00.323534 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:00.323505 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4686071d-b58f-45b7-88ab-296a3c044797-tls-certs\") pod \"model-serving-api-86f7b4b499-2p7sn\" (UID: \"4686071d-b58f-45b7-88ab-296a3c044797\") " pod="kserve/model-serving-api-86f7b4b499-2p7sn" Apr 24 14:34:00.325922 ip-10-0-137-95 kubenswrapper[2571]: I0424 
14:34:00.325899 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4686071d-b58f-45b7-88ab-296a3c044797-tls-certs\") pod \"model-serving-api-86f7b4b499-2p7sn\" (UID: \"4686071d-b58f-45b7-88ab-296a3c044797\") " pod="kserve/model-serving-api-86f7b4b499-2p7sn" Apr 24 14:34:00.424102 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:00.424068 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ac069a16-3258-4ed3-b927-268b5187b5be-cert\") pod \"odh-model-controller-696fc77849-bd8lf\" (UID: \"ac069a16-3258-4ed3-b927-268b5187b5be\") " pod="kserve/odh-model-controller-696fc77849-bd8lf" Apr 24 14:34:00.426498 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:00.426479 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ac069a16-3258-4ed3-b927-268b5187b5be-cert\") pod \"odh-model-controller-696fc77849-bd8lf\" (UID: \"ac069a16-3258-4ed3-b927-268b5187b5be\") " pod="kserve/odh-model-controller-696fc77849-bd8lf" Apr 24 14:34:00.616036 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:00.616015 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-2p7sn" Apr 24 14:34:00.635803 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:00.635777 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-bd8lf" Apr 24 14:34:00.747615 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:00.747584 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-2p7sn"] Apr 24 14:34:00.750263 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:34:00.750237 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4686071d_b58f_45b7_88ab_296a3c044797.slice/crio-c93e511300a164bd57ea2be955c5eed9f8aa273040f615f0a7c31041f3cdb7ba WatchSource:0}: Error finding container c93e511300a164bd57ea2be955c5eed9f8aa273040f615f0a7c31041f3cdb7ba: Status 404 returned error can't find the container with id c93e511300a164bd57ea2be955c5eed9f8aa273040f615f0a7c31041f3cdb7ba Apr 24 14:34:00.769117 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:00.769097 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-bd8lf"] Apr 24 14:34:00.770162 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:34:00.770130 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac069a16_3258_4ed3_b927_268b5187b5be.slice/crio-826bb2ba4210900614bb131a8710c4ee175d1595efd1536bd8994153c113782c WatchSource:0}: Error finding container 826bb2ba4210900614bb131a8710c4ee175d1595efd1536bd8994153c113782c: Status 404 returned error can't find the container with id 826bb2ba4210900614bb131a8710c4ee175d1595efd1536bd8994153c113782c Apr 24 14:34:01.300521 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:01.300483 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-bd8lf" event={"ID":"ac069a16-3258-4ed3-b927-268b5187b5be","Type":"ContainerStarted","Data":"826bb2ba4210900614bb131a8710c4ee175d1595efd1536bd8994153c113782c"} Apr 24 14:34:01.301907 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:01.301874 2571 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-2p7sn" event={"ID":"4686071d-b58f-45b7-88ab-296a3c044797","Type":"ContainerStarted","Data":"c93e511300a164bd57ea2be955c5eed9f8aa273040f615f0a7c31041f3cdb7ba"} Apr 24 14:34:05.322739 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:05.322696 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-bd8lf" event={"ID":"ac069a16-3258-4ed3-b927-268b5187b5be","Type":"ContainerStarted","Data":"86d57361551aee17182f7c4202245f77ea93ffc741d895dbd2f6a89fa4abe64a"} Apr 24 14:34:05.323289 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:05.322766 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-bd8lf" Apr 24 14:34:05.324096 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:05.324073 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-2p7sn" event={"ID":"4686071d-b58f-45b7-88ab-296a3c044797","Type":"ContainerStarted","Data":"618bac06d06daa912041832677ca44036e5a1b05fba30de1c0e7c992572fa158"} Apr 24 14:34:05.324201 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:05.324184 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-2p7sn" Apr 24 14:34:05.341368 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:05.341317 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-bd8lf" podStartSLOduration=2.593892119 podStartE2EDuration="6.34130112s" podCreationTimestamp="2026-04-24 14:33:59 +0000 UTC" firstStartedPulling="2026-04-24 14:34:00.771371324 +0000 UTC m=+576.901348866" lastFinishedPulling="2026-04-24 14:34:04.518780321 +0000 UTC m=+580.648757867" observedRunningTime="2026-04-24 14:34:05.340514984 +0000 UTC m=+581.470492545" watchObservedRunningTime="2026-04-24 14:34:05.34130112 +0000 UTC m=+581.471278682" Apr 24 
14:34:05.359373 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:05.359328 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-2p7sn" podStartSLOduration=2.5973558260000003 podStartE2EDuration="6.359314421s" podCreationTimestamp="2026-04-24 14:33:59 +0000 UTC" firstStartedPulling="2026-04-24 14:34:00.752057613 +0000 UTC m=+576.882035152" lastFinishedPulling="2026-04-24 14:34:04.514016205 +0000 UTC m=+580.643993747" observedRunningTime="2026-04-24 14:34:05.358024319 +0000 UTC m=+581.488001880" watchObservedRunningTime="2026-04-24 14:34:05.359314421 +0000 UTC m=+581.489291981" Apr 24 14:34:16.330117 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:16.330088 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-bd8lf" Apr 24 14:34:16.332169 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:16.332151 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-2p7sn" Apr 24 14:34:24.372796 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:24.372766 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mk446_83b693cc-250b-45e7-b205-baf7f0feff6b/console-operator/2.log" Apr 24 14:34:24.375227 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:24.375202 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mk446_83b693cc-250b-45e7-b205-baf7f0feff6b/console-operator/2.log" Apr 24 14:34:24.377332 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:24.377311 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7m9zt_a7b9926d-4f53-4532-8669-16af4fc30cfd/ovn-acl-logging/0.log" Apr 24 14:34:24.379463 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:24.379442 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7m9zt_a7b9926d-4f53-4532-8669-16af4fc30cfd/ovn-acl-logging/0.log" Apr 24 14:34:24.930214 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:24.930176 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5f6b8c7668-fgp8q" podUID="026e252b-ed46-4489-b5bb-876cda7f5870" containerName="console" containerID="cri-o://5cb9c7258f4e8be0c7f04ac28d726905e593ecaf8042a338ced9a215a04a86c8" gracePeriod=15 Apr 24 14:34:25.187016 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:25.186947 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f6b8c7668-fgp8q_026e252b-ed46-4489-b5bb-876cda7f5870/console/0.log" Apr 24 14:34:25.187109 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:25.187030 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f6b8c7668-fgp8q" Apr 24 14:34:25.334048 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:25.334017 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/026e252b-ed46-4489-b5bb-876cda7f5870-console-config\") pod \"026e252b-ed46-4489-b5bb-876cda7f5870\" (UID: \"026e252b-ed46-4489-b5bb-876cda7f5870\") " Apr 24 14:34:25.334239 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:25.334058 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/026e252b-ed46-4489-b5bb-876cda7f5870-oauth-serving-cert\") pod \"026e252b-ed46-4489-b5bb-876cda7f5870\" (UID: \"026e252b-ed46-4489-b5bb-876cda7f5870\") " Apr 24 14:34:25.334239 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:25.334098 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/026e252b-ed46-4489-b5bb-876cda7f5870-trusted-ca-bundle\") pod 
\"026e252b-ed46-4489-b5bb-876cda7f5870\" (UID: \"026e252b-ed46-4489-b5bb-876cda7f5870\") " Apr 24 14:34:25.334239 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:25.334125 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfvsl\" (UniqueName: \"kubernetes.io/projected/026e252b-ed46-4489-b5bb-876cda7f5870-kube-api-access-nfvsl\") pod \"026e252b-ed46-4489-b5bb-876cda7f5870\" (UID: \"026e252b-ed46-4489-b5bb-876cda7f5870\") " Apr 24 14:34:25.334239 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:25.334156 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/026e252b-ed46-4489-b5bb-876cda7f5870-console-oauth-config\") pod \"026e252b-ed46-4489-b5bb-876cda7f5870\" (UID: \"026e252b-ed46-4489-b5bb-876cda7f5870\") " Apr 24 14:34:25.334239 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:25.334197 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/026e252b-ed46-4489-b5bb-876cda7f5870-console-serving-cert\") pod \"026e252b-ed46-4489-b5bb-876cda7f5870\" (UID: \"026e252b-ed46-4489-b5bb-876cda7f5870\") " Apr 24 14:34:25.334239 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:25.334224 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/026e252b-ed46-4489-b5bb-876cda7f5870-service-ca\") pod \"026e252b-ed46-4489-b5bb-876cda7f5870\" (UID: \"026e252b-ed46-4489-b5bb-876cda7f5870\") " Apr 24 14:34:25.334562 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:25.334437 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/026e252b-ed46-4489-b5bb-876cda7f5870-console-config" (OuterVolumeSpecName: "console-config") pod "026e252b-ed46-4489-b5bb-876cda7f5870" (UID: "026e252b-ed46-4489-b5bb-876cda7f5870"). 
InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:34:25.334562 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:25.334504 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/026e252b-ed46-4489-b5bb-876cda7f5870-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "026e252b-ed46-4489-b5bb-876cda7f5870" (UID: "026e252b-ed46-4489-b5bb-876cda7f5870"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:34:25.334695 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:25.334551 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/026e252b-ed46-4489-b5bb-876cda7f5870-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "026e252b-ed46-4489-b5bb-876cda7f5870" (UID: "026e252b-ed46-4489-b5bb-876cda7f5870"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:34:25.334799 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:25.334756 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/026e252b-ed46-4489-b5bb-876cda7f5870-service-ca" (OuterVolumeSpecName: "service-ca") pod "026e252b-ed46-4489-b5bb-876cda7f5870" (UID: "026e252b-ed46-4489-b5bb-876cda7f5870"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:34:25.336385 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:25.336360 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/026e252b-ed46-4489-b5bb-876cda7f5870-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "026e252b-ed46-4489-b5bb-876cda7f5870" (UID: "026e252b-ed46-4489-b5bb-876cda7f5870"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:34:25.336759 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:25.336741 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/026e252b-ed46-4489-b5bb-876cda7f5870-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "026e252b-ed46-4489-b5bb-876cda7f5870" (UID: "026e252b-ed46-4489-b5bb-876cda7f5870"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:34:25.336820 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:25.336753 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/026e252b-ed46-4489-b5bb-876cda7f5870-kube-api-access-nfvsl" (OuterVolumeSpecName: "kube-api-access-nfvsl") pod "026e252b-ed46-4489-b5bb-876cda7f5870" (UID: "026e252b-ed46-4489-b5bb-876cda7f5870"). InnerVolumeSpecName "kube-api-access-nfvsl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:34:25.389408 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:25.389385 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f6b8c7668-fgp8q_026e252b-ed46-4489-b5bb-876cda7f5870/console/0.log" Apr 24 14:34:25.389747 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:25.389424 2571 generic.go:358] "Generic (PLEG): container finished" podID="026e252b-ed46-4489-b5bb-876cda7f5870" containerID="5cb9c7258f4e8be0c7f04ac28d726905e593ecaf8042a338ced9a215a04a86c8" exitCode=2 Apr 24 14:34:25.389747 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:25.389455 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f6b8c7668-fgp8q" event={"ID":"026e252b-ed46-4489-b5bb-876cda7f5870","Type":"ContainerDied","Data":"5cb9c7258f4e8be0c7f04ac28d726905e593ecaf8042a338ced9a215a04a86c8"} Apr 24 14:34:25.389747 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:25.389494 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-5f6b8c7668-fgp8q" event={"ID":"026e252b-ed46-4489-b5bb-876cda7f5870","Type":"ContainerDied","Data":"dd1cd580242e3c4d8a2cb69f772ea558b5309243a901baed54de700ec2b915d4"} Apr 24 14:34:25.389747 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:25.389494 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f6b8c7668-fgp8q" Apr 24 14:34:25.389747 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:25.389562 2571 scope.go:117] "RemoveContainer" containerID="5cb9c7258f4e8be0c7f04ac28d726905e593ecaf8042a338ced9a215a04a86c8" Apr 24 14:34:25.397553 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:25.397532 2571 scope.go:117] "RemoveContainer" containerID="5cb9c7258f4e8be0c7f04ac28d726905e593ecaf8042a338ced9a215a04a86c8" Apr 24 14:34:25.397797 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:34:25.397779 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cb9c7258f4e8be0c7f04ac28d726905e593ecaf8042a338ced9a215a04a86c8\": container with ID starting with 5cb9c7258f4e8be0c7f04ac28d726905e593ecaf8042a338ced9a215a04a86c8 not found: ID does not exist" containerID="5cb9c7258f4e8be0c7f04ac28d726905e593ecaf8042a338ced9a215a04a86c8" Apr 24 14:34:25.397848 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:25.397804 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cb9c7258f4e8be0c7f04ac28d726905e593ecaf8042a338ced9a215a04a86c8"} err="failed to get container status \"5cb9c7258f4e8be0c7f04ac28d726905e593ecaf8042a338ced9a215a04a86c8\": rpc error: code = NotFound desc = could not find container \"5cb9c7258f4e8be0c7f04ac28d726905e593ecaf8042a338ced9a215a04a86c8\": container with ID starting with 5cb9c7258f4e8be0c7f04ac28d726905e593ecaf8042a338ced9a215a04a86c8 not found: ID does not exist" Apr 24 14:34:25.411484 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:25.411461 2571 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5f6b8c7668-fgp8q"] Apr 24 14:34:25.416624 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:25.416599 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5f6b8c7668-fgp8q"] Apr 24 14:34:25.435372 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:25.435349 2571 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/026e252b-ed46-4489-b5bb-876cda7f5870-console-serving-cert\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\"" Apr 24 14:34:25.435372 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:25.435371 2571 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/026e252b-ed46-4489-b5bb-876cda7f5870-service-ca\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\"" Apr 24 14:34:25.435504 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:25.435382 2571 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/026e252b-ed46-4489-b5bb-876cda7f5870-console-config\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\"" Apr 24 14:34:25.435504 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:25.435390 2571 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/026e252b-ed46-4489-b5bb-876cda7f5870-oauth-serving-cert\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\"" Apr 24 14:34:25.435504 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:25.435404 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/026e252b-ed46-4489-b5bb-876cda7f5870-trusted-ca-bundle\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\"" Apr 24 14:34:25.435504 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:25.435414 2571 reconciler_common.go:299] "Volume detached for volume 
\"kube-api-access-nfvsl\" (UniqueName: \"kubernetes.io/projected/026e252b-ed46-4489-b5bb-876cda7f5870-kube-api-access-nfvsl\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\"" Apr 24 14:34:25.435504 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:25.435423 2571 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/026e252b-ed46-4489-b5bb-876cda7f5870-console-oauth-config\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\"" Apr 24 14:34:26.461315 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:34:26.461281 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="026e252b-ed46-4489-b5bb-876cda7f5870" path="/var/lib/kubelet/pods/026e252b-ed46-4489-b5bb-876cda7f5870/volumes" Apr 24 14:37:53.796019 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:37:53.795922 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-a4202-598ddcf66c-jmhdd"] Apr 24 14:37:53.796465 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:37:53.796450 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="026e252b-ed46-4489-b5bb-876cda7f5870" containerName="console" Apr 24 14:37:53.796508 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:37:53.796468 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="026e252b-ed46-4489-b5bb-876cda7f5870" containerName="console" Apr 24 14:37:53.796558 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:37:53.796548 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="026e252b-ed46-4489-b5bb-876cda7f5870" containerName="console" Apr 24 14:37:53.799773 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:37:53.799756 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-a4202-598ddcf66c-jmhdd" Apr 24 14:37:53.801946 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:37:53.801923 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-a4202-serving-cert\"" Apr 24 14:37:53.802107 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:37:53.802025 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-qcgw2\"" Apr 24 14:37:53.802107 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:37:53.802025 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 14:37:53.802536 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:37:53.802515 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-a4202-kube-rbac-proxy-sar-config\"" Apr 24 14:37:53.807526 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:37:53.807498 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-a4202-598ddcf66c-jmhdd"] Apr 24 14:37:53.874461 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:37:53.874432 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ceba8a83-9f4c-4aed-bac0-7b735085dba4-openshift-service-ca-bundle\") pod \"model-chainer-raw-a4202-598ddcf66c-jmhdd\" (UID: \"ceba8a83-9f4c-4aed-bac0-7b735085dba4\") " pod="kserve-ci-e2e-test/model-chainer-raw-a4202-598ddcf66c-jmhdd" Apr 24 14:37:53.874591 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:37:53.874493 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ceba8a83-9f4c-4aed-bac0-7b735085dba4-proxy-tls\") pod \"model-chainer-raw-a4202-598ddcf66c-jmhdd\" 
(UID: \"ceba8a83-9f4c-4aed-bac0-7b735085dba4\") " pod="kserve-ci-e2e-test/model-chainer-raw-a4202-598ddcf66c-jmhdd" Apr 24 14:37:53.975225 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:37:53.975199 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ceba8a83-9f4c-4aed-bac0-7b735085dba4-proxy-tls\") pod \"model-chainer-raw-a4202-598ddcf66c-jmhdd\" (UID: \"ceba8a83-9f4c-4aed-bac0-7b735085dba4\") " pod="kserve-ci-e2e-test/model-chainer-raw-a4202-598ddcf66c-jmhdd" Apr 24 14:37:53.975394 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:37:53.975254 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ceba8a83-9f4c-4aed-bac0-7b735085dba4-openshift-service-ca-bundle\") pod \"model-chainer-raw-a4202-598ddcf66c-jmhdd\" (UID: \"ceba8a83-9f4c-4aed-bac0-7b735085dba4\") " pod="kserve-ci-e2e-test/model-chainer-raw-a4202-598ddcf66c-jmhdd" Apr 24 14:37:53.975851 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:37:53.975832 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ceba8a83-9f4c-4aed-bac0-7b735085dba4-openshift-service-ca-bundle\") pod \"model-chainer-raw-a4202-598ddcf66c-jmhdd\" (UID: \"ceba8a83-9f4c-4aed-bac0-7b735085dba4\") " pod="kserve-ci-e2e-test/model-chainer-raw-a4202-598ddcf66c-jmhdd" Apr 24 14:37:53.977529 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:37:53.977508 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ceba8a83-9f4c-4aed-bac0-7b735085dba4-proxy-tls\") pod \"model-chainer-raw-a4202-598ddcf66c-jmhdd\" (UID: \"ceba8a83-9f4c-4aed-bac0-7b735085dba4\") " pod="kserve-ci-e2e-test/model-chainer-raw-a4202-598ddcf66c-jmhdd" Apr 24 14:37:54.110750 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:37:54.110725 2571 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-a4202-598ddcf66c-jmhdd" Apr 24 14:37:54.226330 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:37:54.226306 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-a4202-598ddcf66c-jmhdd"] Apr 24 14:37:54.229077 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:37:54.229051 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podceba8a83_9f4c_4aed_bac0_7b735085dba4.slice/crio-025b7a906a2a95003556c008313a0d353b47fbcbf3128a934828384aeb013333 WatchSource:0}: Error finding container 025b7a906a2a95003556c008313a0d353b47fbcbf3128a934828384aeb013333: Status 404 returned error can't find the container with id 025b7a906a2a95003556c008313a0d353b47fbcbf3128a934828384aeb013333 Apr 24 14:37:54.230838 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:37:54.230813 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 14:37:55.082295 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:37:55.082253 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-a4202-598ddcf66c-jmhdd" event={"ID":"ceba8a83-9f4c-4aed-bac0-7b735085dba4","Type":"ContainerStarted","Data":"025b7a906a2a95003556c008313a0d353b47fbcbf3128a934828384aeb013333"} Apr 24 14:37:57.090188 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:37:57.090144 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-a4202-598ddcf66c-jmhdd" event={"ID":"ceba8a83-9f4c-4aed-bac0-7b735085dba4","Type":"ContainerStarted","Data":"e2ed67229599a0a5743be806bb47fc1e67781cd814c0b979e9325e483ee40099"} Apr 24 14:37:57.090621 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:37:57.090214 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-a4202-598ddcf66c-jmhdd" Apr 24 
14:37:57.107602 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:37:57.107554 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-a4202-598ddcf66c-jmhdd" podStartSLOduration=1.470757646 podStartE2EDuration="4.107540757s" podCreationTimestamp="2026-04-24 14:37:53 +0000 UTC" firstStartedPulling="2026-04-24 14:37:54.231033113 +0000 UTC m=+810.361010659" lastFinishedPulling="2026-04-24 14:37:56.86781622 +0000 UTC m=+812.997793770" observedRunningTime="2026-04-24 14:37:57.105509301 +0000 UTC m=+813.235487097" watchObservedRunningTime="2026-04-24 14:37:57.107540757 +0000 UTC m=+813.237518355" Apr 24 14:38:03.099195 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:38:03.099169 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-a4202-598ddcf66c-jmhdd" Apr 24 14:38:03.858693 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:38:03.858658 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-a4202-598ddcf66c-jmhdd"] Apr 24 14:38:03.858947 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:38:03.858921 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-a4202-598ddcf66c-jmhdd" podUID="ceba8a83-9f4c-4aed-bac0-7b735085dba4" containerName="model-chainer-raw-a4202" containerID="cri-o://e2ed67229599a0a5743be806bb47fc1e67781cd814c0b979e9325e483ee40099" gracePeriod=30 Apr 24 14:38:08.097905 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:38:08.097863 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-a4202-598ddcf66c-jmhdd" podUID="ceba8a83-9f4c-4aed-bac0-7b735085dba4" containerName="model-chainer-raw-a4202" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:38:13.098839 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:38:13.098794 2571 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/model-chainer-raw-a4202-598ddcf66c-jmhdd" podUID="ceba8a83-9f4c-4aed-bac0-7b735085dba4" containerName="model-chainer-raw-a4202" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:38:18.097250 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:38:18.097210 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-a4202-598ddcf66c-jmhdd" podUID="ceba8a83-9f4c-4aed-bac0-7b735085dba4" containerName="model-chainer-raw-a4202" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:38:18.097678 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:38:18.097351 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-a4202-598ddcf66c-jmhdd" Apr 24 14:38:23.097738 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:38:23.097697 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-a4202-598ddcf66c-jmhdd" podUID="ceba8a83-9f4c-4aed-bac0-7b735085dba4" containerName="model-chainer-raw-a4202" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:38:28.098070 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:38:28.098025 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-a4202-598ddcf66c-jmhdd" podUID="ceba8a83-9f4c-4aed-bac0-7b735085dba4" containerName="model-chainer-raw-a4202" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:38:33.098134 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:38:33.098085 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-a4202-598ddcf66c-jmhdd" podUID="ceba8a83-9f4c-4aed-bac0-7b735085dba4" containerName="model-chainer-raw-a4202" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:38:34.210425 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:38:34.210339 2571 generic.go:358] "Generic 
(PLEG): container finished" podID="ceba8a83-9f4c-4aed-bac0-7b735085dba4" containerID="e2ed67229599a0a5743be806bb47fc1e67781cd814c0b979e9325e483ee40099" exitCode=0
Apr 24 14:38:34.210425 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:38:34.210395 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-a4202-598ddcf66c-jmhdd" event={"ID":"ceba8a83-9f4c-4aed-bac0-7b735085dba4","Type":"ContainerDied","Data":"e2ed67229599a0a5743be806bb47fc1e67781cd814c0b979e9325e483ee40099"}
Apr 24 14:38:34.507093 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:38:34.507060 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-a4202-598ddcf66c-jmhdd"
Apr 24 14:38:34.608215 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:38:34.608186 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ceba8a83-9f4c-4aed-bac0-7b735085dba4-proxy-tls\") pod \"ceba8a83-9f4c-4aed-bac0-7b735085dba4\" (UID: \"ceba8a83-9f4c-4aed-bac0-7b735085dba4\") "
Apr 24 14:38:34.608386 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:38:34.608239 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ceba8a83-9f4c-4aed-bac0-7b735085dba4-openshift-service-ca-bundle\") pod \"ceba8a83-9f4c-4aed-bac0-7b735085dba4\" (UID: \"ceba8a83-9f4c-4aed-bac0-7b735085dba4\") "
Apr 24 14:38:34.608648 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:38:34.608616 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ceba8a83-9f4c-4aed-bac0-7b735085dba4-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "ceba8a83-9f4c-4aed-bac0-7b735085dba4" (UID: "ceba8a83-9f4c-4aed-bac0-7b735085dba4"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 14:38:34.610463 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:38:34.610440 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceba8a83-9f4c-4aed-bac0-7b735085dba4-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ceba8a83-9f4c-4aed-bac0-7b735085dba4" (UID: "ceba8a83-9f4c-4aed-bac0-7b735085dba4"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 14:38:34.709186 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:38:34.709154 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ceba8a83-9f4c-4aed-bac0-7b735085dba4-proxy-tls\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\""
Apr 24 14:38:34.709186 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:38:34.709180 2571 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ceba8a83-9f4c-4aed-bac0-7b735085dba4-openshift-service-ca-bundle\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\""
Apr 24 14:38:35.214817 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:38:35.214782 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-a4202-598ddcf66c-jmhdd"
Apr 24 14:38:35.214817 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:38:35.214794 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-a4202-598ddcf66c-jmhdd" event={"ID":"ceba8a83-9f4c-4aed-bac0-7b735085dba4","Type":"ContainerDied","Data":"025b7a906a2a95003556c008313a0d353b47fbcbf3128a934828384aeb013333"}
Apr 24 14:38:35.215313 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:38:35.214837 2571 scope.go:117] "RemoveContainer" containerID="e2ed67229599a0a5743be806bb47fc1e67781cd814c0b979e9325e483ee40099"
Apr 24 14:38:35.237539 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:38:35.237509 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-a4202-598ddcf66c-jmhdd"]
Apr 24 14:38:35.243895 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:38:35.243875 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-a4202-598ddcf66c-jmhdd"]
Apr 24 14:38:36.460683 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:38:36.460641 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ceba8a83-9f4c-4aed-bac0-7b735085dba4" path="/var/lib/kubelet/pods/ceba8a83-9f4c-4aed-bac0-7b735085dba4/volumes"
Apr 24 14:39:24.398773 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:24.398745 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mk446_83b693cc-250b-45e7-b205-baf7f0feff6b/console-operator/2.log"
Apr 24 14:39:24.401265 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:24.401244 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mk446_83b693cc-250b-45e7-b205-baf7f0feff6b/console-operator/2.log"
Apr 24 14:39:24.402771 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:24.402749 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7m9zt_a7b9926d-4f53-4532-8669-16af4fc30cfd/ovn-acl-logging/0.log"
Apr 24 14:39:24.405248 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:24.405229 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7m9zt_a7b9926d-4f53-4532-8669-16af4fc30cfd/ovn-acl-logging/0.log"
Apr 24 14:39:34.147223 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:34.147186 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg"]
Apr 24 14:39:34.147577 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:34.147529 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ceba8a83-9f4c-4aed-bac0-7b735085dba4" containerName="model-chainer-raw-a4202"
Apr 24 14:39:34.147577 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:34.147539 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceba8a83-9f4c-4aed-bac0-7b735085dba4" containerName="model-chainer-raw-a4202"
Apr 24 14:39:34.147650 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:34.147608 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="ceba8a83-9f4c-4aed-bac0-7b735085dba4" containerName="model-chainer-raw-a4202"
Apr 24 14:39:34.151635 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:34.151614 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg"
Apr 24 14:39:34.153633 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:34.153608 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-29d74-kube-rbac-proxy-sar-config\""
Apr 24 14:39:34.153753 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:34.153659 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 24 14:39:34.153753 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:34.153617 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-qcgw2\""
Apr 24 14:39:34.153849 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:34.153610 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-29d74-serving-cert\""
Apr 24 14:39:34.158318 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:34.158290 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg"]
Apr 24 14:39:34.211091 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:34.211044 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b-proxy-tls\") pod \"model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg\" (UID: \"ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg"
Apr 24 14:39:34.211286 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:34.211113 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg\" (UID: \"ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg"
Apr 24 14:39:34.312410 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:34.312381 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg\" (UID: \"ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg"
Apr 24 14:39:34.312580 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:34.312558 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b-proxy-tls\") pod \"model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg\" (UID: \"ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg"
Apr 24 14:39:34.312693 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:39:34.312679 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-raw-hpa-29d74-serving-cert: secret "model-chainer-raw-hpa-29d74-serving-cert" not found
Apr 24 14:39:34.312753 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:39:34.312745 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b-proxy-tls podName:ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b nodeName:}" failed. No retries permitted until 2026-04-24 14:39:34.812729832 +0000 UTC m=+910.942707369 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b-proxy-tls") pod "model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg" (UID: "ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b") : secret "model-chainer-raw-hpa-29d74-serving-cert" not found
Apr 24 14:39:34.313120 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:34.313104 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg\" (UID: \"ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg"
Apr 24 14:39:34.816558 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:34.816514 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b-proxy-tls\") pod \"model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg\" (UID: \"ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg"
Apr 24 14:39:34.818977 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:34.818949 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b-proxy-tls\") pod \"model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg\" (UID: \"ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg"
Apr 24 14:39:35.064284 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:35.064246 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg"
Apr 24 14:39:35.187694 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:35.187665 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg"]
Apr 24 14:39:35.190405 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:39:35.190372 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef57f6a9_2ca9_4ed1_b191_d2f663ea1d2b.slice/crio-b2d21ef81046342ed75b042088542e56eb0e54c493a4af4939320dfa09f859f5 WatchSource:0}: Error finding container b2d21ef81046342ed75b042088542e56eb0e54c493a4af4939320dfa09f859f5: Status 404 returned error can't find the container with id b2d21ef81046342ed75b042088542e56eb0e54c493a4af4939320dfa09f859f5
Apr 24 14:39:35.416723 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:35.416682 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg" event={"ID":"ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b","Type":"ContainerStarted","Data":"aa0cb944f35a4659c5991987434c1cd86f2c8d06b2cb539e692beebc5d587f46"}
Apr 24 14:39:35.416723 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:35.416723 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg" event={"ID":"ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b","Type":"ContainerStarted","Data":"b2d21ef81046342ed75b042088542e56eb0e54c493a4af4939320dfa09f859f5"}
Apr 24 14:39:35.416959 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:35.416830 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg"
Apr 24 14:39:35.434538 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:35.434493 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg" podStartSLOduration=1.4344778169999999 podStartE2EDuration="1.434477817s" podCreationTimestamp="2026-04-24 14:39:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:39:35.433808945 +0000 UTC m=+911.563786506" watchObservedRunningTime="2026-04-24 14:39:35.434477817 +0000 UTC m=+911.564455377"
Apr 24 14:39:41.426663 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:41.426635 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg"
Apr 24 14:39:44.199561 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:44.199525 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg"]
Apr 24 14:39:44.199927 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:44.199739 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg" podUID="ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b" containerName="model-chainer-raw-hpa-29d74" containerID="cri-o://aa0cb944f35a4659c5991987434c1cd86f2c8d06b2cb539e692beebc5d587f46" gracePeriod=30
Apr 24 14:39:44.387542 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:44.387510 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-cf657-predictor-b44ccb865-rt4mg"]
Apr 24 14:39:44.390828 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:44.390802 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-cf657-predictor-b44ccb865-rt4mg"
Apr 24 14:39:44.398268 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:44.398225 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-cf657-predictor-b44ccb865-rt4mg"]
Apr 24 14:39:44.401892 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:44.401872 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-cf657-predictor-b44ccb865-rt4mg"
Apr 24 14:39:44.534284 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:44.534261 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-cf657-predictor-b44ccb865-rt4mg"]
Apr 24 14:39:44.536722 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:39:44.536694 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7c32f6f_cfaa_4d6f_937a_950a5560eb00.slice/crio-c9d5190251387df5c8e825c0fb206a0350e9b2dec9f22920a98dccbdd7b3da13 WatchSource:0}: Error finding container c9d5190251387df5c8e825c0fb206a0350e9b2dec9f22920a98dccbdd7b3da13: Status 404 returned error can't find the container with id c9d5190251387df5c8e825c0fb206a0350e9b2dec9f22920a98dccbdd7b3da13
Apr 24 14:39:45.451589 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:45.451546 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-cf657-predictor-b44ccb865-rt4mg" event={"ID":"d7c32f6f-cfaa-4d6f-937a-950a5560eb00","Type":"ContainerStarted","Data":"c9d5190251387df5c8e825c0fb206a0350e9b2dec9f22920a98dccbdd7b3da13"}
Apr 24 14:39:46.423474 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:46.423431 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg" podUID="ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b" containerName="model-chainer-raw-hpa-29d74" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:39:46.460755 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:46.460728 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-cf657-predictor-b44ccb865-rt4mg"
Apr 24 14:39:46.461143 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:46.460779 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-cf657-predictor-b44ccb865-rt4mg"
Apr 24 14:39:46.461143 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:46.460789 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-cf657-predictor-b44ccb865-rt4mg" event={"ID":"d7c32f6f-cfaa-4d6f-937a-950a5560eb00","Type":"ContainerStarted","Data":"0601210d4525435191f497c8a2ddf8ccade8a051f771f0f8a1dadf32d43cfa86"}
Apr 24 14:39:46.473150 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:46.473102 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-raw-cf657-predictor-b44ccb865-rt4mg" podStartSLOduration=1.277889671 podStartE2EDuration="2.473086741s" podCreationTimestamp="2026-04-24 14:39:44 +0000 UTC" firstStartedPulling="2026-04-24 14:39:44.538488578 +0000 UTC m=+920.668466117" lastFinishedPulling="2026-04-24 14:39:45.733685646 +0000 UTC m=+921.863663187" observedRunningTime="2026-04-24 14:39:46.471726758 +0000 UTC m=+922.601704317" watchObservedRunningTime="2026-04-24 14:39:46.473086741 +0000 UTC m=+922.603064300"
Apr 24 14:39:51.424092 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:51.424045 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg" podUID="ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b" containerName="model-chainer-raw-hpa-29d74" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:39:56.424036 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:56.423968 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg" podUID="ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b" containerName="model-chainer-raw-hpa-29d74" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:39:56.424501 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:39:56.424102 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg"
Apr 24 14:40:01.424169 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:40:01.424130 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg" podUID="ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b" containerName="model-chainer-raw-hpa-29d74" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:40:06.423560 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:40:06.423516 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg" podUID="ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b" containerName="model-chainer-raw-hpa-29d74" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:40:11.424496 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:40:11.424456 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg" podUID="ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b" containerName="model-chainer-raw-hpa-29d74" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:40:14.249100 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:40:14.249065 2571 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef57f6a9_2ca9_4ed1_b191_d2f663ea1d2b.slice/crio-aa0cb944f35a4659c5991987434c1cd86f2c8d06b2cb539e692beebc5d587f46.scope\": RecentStats: unable to find data in memory cache]"
Apr 24 14:40:14.337350 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:40:14.337327 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg"
Apr 24 14:40:14.452644 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:40:14.452615 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b-openshift-service-ca-bundle\") pod \"ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b\" (UID: \"ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b\") "
Apr 24 14:40:14.452794 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:40:14.452675 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b-proxy-tls\") pod \"ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b\" (UID: \"ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b\") "
Apr 24 14:40:14.452944 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:40:14.452923 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b" (UID: "ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 14:40:14.454694 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:40:14.454639 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b" (UID: "ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 14:40:14.553383 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:40:14.553357 2571 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b-openshift-service-ca-bundle\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\""
Apr 24 14:40:14.553520 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:40:14.553386 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b-proxy-tls\") on node \"ip-10-0-137-95.ec2.internal\" DevicePath \"\""
Apr 24 14:40:14.558883 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:40:14.558853 2571 generic.go:358] "Generic (PLEG): container finished" podID="ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b" containerID="aa0cb944f35a4659c5991987434c1cd86f2c8d06b2cb539e692beebc5d587f46" exitCode=0
Apr 24 14:40:14.559025 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:40:14.558883 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg" event={"ID":"ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b","Type":"ContainerDied","Data":"aa0cb944f35a4659c5991987434c1cd86f2c8d06b2cb539e692beebc5d587f46"}
Apr 24 14:40:14.559025 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:40:14.558919 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg"
Apr 24 14:40:14.559025 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:40:14.558930 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg" event={"ID":"ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b","Type":"ContainerDied","Data":"b2d21ef81046342ed75b042088542e56eb0e54c493a4af4939320dfa09f859f5"}
Apr 24 14:40:14.559025 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:40:14.558952 2571 scope.go:117] "RemoveContainer" containerID="aa0cb944f35a4659c5991987434c1cd86f2c8d06b2cb539e692beebc5d587f46"
Apr 24 14:40:14.570397 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:40:14.570385 2571 scope.go:117] "RemoveContainer" containerID="aa0cb944f35a4659c5991987434c1cd86f2c8d06b2cb539e692beebc5d587f46"
Apr 24 14:40:14.570680 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:40:14.570653 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa0cb944f35a4659c5991987434c1cd86f2c8d06b2cb539e692beebc5d587f46\": container with ID starting with aa0cb944f35a4659c5991987434c1cd86f2c8d06b2cb539e692beebc5d587f46 not found: ID does not exist" containerID="aa0cb944f35a4659c5991987434c1cd86f2c8d06b2cb539e692beebc5d587f46"
Apr 24 14:40:14.570773 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:40:14.570687 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa0cb944f35a4659c5991987434c1cd86f2c8d06b2cb539e692beebc5d587f46"} err="failed to get container status \"aa0cb944f35a4659c5991987434c1cd86f2c8d06b2cb539e692beebc5d587f46\": rpc error: code = NotFound desc = could not find container \"aa0cb944f35a4659c5991987434c1cd86f2c8d06b2cb539e692beebc5d587f46\": container with ID starting with aa0cb944f35a4659c5991987434c1cd86f2c8d06b2cb539e692beebc5d587f46 not found: ID does not exist"
Apr 24 14:40:14.578066 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:40:14.578045 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg"]
Apr 24 14:40:14.582897 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:40:14.582877 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-29d74-6d5c5cdccf-4q7cg"]
Apr 24 14:40:16.461679 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:40:16.461642 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b" path="/var/lib/kubelet/pods/ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b/volumes"
Apr 24 14:41:19.453956 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:41:19.453915 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-cf657-predictor-b44ccb865-rt4mg_d7c32f6f-cfaa-4d6f-937a-950a5560eb00/kserve-container/0.log"
Apr 24 14:41:19.783239 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:41:19.783154 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-cf657-predictor-b44ccb865-rt4mg"]
Apr 24 14:41:19.783399 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:41:19.783381 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-cf657-predictor-b44ccb865-rt4mg" podUID="d7c32f6f-cfaa-4d6f-937a-950a5560eb00" containerName="kserve-container" containerID="cri-o://0601210d4525435191f497c8a2ddf8ccade8a051f771f0f8a1dadf32d43cfa86" gracePeriod=30
Apr 24 14:41:20.032517 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:41:20.032495 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-cf657-predictor-b44ccb865-rt4mg"
Apr 24 14:41:20.784835 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:41:20.784745 2571 generic.go:358] "Generic (PLEG): container finished" podID="d7c32f6f-cfaa-4d6f-937a-950a5560eb00" containerID="0601210d4525435191f497c8a2ddf8ccade8a051f771f0f8a1dadf32d43cfa86" exitCode=2
Apr 24 14:41:20.784835 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:41:20.784797 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-cf657-predictor-b44ccb865-rt4mg" event={"ID":"d7c32f6f-cfaa-4d6f-937a-950a5560eb00","Type":"ContainerDied","Data":"0601210d4525435191f497c8a2ddf8ccade8a051f771f0f8a1dadf32d43cfa86"}
Apr 24 14:41:20.784835 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:41:20.784813 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-cf657-predictor-b44ccb865-rt4mg"
Apr 24 14:41:20.784835 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:41:20.784832 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-cf657-predictor-b44ccb865-rt4mg" event={"ID":"d7c32f6f-cfaa-4d6f-937a-950a5560eb00","Type":"ContainerDied","Data":"c9d5190251387df5c8e825c0fb206a0350e9b2dec9f22920a98dccbdd7b3da13"}
Apr 24 14:41:20.785443 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:41:20.784853 2571 scope.go:117] "RemoveContainer" containerID="0601210d4525435191f497c8a2ddf8ccade8a051f771f0f8a1dadf32d43cfa86"
Apr 24 14:41:20.792574 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:41:20.792543 2571 scope.go:117] "RemoveContainer" containerID="0601210d4525435191f497c8a2ddf8ccade8a051f771f0f8a1dadf32d43cfa86"
Apr 24 14:41:20.792812 ip-10-0-137-95 kubenswrapper[2571]: E0424 14:41:20.792792 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0601210d4525435191f497c8a2ddf8ccade8a051f771f0f8a1dadf32d43cfa86\": container with ID starting with 0601210d4525435191f497c8a2ddf8ccade8a051f771f0f8a1dadf32d43cfa86 not found: ID does not exist" containerID="0601210d4525435191f497c8a2ddf8ccade8a051f771f0f8a1dadf32d43cfa86"
Apr 24 14:41:20.792892 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:41:20.792825 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0601210d4525435191f497c8a2ddf8ccade8a051f771f0f8a1dadf32d43cfa86"} err="failed to get container status \"0601210d4525435191f497c8a2ddf8ccade8a051f771f0f8a1dadf32d43cfa86\": rpc error: code = NotFound desc = could not find container \"0601210d4525435191f497c8a2ddf8ccade8a051f771f0f8a1dadf32d43cfa86\": container with ID starting with 0601210d4525435191f497c8a2ddf8ccade8a051f771f0f8a1dadf32d43cfa86 not found: ID does not exist"
Apr 24 14:41:20.800551 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:41:20.800528 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-cf657-predictor-b44ccb865-rt4mg"]
Apr 24 14:41:20.804473 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:41:20.804451 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-cf657-predictor-b44ccb865-rt4mg"]
Apr 24 14:41:22.461254 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:41:22.461222 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7c32f6f-cfaa-4d6f-937a-950a5560eb00" path="/var/lib/kubelet/pods/d7c32f6f-cfaa-4d6f-937a-950a5560eb00/volumes"
Apr 24 14:44:24.420389 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:44:24.420356 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mk446_83b693cc-250b-45e7-b205-baf7f0feff6b/console-operator/2.log"
Apr 24 14:44:24.424448 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:44:24.424428 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7m9zt_a7b9926d-4f53-4532-8669-16af4fc30cfd/ovn-acl-logging/0.log"
Apr 24 14:44:24.424855 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:44:24.424835 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mk446_83b693cc-250b-45e7-b205-baf7f0feff6b/console-operator/2.log"
Apr 24 14:44:24.428822 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:44:24.428804 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7m9zt_a7b9926d-4f53-4532-8669-16af4fc30cfd/ovn-acl-logging/0.log"
Apr 24 14:48:40.323232 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:40.323196 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7d7dd/must-gather-z772m"]
Apr 24 14:48:40.323644 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:40.323585 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7c32f6f-cfaa-4d6f-937a-950a5560eb00" containerName="kserve-container"
Apr 24 14:48:40.323644 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:40.323600 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7c32f6f-cfaa-4d6f-937a-950a5560eb00" containerName="kserve-container"
Apr 24 14:48:40.323644 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:40.323621 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b" containerName="model-chainer-raw-hpa-29d74"
Apr 24 14:48:40.323644 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:40.323627 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b" containerName="model-chainer-raw-hpa-29d74"
Apr 24 14:48:40.323775 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:40.323688 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="d7c32f6f-cfaa-4d6f-937a-950a5560eb00" containerName="kserve-container"
Apr 24 14:48:40.323775 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:40.323701 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="ef57f6a9-2ca9-4ed1-b191-d2f663ea1d2b" containerName="model-chainer-raw-hpa-29d74"
Apr 24 14:48:40.326874 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:40.326858 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7d7dd/must-gather-z772m"
Apr 24 14:48:40.328644 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:40.328619 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7d7dd\"/\"openshift-service-ca.crt\""
Apr 24 14:48:40.328787 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:40.328760 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7d7dd\"/\"kube-root-ca.crt\""
Apr 24 14:48:40.328863 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:40.328769 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-7d7dd\"/\"default-dockercfg-64fdx\""
Apr 24 14:48:40.337275 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:40.337253 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7d7dd/must-gather-z772m"]
Apr 24 14:48:40.423574 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:40.423538 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkwws\" (UniqueName: \"kubernetes.io/projected/468e6ff2-966f-4a63-88b9-e8ee6f71cba3-kube-api-access-mkwws\") pod \"must-gather-z772m\" (UID: \"468e6ff2-966f-4a63-88b9-e8ee6f71cba3\") " pod="openshift-must-gather-7d7dd/must-gather-z772m"
Apr 24 14:48:40.423744 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:40.423665 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/468e6ff2-966f-4a63-88b9-e8ee6f71cba3-must-gather-output\") pod \"must-gather-z772m\" (UID: \"468e6ff2-966f-4a63-88b9-e8ee6f71cba3\") " pod="openshift-must-gather-7d7dd/must-gather-z772m"
Apr 24 14:48:40.524032 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:40.523978 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/468e6ff2-966f-4a63-88b9-e8ee6f71cba3-must-gather-output\") pod \"must-gather-z772m\" (UID: \"468e6ff2-966f-4a63-88b9-e8ee6f71cba3\") " pod="openshift-must-gather-7d7dd/must-gather-z772m"
Apr 24 14:48:40.524231 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:40.524143 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mkwws\" (UniqueName: \"kubernetes.io/projected/468e6ff2-966f-4a63-88b9-e8ee6f71cba3-kube-api-access-mkwws\") pod \"must-gather-z772m\" (UID: \"468e6ff2-966f-4a63-88b9-e8ee6f71cba3\") " pod="openshift-must-gather-7d7dd/must-gather-z772m"
Apr 24 14:48:40.524375 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:40.524354 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/468e6ff2-966f-4a63-88b9-e8ee6f71cba3-must-gather-output\") pod \"must-gather-z772m\" (UID: \"468e6ff2-966f-4a63-88b9-e8ee6f71cba3\") " pod="openshift-must-gather-7d7dd/must-gather-z772m"
Apr 24 14:48:40.533664 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:40.533644 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkwws\" (UniqueName: \"kubernetes.io/projected/468e6ff2-966f-4a63-88b9-e8ee6f71cba3-kube-api-access-mkwws\") pod \"must-gather-z772m\" (UID: \"468e6ff2-966f-4a63-88b9-e8ee6f71cba3\") " pod="openshift-must-gather-7d7dd/must-gather-z772m"
Apr 24 14:48:40.647679 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:40.647647 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7d7dd/must-gather-z772m"
Apr 24 14:48:40.767036 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:40.767012 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7d7dd/must-gather-z772m"]
Apr 24 14:48:40.769806 ip-10-0-137-95 kubenswrapper[2571]: W0424 14:48:40.769780 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod468e6ff2_966f_4a63_88b9_e8ee6f71cba3.slice/crio-0e0eb729824b0b77ea62a7ea673fb9c11b73022064c1a6200cf442e11467b5f8 WatchSource:0}: Error finding container 0e0eb729824b0b77ea62a7ea673fb9c11b73022064c1a6200cf442e11467b5f8: Status 404 returned error can't find the container with id 0e0eb729824b0b77ea62a7ea673fb9c11b73022064c1a6200cf442e11467b5f8
Apr 24 14:48:40.771788 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:40.771773 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 14:48:41.250463 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:41.250429 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7d7dd/must-gather-z772m" event={"ID":"468e6ff2-966f-4a63-88b9-e8ee6f71cba3","Type":"ContainerStarted","Data":"0e0eb729824b0b77ea62a7ea673fb9c11b73022064c1a6200cf442e11467b5f8"}
Apr 24 14:48:42.257470 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:42.257428 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7d7dd/must-gather-z772m" event={"ID":"468e6ff2-966f-4a63-88b9-e8ee6f71cba3","Type":"ContainerStarted","Data":"44e4428dd61a363def712a502595280cc0a19a222ca9aaef53beb1435bdadf22"}
Apr 24 14:48:42.257470 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:42.257475 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7d7dd/must-gather-z772m"
event={"ID":"468e6ff2-966f-4a63-88b9-e8ee6f71cba3","Type":"ContainerStarted","Data":"e2a0715c04c259b047d8ea61b53f28795428c351c3a9cc4f7976fd7c58fd26a7"} Apr 24 14:48:42.278012 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:42.277923 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7d7dd/must-gather-z772m" podStartSLOduration=1.141945097 podStartE2EDuration="2.277902703s" podCreationTimestamp="2026-04-24 14:48:40 +0000 UTC" firstStartedPulling="2026-04-24 14:48:40.771895997 +0000 UTC m=+1456.901873535" lastFinishedPulling="2026-04-24 14:48:41.907853599 +0000 UTC m=+1458.037831141" observedRunningTime="2026-04-24 14:48:42.275352634 +0000 UTC m=+1458.405330195" watchObservedRunningTime="2026-04-24 14:48:42.277902703 +0000 UTC m=+1458.407880266" Apr 24 14:48:43.590491 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:43.590462 2571 ???:1] "http: TLS handshake error from 10.0.143.92:54878: EOF" Apr 24 14:48:43.596533 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:43.596509 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-trzpj_a06dcc9a-a5d4-44ad-8f76-6943b3f58258/global-pull-secret-syncer/0.log" Apr 24 14:48:43.694536 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:43.694506 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-v9wvt_15cb0256-87b8-40a0-812d-7931f453c264/konnectivity-agent/0.log" Apr 24 14:48:43.785038 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:43.785003 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-95.ec2.internal_c07dfa5590a30bfb9dbec92d1fcd686b/haproxy/0.log" Apr 24 14:48:47.071713 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:47.071614 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_9a8c48b3-0448-4b17-b8a8-5e51dff99526/alertmanager/0.log" Apr 24 14:48:47.105604 ip-10-0-137-95 
kubenswrapper[2571]: I0424 14:48:47.105550 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_9a8c48b3-0448-4b17-b8a8-5e51dff99526/config-reloader/0.log" Apr 24 14:48:47.142582 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:47.142464 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_9a8c48b3-0448-4b17-b8a8-5e51dff99526/kube-rbac-proxy-web/0.log" Apr 24 14:48:47.178654 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:47.178615 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_9a8c48b3-0448-4b17-b8a8-5e51dff99526/kube-rbac-proxy/0.log" Apr 24 14:48:47.216335 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:47.216307 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_9a8c48b3-0448-4b17-b8a8-5e51dff99526/kube-rbac-proxy-metric/0.log" Apr 24 14:48:47.256629 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:47.256603 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_9a8c48b3-0448-4b17-b8a8-5e51dff99526/prom-label-proxy/0.log" Apr 24 14:48:47.290901 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:47.290870 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_9a8c48b3-0448-4b17-b8a8-5e51dff99526/init-config-reloader/0.log" Apr 24 14:48:47.341162 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:47.341105 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-8jnpj_75b83f1b-1fca-4a30-867f-a76c5b6bfe4f/cluster-monitoring-operator/0.log" Apr 24 14:48:47.372389 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:47.372354 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-bp2rg_76bc4d18-59d9-46a2-97c6-4b157dd5bd77/kube-state-metrics/0.log" Apr 24 14:48:47.400625 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:47.400547 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-bp2rg_76bc4d18-59d9-46a2-97c6-4b157dd5bd77/kube-rbac-proxy-main/0.log" Apr 24 14:48:47.434299 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:47.434267 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-bp2rg_76bc4d18-59d9-46a2-97c6-4b157dd5bd77/kube-rbac-proxy-self/0.log" Apr 24 14:48:47.707026 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:47.706930 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-v9p7d_2acef0d4-dc55-4e0c-8027-950b1d7e2fb1/node-exporter/0.log" Apr 24 14:48:47.738539 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:47.738509 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-v9p7d_2acef0d4-dc55-4e0c-8027-950b1d7e2fb1/kube-rbac-proxy/0.log" Apr 24 14:48:47.767036 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:47.767007 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-v9p7d_2acef0d4-dc55-4e0c-8027-950b1d7e2fb1/init-textfile/0.log" Apr 24 14:48:47.802897 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:47.802870 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-9g2xl_b0493229-a011-4a74-9ccf-6fde5588cde8/kube-rbac-proxy-main/0.log" Apr 24 14:48:47.832508 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:47.832473 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-9g2xl_b0493229-a011-4a74-9ccf-6fde5588cde8/kube-rbac-proxy-self/0.log" Apr 24 14:48:47.868410 ip-10-0-137-95 
kubenswrapper[2571]: I0424 14:48:47.868380 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-9g2xl_b0493229-a011-4a74-9ccf-6fde5588cde8/openshift-state-metrics/0.log" Apr 24 14:48:48.209899 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:48.209865 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-846d597597-fhql6_d92925ff-e3a0-4e32-85e8-5e09c60d3530/telemeter-client/0.log" Apr 24 14:48:48.252487 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:48.252453 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-846d597597-fhql6_d92925ff-e3a0-4e32-85e8-5e09c60d3530/reload/0.log" Apr 24 14:48:48.294789 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:48.294748 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-846d597597-fhql6_d92925ff-e3a0-4e32-85e8-5e09c60d3530/kube-rbac-proxy/0.log" Apr 24 14:48:49.521606 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:49.521576 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-jdhx4_03107b23-78b0-454f-9952-be259de46a01/networking-console-plugin/0.log" Apr 24 14:48:49.967857 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:49.967821 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mk446_83b693cc-250b-45e7-b205-baf7f0feff6b/console-operator/2.log" Apr 24 14:48:49.975628 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:49.975600 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mk446_83b693cc-250b-45e7-b205-baf7f0feff6b/console-operator/3.log" Apr 24 14:48:50.396925 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:50.396888 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_downloads-6bcc868b7-zc4wq_7e9f030d-42db-425f-8dec-4073b006cce9/download-server/0.log" Apr 24 14:48:50.572290 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:50.572256 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7d7dd/perf-node-gather-daemonset-m6vtk"] Apr 24 14:48:50.576965 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:50.576934 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-m6vtk" Apr 24 14:48:50.587692 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:50.587650 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7d7dd/perf-node-gather-daemonset-m6vtk"] Apr 24 14:48:50.631001 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:50.630899 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q5hc\" (UniqueName: \"kubernetes.io/projected/26417650-44e7-4851-a2e1-834ca3687459-kube-api-access-9q5hc\") pod \"perf-node-gather-daemonset-m6vtk\" (UID: \"26417650-44e7-4851-a2e1-834ca3687459\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-m6vtk" Apr 24 14:48:50.631181 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:50.631045 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/26417650-44e7-4851-a2e1-834ca3687459-proc\") pod \"perf-node-gather-daemonset-m6vtk\" (UID: \"26417650-44e7-4851-a2e1-834ca3687459\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-m6vtk" Apr 24 14:48:50.631181 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:50.631120 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/26417650-44e7-4851-a2e1-834ca3687459-sys\") pod \"perf-node-gather-daemonset-m6vtk\" (UID: 
\"26417650-44e7-4851-a2e1-834ca3687459\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-m6vtk" Apr 24 14:48:50.631321 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:50.631261 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/26417650-44e7-4851-a2e1-834ca3687459-lib-modules\") pod \"perf-node-gather-daemonset-m6vtk\" (UID: \"26417650-44e7-4851-a2e1-834ca3687459\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-m6vtk" Apr 24 14:48:50.631321 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:50.631307 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/26417650-44e7-4851-a2e1-834ca3687459-podres\") pod \"perf-node-gather-daemonset-m6vtk\" (UID: \"26417650-44e7-4851-a2e1-834ca3687459\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-m6vtk" Apr 24 14:48:50.732467 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:50.732381 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9q5hc\" (UniqueName: \"kubernetes.io/projected/26417650-44e7-4851-a2e1-834ca3687459-kube-api-access-9q5hc\") pod \"perf-node-gather-daemonset-m6vtk\" (UID: \"26417650-44e7-4851-a2e1-834ca3687459\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-m6vtk" Apr 24 14:48:50.732467 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:50.732460 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/26417650-44e7-4851-a2e1-834ca3687459-proc\") pod \"perf-node-gather-daemonset-m6vtk\" (UID: \"26417650-44e7-4851-a2e1-834ca3687459\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-m6vtk" Apr 24 14:48:50.732661 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:50.732491 2571 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/26417650-44e7-4851-a2e1-834ca3687459-sys\") pod \"perf-node-gather-daemonset-m6vtk\" (UID: \"26417650-44e7-4851-a2e1-834ca3687459\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-m6vtk" Apr 24 14:48:50.732661 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:50.732560 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/26417650-44e7-4851-a2e1-834ca3687459-lib-modules\") pod \"perf-node-gather-daemonset-m6vtk\" (UID: \"26417650-44e7-4851-a2e1-834ca3687459\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-m6vtk" Apr 24 14:48:50.732661 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:50.732583 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/26417650-44e7-4851-a2e1-834ca3687459-proc\") pod \"perf-node-gather-daemonset-m6vtk\" (UID: \"26417650-44e7-4851-a2e1-834ca3687459\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-m6vtk" Apr 24 14:48:50.732661 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:50.732591 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/26417650-44e7-4851-a2e1-834ca3687459-podres\") pod \"perf-node-gather-daemonset-m6vtk\" (UID: \"26417650-44e7-4851-a2e1-834ca3687459\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-m6vtk" Apr 24 14:48:50.732661 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:50.732648 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/26417650-44e7-4851-a2e1-834ca3687459-sys\") pod \"perf-node-gather-daemonset-m6vtk\" (UID: \"26417650-44e7-4851-a2e1-834ca3687459\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-m6vtk" Apr 24 14:48:50.732856 ip-10-0-137-95 kubenswrapper[2571]: I0424 
14:48:50.732694 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/26417650-44e7-4851-a2e1-834ca3687459-podres\") pod \"perf-node-gather-daemonset-m6vtk\" (UID: \"26417650-44e7-4851-a2e1-834ca3687459\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-m6vtk" Apr 24 14:48:50.732856 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:50.732728 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/26417650-44e7-4851-a2e1-834ca3687459-lib-modules\") pod \"perf-node-gather-daemonset-m6vtk\" (UID: \"26417650-44e7-4851-a2e1-834ca3687459\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-m6vtk" Apr 24 14:48:50.743604 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:50.743581 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q5hc\" (UniqueName: \"kubernetes.io/projected/26417650-44e7-4851-a2e1-834ca3687459-kube-api-access-9q5hc\") pod \"perf-node-gather-daemonset-m6vtk\" (UID: \"26417650-44e7-4851-a2e1-834ca3687459\") " pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-m6vtk" Apr 24 14:48:50.874913 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:50.874861 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-g7vcx_ead538d9-0357-4366-978f-3383d34778d6/volume-data-source-validator/0.log" Apr 24 14:48:50.887900 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:50.887877 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-m6vtk" Apr 24 14:48:51.038802 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:51.038767 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7d7dd/perf-node-gather-daemonset-m6vtk"] Apr 24 14:48:51.300147 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:51.300053 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-m6vtk" event={"ID":"26417650-44e7-4851-a2e1-834ca3687459","Type":"ContainerStarted","Data":"0634e55fe9ff55b8545a20a9379d004ad402e832a7e9783bc50fd68a2ee335e4"} Apr 24 14:48:51.300147 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:51.300102 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-m6vtk" event={"ID":"26417650-44e7-4851-a2e1-834ca3687459","Type":"ContainerStarted","Data":"47b0a955efe846b02ffed561f60043e8f50606d3a1453899187e04df434ca7cd"} Apr 24 14:48:51.300400 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:51.300310 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-m6vtk" Apr 24 14:48:51.319058 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:51.318963 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-m6vtk" podStartSLOduration=1.31894299 podStartE2EDuration="1.31894299s" podCreationTimestamp="2026-04-24 14:48:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:48:51.316192109 +0000 UTC m=+1467.446169670" watchObservedRunningTime="2026-04-24 14:48:51.31894299 +0000 UTC m=+1467.448920551" Apr 24 14:48:51.633201 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:51.633173 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_dns-default-752tj_5b9edb70-aaea-4a5d-bd77-289fb7865065/dns/0.log" Apr 24 14:48:51.659933 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:51.659905 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-752tj_5b9edb70-aaea-4a5d-bd77-289fb7865065/kube-rbac-proxy/0.log" Apr 24 14:48:51.798766 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:51.798739 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-bmpbb_b97fcfbf-fa5a-4b45-8446-f25172d545bb/dns-node-resolver/0.log" Apr 24 14:48:52.378030 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:52.378003 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-hgntn_87bcc27f-4b5f-48a3-9ae3-f93cb520eea0/node-ca/0.log" Apr 24 14:48:53.171433 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:53.171402 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-dc7c85968-65n67_fbe79ce3-ee9a-4d46-a8e8-345e1e315824/router/0.log" Apr 24 14:48:53.550273 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:53.550196 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-k2t2r_21ba7795-411f-46a3-93c5-fedef51a27ea/serve-healthcheck-canary/0.log" Apr 24 14:48:54.005126 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:54.005088 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-rkjfd_6f64bec9-dd77-4222-8388-3b584743cfa7/insights-operator/1.log" Apr 24 14:48:54.005658 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:54.005632 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-rkjfd_6f64bec9-dd77-4222-8388-3b584743cfa7/insights-operator/0.log" Apr 24 14:48:54.031673 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:54.031639 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-hp5nm_95fb1bf9-aa81-4ad5-950e-fec708dadd4a/kube-rbac-proxy/0.log" Apr 24 14:48:54.059117 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:54.059090 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hp5nm_95fb1bf9-aa81-4ad5-950e-fec708dadd4a/exporter/0.log" Apr 24 14:48:54.087125 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:54.087089 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hp5nm_95fb1bf9-aa81-4ad5-950e-fec708dadd4a/extractor/0.log" Apr 24 14:48:56.225043 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:56.225009 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-49r8b_9541f4ff-d53d-4def-bbc2-e736c381a1dd/manager/0.log" Apr 24 14:48:56.273950 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:56.273918 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-2p7sn_4686071d-b58f-45b7-88ab-296a3c044797/server/0.log" Apr 24 14:48:56.421125 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:56.421090 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-bd8lf_ac069a16-3258-4ed3-b927-268b5187b5be/manager/0.log" Apr 24 14:48:57.316925 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:48:57.316894 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-7d7dd/perf-node-gather-daemonset-m6vtk" Apr 24 14:49:02.145107 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:49:02.145077 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-42l4z_6c91c545-ee17-43ad-8a08-42be9b2cda48/kube-multus-additional-cni-plugins/0.log" Apr 24 14:49:02.172996 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:49:02.172896 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-42l4z_6c91c545-ee17-43ad-8a08-42be9b2cda48/egress-router-binary-copy/0.log" Apr 24 14:49:02.201524 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:49:02.201489 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-42l4z_6c91c545-ee17-43ad-8a08-42be9b2cda48/cni-plugins/0.log" Apr 24 14:49:02.227875 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:49:02.227849 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-42l4z_6c91c545-ee17-43ad-8a08-42be9b2cda48/bond-cni-plugin/0.log" Apr 24 14:49:02.254616 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:49:02.254588 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-42l4z_6c91c545-ee17-43ad-8a08-42be9b2cda48/routeoverride-cni/0.log" Apr 24 14:49:02.282859 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:49:02.282836 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-42l4z_6c91c545-ee17-43ad-8a08-42be9b2cda48/whereabouts-cni-bincopy/0.log" Apr 24 14:49:02.311066 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:49:02.311038 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-42l4z_6c91c545-ee17-43ad-8a08-42be9b2cda48/whereabouts-cni/0.log" Apr 24 14:49:02.748775 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:49:02.748742 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gwdlv_5b200c2b-497d-4b24-9fcb-1ed9a2b007c1/kube-multus/0.log" Apr 24 14:49:02.896339 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:49:02.896311 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-fsnj5_a022a0ca-5e80-43a6-8ee0-69dcf197d1a8/network-metrics-daemon/0.log" Apr 24 14:49:02.920517 ip-10-0-137-95 kubenswrapper[2571]: I0424 
14:49:02.920488 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-fsnj5_a022a0ca-5e80-43a6-8ee0-69dcf197d1a8/kube-rbac-proxy/0.log" Apr 24 14:49:03.801508 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:49:03.801480 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7m9zt_a7b9926d-4f53-4532-8669-16af4fc30cfd/ovn-controller/0.log" Apr 24 14:49:03.825574 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:49:03.825539 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7m9zt_a7b9926d-4f53-4532-8669-16af4fc30cfd/ovn-acl-logging/0.log" Apr 24 14:49:03.840814 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:49:03.840771 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7m9zt_a7b9926d-4f53-4532-8669-16af4fc30cfd/ovn-acl-logging/1.log" Apr 24 14:49:03.868098 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:49:03.868074 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7m9zt_a7b9926d-4f53-4532-8669-16af4fc30cfd/kube-rbac-proxy-node/0.log" Apr 24 14:49:03.895033 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:49:03.894979 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7m9zt_a7b9926d-4f53-4532-8669-16af4fc30cfd/kube-rbac-proxy-ovn-metrics/0.log" Apr 24 14:49:03.916274 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:49:03.916246 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7m9zt_a7b9926d-4f53-4532-8669-16af4fc30cfd/northd/0.log" Apr 24 14:49:03.950069 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:49:03.950038 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7m9zt_a7b9926d-4f53-4532-8669-16af4fc30cfd/nbdb/0.log" Apr 24 14:49:03.976845 ip-10-0-137-95 kubenswrapper[2571]: I0424 
14:49:03.976808 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7m9zt_a7b9926d-4f53-4532-8669-16af4fc30cfd/sbdb/0.log" Apr 24 14:49:04.159901 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:49:04.159867 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7m9zt_a7b9926d-4f53-4532-8669-16af4fc30cfd/ovnkube-controller/0.log" Apr 24 14:49:06.052004 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:49:06.051966 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-7dqjb_ef39bb3c-1f6c-4174-b851-2464c20d74cf/check-endpoints/0.log" Apr 24 14:49:06.112820 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:49:06.112794 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-dgzq2_4ebd1686-821a-4fb0-b091-8a636b80f78e/network-check-target-container/0.log" Apr 24 14:49:07.198666 ip-10-0-137-95 kubenswrapper[2571]: I0424 14:49:07.198635 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-j9xv9_865ca2f7-3380-490c-b40d-e9c4fb7c799a/iptables-alerter/0.log"