Apr 22 19:21:17.313709 ip-10-0-134-22 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 19:21:17.313718 ip-10-0-134-22 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 19:21:17.313726 ip-10-0-134-22 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 19:21:17.313937 ip-10-0-134-22 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 19:21:27.386095 ip-10-0-134-22 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 19:21:27.386111 ip-10-0-134-22 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot bc27f919e9c5423c973d09a89e17177f --
Apr 22 19:23:42.581063 ip-10-0-134-22 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 19:23:43.081314 ip-10-0-134-22 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:23:43.081314 ip-10-0-134-22 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 19:23:43.081314 ip-10-0-134-22 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:23:43.081314 ip-10-0-134-22 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 19:23:43.081314 ip-10-0-134-22 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:23:43.085926 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.085836    2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 19:23:43.088163 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088147    2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:23:43.088200 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088164    2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:23:43.088200 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088168    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:23:43.088200 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088171    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:23:43.088200 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088174    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:23:43.088200 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088177    2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:23:43.088200 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088181    2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:23:43.088200 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088185    2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:23:43.088200 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088188    2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:23:43.088200 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088191    2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:23:43.088200 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088194    2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:23:43.088200 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088196    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:23:43.088200 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088200    2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:23:43.088200 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088202    2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:23:43.088200 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088205    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:23:43.088545 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088208    2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:23:43.088545 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088211    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:23:43.088545 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088214    2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:23:43.088545 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088217    2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:23:43.088545 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088220    2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:23:43.088545 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088222    2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:23:43.088545 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088225    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:23:43.088545 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088227    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:23:43.088545 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088231    2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:23:43.088545 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088235    2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:23:43.088545 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088238    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:23:43.088545 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088241    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:23:43.088545 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088245    2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:23:43.088545 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088248    2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:23:43.088545 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088251    2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:23:43.088545 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088254    2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:23:43.088545 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088256    2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:23:43.088545 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088259    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:23:43.088545 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088262    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:23:43.089018 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088265    2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:23:43.089018 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088268    2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:23:43.089018 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088271    2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:23:43.089018 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088273    2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:23:43.089018 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088276    2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:23:43.089018 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088279    2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:23:43.089018 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088281    2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:23:43.089018 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088284    2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:23:43.089018 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088287    2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:23:43.089018 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088290    2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:23:43.089018 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088292    2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:23:43.089018 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088295    2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:23:43.089018 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088297    2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:23:43.089018 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088301    2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:23:43.089018 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088304    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:23:43.089018 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088307    2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:23:43.089018 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088310    2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:23:43.089018 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088312    2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:23:43.089018 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088315    2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:23:43.089018 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088317    2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:23:43.089524 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088320    2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:23:43.089524 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088322    2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:23:43.089524 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088325    2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:23:43.089524 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088327    2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:23:43.089524 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088330    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:23:43.089524 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088332    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:23:43.089524 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088335    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:23:43.089524 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088338    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:23:43.089524 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088340    2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:23:43.089524 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088343    2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:23:43.089524 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088345    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:23:43.089524 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088348    2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:23:43.089524 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088351    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:23:43.089524 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088353    2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:23:43.089524 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088356    2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:23:43.089524 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088358    2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:23:43.089524 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088361    2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:23:43.089524 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088363    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:23:43.089524 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088366    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:23:43.089524 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088368    2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:23:43.090058 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088371    2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:23:43.090058 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088373    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:23:43.090058 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088376    2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:23:43.090058 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088378    2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:23:43.090058 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088381    2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:23:43.090058 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088384    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:23:43.090058 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088387    2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:23:43.090058 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088390    2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:23:43.090058 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088392    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:23:43.090058 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088395    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:23:43.090058 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088398    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:23:43.090058 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088402    2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:23:43.090058 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088771    2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:23:43.090058 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088789    2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:23:43.090058 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088793    2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:23:43.090058 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088795    2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:23:43.090058 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088798    2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:23:43.090058 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088801    2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:23:43.090058 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088805    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:23:43.090058 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088807    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:23:43.090550 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088811    2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:23:43.090550 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088815    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:23:43.090550 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088818    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:23:43.090550 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088821    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:23:43.090550 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088824    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:23:43.090550 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088826    2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:23:43.090550 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088829    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:23:43.090550 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088832    2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:23:43.090550 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088835    2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:23:43.090550 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088837    2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:23:43.090550 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088840    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:23:43.090550 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088842    2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:23:43.090550 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088845    2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:23:43.090550 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088848    2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:23:43.090550 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088851    2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:23:43.090550 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088853    2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:23:43.090550 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088856    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:23:43.090550 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088859    2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:23:43.090550 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088862    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:23:43.091033 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088865    2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:23:43.091033 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088869    2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:23:43.091033 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088873    2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:23:43.091033 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088876    2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:23:43.091033 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088880    2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:23:43.091033 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088883    2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:23:43.091033 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088886    2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:23:43.091033 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088888    2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:23:43.091033 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088891    2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:23:43.091033 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088894    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:23:43.091033 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088897    2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:23:43.091033 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088900    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:23:43.091033 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088904    2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:23:43.091033 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088907    2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:23:43.091033 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088909    2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:23:43.091033 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088912    2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:23:43.091033 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088915    2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:23:43.091033 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088918    2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:23:43.091033 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088921    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:23:43.091033 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088923    2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:23:43.091529 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088926    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:23:43.091529 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088928    2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:23:43.091529 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088931    2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:23:43.091529 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088933    2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:23:43.091529 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088936    2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:23:43.091529 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088938    2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:23:43.091529 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088941    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:23:43.091529 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088943    2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:23:43.091529 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088946    2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:23:43.091529 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088949    2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:23:43.091529 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088952    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:23:43.091529 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088955    2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:23:43.091529 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088957    2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:23:43.091529 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088960    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:23:43.091529 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088962    2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:23:43.091529 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088965    2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:23:43.091529 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088968    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:23:43.091529 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088970    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:23:43.091529 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088973    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:23:43.091529 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088976    2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:23:43.092041 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088978    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:23:43.092041 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088980    2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:23:43.092041 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088983    2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:23:43.092041 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088986    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:23:43.092041 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088988    2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:23:43.092041 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088991    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:23:43.092041 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088994    2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:23:43.092041 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088997    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:23:43.092041 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.088999    2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:23:43.092041 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.089002    2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:23:43.092041 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.089005    2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:23:43.092041 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.089008    2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:23:43.092041 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.089010    2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:23:43.092041 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.089013    2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:23:43.092041 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.089016    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:23:43.092041 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.089018    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:23:43.092041 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.089021    2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:23:43.092041 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.089024    2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:23:43.092041 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.089026    2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:23:43.092041 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090501    2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 19:23:43.092541 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090510    2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 19:23:43.092541 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090517    2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 19:23:43.092541 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090521    2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 19:23:43.092541 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090526    2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 19:23:43.092541 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090529    2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 19:23:43.092541 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090533    2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 19:23:43.092541 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090538    2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 19:23:43.092541 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090541    2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 19:23:43.092541 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090544    2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 19:23:43.092541 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090547    2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 19:23:43.092541 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090550    2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 19:23:43.092541 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090554    2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 19:23:43.092541 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090557    2574 flags.go:64] FLAG: --cgroup-root=""
Apr 22 19:23:43.092541 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090560    2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 19:23:43.092541 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090563    2574 flags.go:64] FLAG: --client-ca-file=""
Apr 22 19:23:43.092541 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090566    2574 flags.go:64] FLAG: --cloud-config=""
Apr 22 19:23:43.092541 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090569    2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 19:23:43.092541 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090572    2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 19:23:43.092541 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090577    2574 flags.go:64] FLAG: --cluster-domain=""
Apr 22 19:23:43.092541 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090581    2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 19:23:43.092541 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090584    2574 flags.go:64] FLAG: --config-dir=""
Apr 22 19:23:43.092541 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090587    2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 19:23:43.092541 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090590    2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 19:23:43.092541 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090594    2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 19:23:43.093132 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090597    2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 19:23:43.093132 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090600    2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 19:23:43.093132 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090603    2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 19:23:43.093132 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090606    2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 19:23:43.093132 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090609    2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 19:23:43.093132 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090612    2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 19:23:43.093132 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090615    2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 19:23:43.093132 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090619    2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 19:23:43.093132 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090624    2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 19:23:43.093132 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090627    2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 19:23:43.093132 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090630    2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 19:23:43.093132 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090633    2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 19:23:43.093132 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090636    2574 flags.go:64] FLAG: --enable-server="true"
Apr 22 19:23:43.093132 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090639    2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 19:23:43.093132 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090643    2574 flags.go:64] FLAG: --event-burst="100"
Apr 22 19:23:43.093132 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090647    2574 flags.go:64] FLAG: --event-qps="50"
Apr 22 19:23:43.093132 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090649    2574 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 19:23:43.093132 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090653    2574 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 19:23:43.093132 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090656    2574 flags.go:64] FLAG: --eviction-hard=""
Apr 22 19:23:43.093132 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090660    2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 19:23:43.093132 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090663    2574 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 19:23:43.093132 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090666    2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 19:23:43.093132 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090669    2574 flags.go:64] FLAG: --eviction-soft=""
Apr 22 19:23:43.093132 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090672    2574 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 19:23:43.093132 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090675    2574 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 19:23:43.093752 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090678    2574 flags.go:64] FLAG:
--experimental-allocatable-ignore-eviction="false" Apr 22 19:23:43.093752 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090681 2574 flags.go:64] FLAG: --experimental-mounter-path="" Apr 22 19:23:43.093752 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090685 2574 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 19:23:43.093752 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090688 2574 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 19:23:43.093752 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090691 2574 flags.go:64] FLAG: --feature-gates="" Apr 22 19:23:43.093752 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090695 2574 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 19:23:43.093752 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090698 2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 19:23:43.093752 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090701 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 19:23:43.093752 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090704 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 19:23:43.093752 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090707 2574 flags.go:64] FLAG: --healthz-port="10248" Apr 22 19:23:43.093752 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090711 2574 flags.go:64] FLAG: --help="false" Apr 22 19:23:43.093752 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090714 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-134-22.ec2.internal" Apr 22 19:23:43.093752 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090717 2574 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 19:23:43.093752 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090720 2574 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 19:23:43.093752 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090723 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 
22 19:23:43.093752 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090727 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 19:23:43.093752 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090730 2574 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 19:23:43.093752 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090733 2574 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 19:23:43.093752 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090736 2574 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 19:23:43.093752 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090739 2574 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 19:23:43.093752 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090742 2574 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 19:23:43.093752 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090745 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 19:23:43.093752 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090748 2574 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 19:23:43.093752 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090751 2574 flags.go:64] FLAG: --kube-reserved="" Apr 22 19:23:43.094359 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090754 2574 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 19:23:43.094359 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090756 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 19:23:43.094359 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090760 2574 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 19:23:43.094359 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090762 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 19:23:43.094359 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090766 2574 flags.go:64] FLAG: --lock-file="" Apr 22 19:23:43.094359 
ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090768 2574 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 19:23:43.094359 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090771 2574 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 19:23:43.094359 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090774 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 19:23:43.094359 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090792 2574 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 19:23:43.094359 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090795 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 19:23:43.094359 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090798 2574 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 19:23:43.094359 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090801 2574 flags.go:64] FLAG: --logging-format="text" Apr 22 19:23:43.094359 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090805 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 19:23:43.094359 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090808 2574 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 19:23:43.094359 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090811 2574 flags.go:64] FLAG: --manifest-url="" Apr 22 19:23:43.094359 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090814 2574 flags.go:64] FLAG: --manifest-url-header="" Apr 22 19:23:43.094359 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090823 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 19:23:43.094359 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090840 2574 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 19:23:43.094359 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090845 2574 flags.go:64] FLAG: --max-pods="110" Apr 22 19:23:43.094359 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090849 2574 flags.go:64] FLAG: 
--maximum-dead-containers="-1" Apr 22 19:23:43.094359 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090852 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 19:23:43.094359 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090856 2574 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 19:23:43.094359 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090859 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 19:23:43.094359 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090863 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 19:23:43.094359 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090866 2574 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 19:23:43.095044 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090869 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 19:23:43.095044 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090877 2574 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 19:23:43.095044 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090880 2574 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 19:23:43.095044 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090883 2574 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 19:23:43.095044 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090886 2574 flags.go:64] FLAG: --pod-cidr="" Apr 22 19:23:43.095044 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090889 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 19:23:43.095044 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090894 2574 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 19:23:43.095044 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090897 2574 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 19:23:43.095044 ip-10-0-134-22 kubenswrapper[2574]: 
I0422 19:23:43.090900 2574 flags.go:64] FLAG: --pods-per-core="0" Apr 22 19:23:43.095044 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090903 2574 flags.go:64] FLAG: --port="10250" Apr 22 19:23:43.095044 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090906 2574 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 19:23:43.095044 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090909 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0c1fae6eaf76c315a" Apr 22 19:23:43.095044 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090912 2574 flags.go:64] FLAG: --qos-reserved="" Apr 22 19:23:43.095044 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090915 2574 flags.go:64] FLAG: --read-only-port="10255" Apr 22 19:23:43.095044 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090918 2574 flags.go:64] FLAG: --register-node="true" Apr 22 19:23:43.095044 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090921 2574 flags.go:64] FLAG: --register-schedulable="true" Apr 22 19:23:43.095044 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090923 2574 flags.go:64] FLAG: --register-with-taints="" Apr 22 19:23:43.095044 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090927 2574 flags.go:64] FLAG: --registry-burst="10" Apr 22 19:23:43.095044 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090930 2574 flags.go:64] FLAG: --registry-qps="5" Apr 22 19:23:43.095044 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090933 2574 flags.go:64] FLAG: --reserved-cpus="" Apr 22 19:23:43.095044 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090936 2574 flags.go:64] FLAG: --reserved-memory="" Apr 22 19:23:43.095044 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090940 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 19:23:43.095044 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090943 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 19:23:43.095044 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090946 2574 
flags.go:64] FLAG: --rotate-certificates="false" Apr 22 19:23:43.095044 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090948 2574 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 19:23:43.095642 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090951 2574 flags.go:64] FLAG: --runonce="false" Apr 22 19:23:43.095642 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090954 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 19:23:43.095642 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090957 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 19:23:43.095642 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090960 2574 flags.go:64] FLAG: --seccomp-default="false" Apr 22 19:23:43.095642 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090963 2574 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 19:23:43.095642 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090968 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 19:23:43.095642 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090972 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 19:23:43.095642 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090974 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 19:23:43.095642 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090978 2574 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 19:23:43.095642 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090981 2574 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 19:23:43.095642 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090984 2574 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 19:23:43.095642 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090987 2574 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 19:23:43.095642 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090989 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 
19:23:43.095642 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090993 2574 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 19:23:43.095642 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090996 2574 flags.go:64] FLAG: --system-cgroups="" Apr 22 19:23:43.095642 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.090999 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 19:23:43.095642 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.091004 2574 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 19:23:43.095642 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.091007 2574 flags.go:64] FLAG: --tls-cert-file="" Apr 22 19:23:43.095642 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.091010 2574 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 19:23:43.095642 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.091015 2574 flags.go:64] FLAG: --tls-min-version="" Apr 22 19:23:43.095642 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.091018 2574 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 19:23:43.095642 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.091020 2574 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 19:23:43.095642 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.091023 2574 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 19:23:43.095642 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.091026 2574 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 19:23:43.095642 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.091029 2574 flags.go:64] FLAG: --v="2" Apr 22 19:23:43.096253 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.091033 2574 flags.go:64] FLAG: --version="false" Apr 22 19:23:43.096253 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.091037 2574 flags.go:64] FLAG: --vmodule="" Apr 22 19:23:43.096253 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.091042 2574 flags.go:64] FLAG: 
--volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 19:23:43.096253 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.091045 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 19:23:43.096253 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091138 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:23:43.096253 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091141 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:23:43.096253 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091145 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:23:43.096253 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091148 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:23:43.096253 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091151 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:23:43.096253 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091153 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:23:43.096253 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091156 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:23:43.096253 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091159 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:23:43.096253 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091163 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:23:43.096253 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091166 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:23:43.096253 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091168 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 
19:23:43.096253 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091171 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:23:43.096253 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091173 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:23:43.096253 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091176 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:23:43.096253 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091179 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:23:43.096253 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091181 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:23:43.096904 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091184 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:23:43.096904 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091187 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:23:43.096904 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091189 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:23:43.096904 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091192 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:23:43.096904 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091194 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:23:43.096904 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091197 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:23:43.096904 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091199 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:23:43.096904 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091202 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 
19:23:43.096904 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091205 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:23:43.096904 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091207 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:23:43.096904 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091210 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:23:43.096904 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091214 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:23:43.096904 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091218 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 19:23:43.096904 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091222 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:23:43.096904 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091225 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:23:43.096904 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091229 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:23:43.096904 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091232 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:23:43.096904 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091236 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 19:23:43.096904 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091240 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:23:43.097716 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091243 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:23:43.097716 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091246 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:23:43.097716 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091248 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:23:43.097716 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091251 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:23:43.097716 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091253 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:23:43.097716 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091257 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:23:43.097716 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091260 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:23:43.097716 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091262 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:23:43.097716 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091265 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:23:43.097716 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091267 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:23:43.097716 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091270 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:23:43.097716 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091273 2574 feature_gate.go:328] 
unrecognized feature gate: NetworkSegmentation Apr 22 19:23:43.097716 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091275 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:23:43.097716 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091278 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:23:43.097716 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091281 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:23:43.097716 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091283 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:23:43.097716 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091286 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:23:43.097716 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091288 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:23:43.097716 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091291 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:23:43.097716 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091294 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:23:43.098258 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091296 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 19:23:43.098258 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091299 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:23:43.098258 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091301 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:23:43.098258 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091304 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:23:43.098258 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091308 2574 
feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:23:43.098258 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091310 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:23:43.098258 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091313 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:23:43.098258 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091315 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:23:43.098258 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091318 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:23:43.098258 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091321 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:23:43.098258 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091323 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:23:43.098258 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091325 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:23:43.098258 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091328 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:23:43.098258 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091330 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:23:43.098258 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091333 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:23:43.098258 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091335 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:23:43.098258 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091338 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:23:43.098258 ip-10-0-134-22 kubenswrapper[2574]: W0422 
19:23:43.091342 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:23:43.098258 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091344 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:23:43.098258 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091347 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:23:43.098796 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091349 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:23:43.098796 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091352 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:23:43.098796 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091354 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:23:43.098796 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091357 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:23:43.098796 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091359 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:23:43.098796 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091362 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:23:43.098796 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091365 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:23:43.098796 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091367 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:23:43.098796 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091370 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:23:43.098796 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091372 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:23:43.098796 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.091375 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:23:43.098796 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.091382 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:23:43.098796 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.098705 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 19:23:43.098796 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.098723 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 19:23:43.098796 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098772 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:23:43.099170 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098793 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:23:43.099170 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098797 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:23:43.099170 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098800 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:23:43.099170 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098803 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:23:43.099170 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098806 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:23:43.099170 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098810 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:23:43.099170 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098813 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:23:43.099170 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098815 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:23:43.099170 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098818 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:23:43.099170 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098822 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:23:43.099170 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098827 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:23:43.099170 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098830 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:23:43.099170 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098833 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:23:43.099170 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098836 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:23:43.099170 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098839 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:23:43.099170 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098841 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:23:43.099170 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098844 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:23:43.099170 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098846 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:23:43.099170 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098849 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:23:43.099633 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098852 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:23:43.099633 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098854 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:23:43.099633 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098857 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:23:43.099633 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098860 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:23:43.099633 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098863 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:23:43.099633 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098866 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:23:43.099633 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098868 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:23:43.099633 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098871 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:23:43.099633 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098874 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:23:43.099633 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098877 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:23:43.099633 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098880 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:23:43.099633 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098882 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:23:43.099633 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098885 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:23:43.099633 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098887 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:23:43.099633 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098890 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:23:43.099633 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098894 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:23:43.099633 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098898 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:23:43.099633 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098901 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:23:43.099633 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098905 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:23:43.099633 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098908 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:23:43.100131 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098910 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:23:43.100131 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098913 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:23:43.100131 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098916 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:23:43.100131 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098919 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:23:43.100131 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098921 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:23:43.100131 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098924 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:23:43.100131 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098926 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:23:43.100131 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098929 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:23:43.100131 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098932 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:23:43.100131 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098934 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:23:43.100131 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098937 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:23:43.100131 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098940 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:23:43.100131 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098942 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:23:43.100131 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098945 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:23:43.100131 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098948 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:23:43.100131 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098950 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:23:43.100131 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098953 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:23:43.100131 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098956 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:23:43.100131 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098958 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:23:43.100131 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098961 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:23:43.100651 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098963 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:23:43.100651 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098966 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:23:43.100651 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098969 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:23:43.100651 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098971 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:23:43.100651 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098974 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:23:43.100651 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098976 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:23:43.100651 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098979 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:23:43.100651 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098982 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:23:43.100651 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098984 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:23:43.100651 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098987 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:23:43.100651 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098989 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:23:43.100651 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098992 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:23:43.100651 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098994 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:23:43.100651 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098996 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:23:43.100651 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.098999 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:23:43.100651 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099002 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:23:43.100651 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099005 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:23:43.100651 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099007 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:23:43.100651 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099010 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:23:43.100651 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099013 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:23:43.101235 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099015 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:23:43.101235 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099017 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:23:43.101235 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099020 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:23:43.101235 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099022 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:23:43.101235 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099025 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:23:43.101235 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099028 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:23:43.101235 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.099033 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:23:43.101235 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099125 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:23:43.101235 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099130 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:23:43.101235 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099133 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:23:43.101235 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099135 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:23:43.101235 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099138 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:23:43.101235 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099141 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:23:43.101235 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099143 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:23:43.101235 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099146 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:23:43.101618 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099149 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:23:43.101618 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099152 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:23:43.101618 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099154 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:23:43.101618 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099157 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:23:43.101618 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099160 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:23:43.101618 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099163 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:23:43.101618 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099166 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:23:43.101618 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099168 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:23:43.101618 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099171 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:23:43.101618 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099174 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:23:43.101618 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099178 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:23:43.101618 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099182 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:23:43.101618 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099187 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:23:43.101618 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099190 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:23:43.101618 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099194 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:23:43.101618 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099197 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:23:43.101618 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099200 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:23:43.101618 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099202 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:23:43.101618 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099205 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:23:43.102115 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099208 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:23:43.102115 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099210 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:23:43.102115 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099213 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:23:43.102115 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099215 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:23:43.102115 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099218 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:23:43.102115 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099221 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:23:43.102115 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099223 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:23:43.102115 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099225 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:23:43.102115 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099228 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:23:43.102115 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099230 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:23:43.102115 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099233 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:23:43.102115 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099235 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:23:43.102115 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099238 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:23:43.102115 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099240 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:23:43.102115 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099244 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:23:43.102115 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099246 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:23:43.102115 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099250 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:23:43.102115 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099252 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:23:43.102115 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099255 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:23:43.102115 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099258 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:23:43.102617 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099260 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:23:43.102617 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099263 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:23:43.102617 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099265 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:23:43.102617 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099268 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:23:43.102617 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099270 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:23:43.102617 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099273 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:23:43.102617 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099276 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:23:43.102617 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099278 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:23:43.102617 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099281 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:23:43.102617 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099284 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:23:43.102617 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099286 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:23:43.102617 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099289 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:23:43.102617 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099291 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:23:43.102617 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099294 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:23:43.102617 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099296 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:23:43.102617 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099299 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:23:43.102617 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099301 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:23:43.102617 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099304 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:23:43.102617 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099306 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:23:43.102617 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099309 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:23:43.103120 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099311 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:23:43.103120 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099314 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:23:43.103120 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099316 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:23:43.103120 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099319 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:23:43.103120 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099322 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:23:43.103120 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099324 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:23:43.103120 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099327 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:23:43.103120 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099330 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:23:43.103120 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099333 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:23:43.103120 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099335 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:23:43.103120 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099338 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:23:43.103120 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099340 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:23:43.103120 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099343 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:23:43.103120 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099345 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:23:43.103120 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099348 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:23:43.103120 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099350 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:23:43.103120 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099353 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:23:43.103120 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099355 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:23:43.103120 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:43.099358 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:23:43.103590 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.099363 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:23:43.103590 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.100154 2574 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 19:23:43.105589 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.105574 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 19:23:43.106857 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.106822 2574 server.go:1019] "Starting client certificate rotation"
Apr 22 19:23:43.106951 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.106933 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 19:23:43.106986 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.106980 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 19:23:43.136602 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.136584 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 19:23:43.139414 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.139395 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 19:23:43.157113 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.157092 2574 log.go:25] "Validated CRI v1 runtime API"
Apr 22 19:23:43.163208 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.163175 2574 log.go:25] "Validated CRI v1 image API"
Apr 22 19:23:43.165841 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.165822 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 19:23:43.168426 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.168405 2574 fs.go:135] Filesystem UUIDs: map[40fa70c5-d96d-4635-b4d5-3e3ac185a435:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 fffd8d10-4ae4-48ae-8332-c324f5b4848f:/dev/nvme0n1p3]
Apr 22 19:23:43.168503 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.168424 2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 19:23:43.171767 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.171745 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 19:23:43.173416 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.173318 2574 manager.go:217] Machine: {Timestamp:2026-04-22 19:23:43.172377664 +0000 UTC m=+0.456048128 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3095497 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2299e5d314f68e1e6ae99323b1d940 SystemUUID:ec2299e5-d314-f68e-1e6a-e99323b1d940 BootID:bc27f919-e9c5-423c-973d-09a89e17177f Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:dd:9f:5b:26:ff Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:dd:9f:5b:26:ff Speed:0 Mtu:9001} {Name:ovs-system MacAddress:6a:23:62:3a:79:59 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 19:23:43.173416 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.173416 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 19:23:43.173526 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.173495 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 19:23:43.174720 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.174698 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 19:23:43.174869 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.174723 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-22.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 19:23:43.174913 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.174880 2574 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 19:23:43.174913 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.174889 2574 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 19:23:43.174913 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.174902 2574
manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 19:23:43.175532 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.175522 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 19:23:43.176356 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.176347 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 22 19:23:43.176465 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.176456 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 19:23:43.179285 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.179276 2574 kubelet.go:491] "Attempting to sync node with API server" Apr 22 19:23:43.179317 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.179290 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 19:23:43.179317 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.179302 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 19:23:43.179317 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.179311 2574 kubelet.go:397] "Adding apiserver pod source" Apr 22 19:23:43.179434 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.179320 2574 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 19:23:43.180480 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.180469 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 19:23:43.180515 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.180487 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 19:23:43.183692 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.183676 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 19:23:43.186359 ip-10-0-134-22 
kubenswrapper[2574]: I0422 19:23:43.186346 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 19:23:43.188700 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.188689 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 19:23:43.188746 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.188706 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 19:23:43.188746 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.188712 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 19:23:43.188746 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.188719 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 19:23:43.188746 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.188725 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 19:23:43.188746 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.188732 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 19:23:43.188746 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.188738 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 19:23:43.188746 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.188744 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 19:23:43.188942 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.188751 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 19:23:43.188942 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.188758 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 19:23:43.188942 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.188771 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 19:23:43.188942 
ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.188795 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 19:23:43.189507 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.189498 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 19:23:43.189540 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.189507 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 19:23:43.192988 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.192975 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 19:23:43.193051 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.193012 2574 server.go:1295] "Started kubelet" Apr 22 19:23:43.193131 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.193107 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 19:23:43.193185 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.193110 2574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 19:23:43.193185 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.193173 2574 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 19:23:43.193853 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:43.193830 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-22.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 19:23:43.193923 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:43.193908 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 19:23:43.193955 
ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.193935 2574 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-22.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 19:23:43.193984 ip-10-0-134-22 systemd[1]: Started Kubernetes Kubelet. Apr 22 19:23:43.195666 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.195651 2574 server.go:317] "Adding debug handlers to kubelet server" Apr 22 19:23:43.197328 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.197312 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 19:23:43.205156 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:43.203948 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-22.ec2.internal.18a8c43de9d87734 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-22.ec2.internal,UID:ip-10-0-134-22.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-134-22.ec2.internal,},FirstTimestamp:2026-04-22 19:23:43.192987444 +0000 UTC m=+0.476657909,LastTimestamp:2026-04-22 19:23:43.192987444 +0000 UTC m=+0.476657909,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-22.ec2.internal,}" Apr 22 19:23:43.206549 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:43.206527 2574 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 19:23:43.209033 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.209002 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 19:23:43.209615 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.209595 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 19:23:43.210199 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.210174 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 19:23:43.210199 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.210176 2574 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 19:23:43.210335 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.210203 2574 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 19:23:43.210335 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.210329 2574 reconstruct.go:97] "Volume reconstruction finished" Apr 22 19:23:43.210402 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.210340 2574 reconciler.go:26] "Reconciler: start to sync state" Apr 22 19:23:43.210453 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:43.210432 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-22.ec2.internal\" not found" Apr 22 19:23:43.211082 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.211067 2574 factory.go:55] Registering systemd factory Apr 22 19:23:43.211082 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.211084 2574 factory.go:223] Registration of the systemd container factory successfully Apr 22 19:23:43.211314 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:43.211287 2574 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-134-22.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group 
\"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 22 19:23:43.211408 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.211332 2574 factory.go:153] Registering CRI-O factory Apr 22 19:23:43.211408 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.211348 2574 factory.go:223] Registration of the crio container factory successfully Apr 22 19:23:43.211408 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:43.211327 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 19:23:43.211538 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.211415 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 19:23:43.211538 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.211434 2574 factory.go:103] Registering Raw factory Apr 22 19:23:43.211538 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.211450 2574 manager.go:1196] Started watching for new ooms in manager Apr 22 19:23:43.212113 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.212097 2574 manager.go:319] Starting recovery of all containers Apr 22 19:23:43.218025 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.217860 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-cgtcg" Apr 22 19:23:43.220508 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.220491 2574 manager.go:324] Recovery completed Apr 22 19:23:43.224876 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.224862 2574 kubelet_node_status.go:413] "Setting node 
annotation to enable volume controller attach/detach" Apr 22 19:23:43.225996 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.225980 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-cgtcg" Apr 22 19:23:43.227244 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.227230 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:23:43.227311 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.227258 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:23:43.227311 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.227272 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:23:43.227760 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.227746 2574 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 19:23:43.227760 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.227758 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 19:23:43.227900 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.227791 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 22 19:23:43.229748 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.229733 2574 policy_none.go:49] "None policy: Start" Apr 22 19:23:43.229853 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.229753 2574 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 19:23:43.229853 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.229767 2574 state_mem.go:35] "Initializing new in-memory state store" Apr 22 19:23:43.278339 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.269406 2574 manager.go:341] "Starting Device Plugin manager" Apr 22 19:23:43.278339 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:43.269436 2574 manager.go:517] "Failed to 
read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 19:23:43.278339 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.269446 2574 server.go:85] "Starting device plugin registration server" Apr 22 19:23:43.278339 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.269748 2574 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 19:23:43.278339 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.269761 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 19:23:43.278339 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.269853 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 19:23:43.278339 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.269942 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 19:23:43.278339 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.269952 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 19:23:43.278339 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:43.270449 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 19:23:43.278339 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:43.270489 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-22.ec2.internal\" not found" Apr 22 19:23:43.343327 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.343244 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 19:23:43.344511 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.344498 2574 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 22 19:23:43.344585 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.344524 2574 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 19:23:43.344585 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.344551 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 19:23:43.344585 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.344559 2574 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 19:23:43.344724 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:43.344595 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 19:23:43.348871 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.348848 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:23:43.369923 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.369903 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:23:43.370827 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.370809 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:23:43.370928 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.370838 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:23:43.370928 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.370849 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:23:43.370928 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.370872 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-22.ec2.internal" Apr 22 19:23:43.382212 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.382194 2574 
kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-22.ec2.internal" Apr 22 19:23:43.382265 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:43.382217 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-22.ec2.internal\": node \"ip-10-0-134-22.ec2.internal\" not found" Apr 22 19:23:43.405198 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:43.405169 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-22.ec2.internal\" not found" Apr 22 19:23:43.444886 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.444836 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-134-22.ec2.internal"] Apr 22 19:23:43.444978 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.444933 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:23:43.447733 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.447717 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:23:43.447838 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.447744 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:23:43.447838 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.447754 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:23:43.448955 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.448942 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:23:43.449092 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.449079 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal" Apr 22 19:23:43.449133 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.449108 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:23:43.449702 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.449688 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:23:43.449773 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.449710 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:23:43.449773 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.449720 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:23:43.449773 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.449732 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:23:43.449773 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.449752 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:23:43.449773 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.449762 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:23:43.450895 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.450882 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-22.ec2.internal" Apr 22 19:23:43.450959 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.450907 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:23:43.451644 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.451619 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:23:43.451705 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.451647 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:23:43.451705 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.451658 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:23:43.480506 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:43.480484 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-22.ec2.internal\" not found" node="ip-10-0-134-22.ec2.internal" Apr 22 19:23:43.484887 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:43.484872 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-22.ec2.internal\" not found" node="ip-10-0-134-22.ec2.internal" Apr 22 19:23:43.505502 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:43.505481 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-22.ec2.internal\" not found" Apr 22 19:23:43.512842 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.512819 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f546dccbfe88d958c8bad79dd015e11c-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal\" (UID: \"f546dccbfe88d958c8bad79dd015e11c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal" Apr 22 19:23:43.512902 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.512850 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f546dccbfe88d958c8bad79dd015e11c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal\" (UID: \"f546dccbfe88d958c8bad79dd015e11c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal" Apr 22 19:23:43.512902 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.512867 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1b90ee820fd4186f1e6cd40d24ef3276-config\") pod \"kube-apiserver-proxy-ip-10-0-134-22.ec2.internal\" (UID: \"1b90ee820fd4186f1e6cd40d24ef3276\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-22.ec2.internal" Apr 22 19:23:43.605665 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:43.605596 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-22.ec2.internal\" not found" Apr 22 19:23:43.613948 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.613925 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f546dccbfe88d958c8bad79dd015e11c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal\" (UID: \"f546dccbfe88d958c8bad79dd015e11c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal" Apr 22 19:23:43.614006 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.613954 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/f546dccbfe88d958c8bad79dd015e11c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal\" (UID: \"f546dccbfe88d958c8bad79dd015e11c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal"
Apr 22 19:23:43.614006 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.613971 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1b90ee820fd4186f1e6cd40d24ef3276-config\") pod \"kube-apiserver-proxy-ip-10-0-134-22.ec2.internal\" (UID: \"1b90ee820fd4186f1e6cd40d24ef3276\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-22.ec2.internal"
Apr 22 19:23:43.614077 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.614024 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f546dccbfe88d958c8bad79dd015e11c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal\" (UID: \"f546dccbfe88d958c8bad79dd015e11c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal"
Apr 22 19:23:43.614117 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.614093 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1b90ee820fd4186f1e6cd40d24ef3276-config\") pod \"kube-apiserver-proxy-ip-10-0-134-22.ec2.internal\" (UID: \"1b90ee820fd4186f1e6cd40d24ef3276\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-22.ec2.internal"
Apr 22 19:23:43.614160 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.614133 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f546dccbfe88d958c8bad79dd015e11c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal\" (UID: \"f546dccbfe88d958c8bad79dd015e11c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal"
Apr 22 19:23:43.706339 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:43.706298 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-22.ec2.internal\" not found"
Apr 22 19:23:43.782852 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.782824 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal"
Apr 22 19:23:43.787232 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:43.787211 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-22.ec2.internal"
Apr 22 19:23:43.806997 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:43.806973 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-22.ec2.internal\" not found"
Apr 22 19:23:43.907619 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:43.907537 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-22.ec2.internal\" not found"
Apr 22 19:23:44.008065 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:44.008036 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-22.ec2.internal\" not found"
Apr 22 19:23:44.106637 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:44.106602 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 19:23:44.107181 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:44.106756 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 19:23:44.108765 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:44.108744 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-22.ec2.internal\" not found"
Apr 22 19:23:44.151818 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:44.151773 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:23:44.209531 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:44.209465 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-22.ec2.internal\" not found"
Apr 22 19:23:44.209531 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:44.209503 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 19:23:44.224098 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:44.224074 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 19:23:44.229080 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:44.229054 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 19:18:43 +0000 UTC" deadline="2027-12-26 18:51:36.154501979 +0000 UTC"
Apr 22 19:23:44.229170 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:44.229080 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14711h27m51.925424582s"
Apr 22 19:23:44.249560 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:44.249535 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-727ql"
Apr 22 19:23:44.256295 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:44.256264 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-727ql"
Apr 22 19:23:44.309951 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:44.309923 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-22.ec2.internal\" not found"
Apr 22 19:23:44.319987 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:44.319959 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b90ee820fd4186f1e6cd40d24ef3276.slice/crio-d5aa1576e84917273436c4a12271f62324b8bb349d25afb5e6862a3ca2fb19e4 WatchSource:0}: Error finding container d5aa1576e84917273436c4a12271f62324b8bb349d25afb5e6862a3ca2fb19e4: Status 404 returned error can't find the container with id d5aa1576e84917273436c4a12271f62324b8bb349d25afb5e6862a3ca2fb19e4
Apr 22 19:23:44.320457 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:44.320443 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf546dccbfe88d958c8bad79dd015e11c.slice/crio-c178d5972788bd0eb62295104172bc76fbced008be89cf0c2100cc715afee072 WatchSource:0}: Error finding container c178d5972788bd0eb62295104172bc76fbced008be89cf0c2100cc715afee072: Status 404 returned error can't find the container with id c178d5972788bd0eb62295104172bc76fbced008be89cf0c2100cc715afee072
Apr 22 19:23:44.324273 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:44.324258 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:23:44.347490 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:44.347448 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-22.ec2.internal" event={"ID":"1b90ee820fd4186f1e6cd40d24ef3276","Type":"ContainerStarted","Data":"d5aa1576e84917273436c4a12271f62324b8bb349d25afb5e6862a3ca2fb19e4"}
Apr 22 19:23:44.348396 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:44.348374 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal" event={"ID":"f546dccbfe88d958c8bad79dd015e11c","Type":"ContainerStarted","Data":"c178d5972788bd0eb62295104172bc76fbced008be89cf0c2100cc715afee072"}
Apr 22 19:23:44.410629 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:44.410599 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-22.ec2.internal\" not found"
Apr 22 19:23:44.511086 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:44.511057 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-22.ec2.internal\" not found"
Apr 22 19:23:44.611533 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:44.611500 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-22.ec2.internal\" not found"
Apr 22 19:23:44.712128 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:44.712093 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:23:44.712298 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:44.712272 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-22.ec2.internal\" not found"
Apr 22 19:23:44.771013 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:44.770936 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:23:44.810524 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:44.810487 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal"
Apr 22 19:23:44.827265 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:44.827238 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 19:23:44.828814 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:44.828569 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-22.ec2.internal"
Apr 22 19:23:44.837372 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:44.837253 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 19:23:45.181106 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.181024 2574 apiserver.go:52] "Watching apiserver"
Apr 22 19:23:45.191097 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.191070 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 19:23:45.193390 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.193365 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-284wh","openshift-ovn-kubernetes/ovnkube-node-pqkn7","kube-system/kube-apiserver-proxy-ip-10-0-134-22.ec2.internal","openshift-cluster-node-tuning-operator/tuned-89wlr","openshift-image-registry/node-ca-95pbb","openshift-multus/multus-additional-cni-plugins-z6hh4","openshift-multus/network-metrics-daemon-mdzdp","kube-system/konnectivity-agent-ns824","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2rzjr","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal","openshift-multus/multus-gkjzf","openshift-network-diagnostics/network-check-target-jzltp"]
Apr 22 19:23:45.194622 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.194602 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-284wh"
Apr 22 19:23:45.195974 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.195839 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7"
Apr 22 19:23:45.196844 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.196757 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-89wlr"
Apr 22 19:23:45.197774 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.197742 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 19:23:45.197880 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.197751 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-xnlqg\""
Apr 22 19:23:45.197880 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.197823 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 19:23:45.197880 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.197865 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 19:23:45.198199 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.198170 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-95pbb"
Apr 22 19:23:45.198474 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.198452 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 19:23:45.198732 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.198707 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 19:23:45.199073 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.199054 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 19:23:45.199158 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.199142 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 19:23:45.199273 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.199202 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 19:23:45.199273 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.199226 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-9b8wr\""
Apr 22 19:23:45.199273 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.199262 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 19:23:45.199397 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.199361 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-l955b\""
Apr 22 19:23:45.199431 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.199397 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 19:23:45.199559 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.199543 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 19:23:45.200476 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.200458 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-z6hh4"
Apr 22 19:23:45.200587 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.200549 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdzdp"
Apr 22 19:23:45.200646 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:45.200612 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdzdp" podUID="93109160-9bbe-497f-9b25-d7fa7e08508f"
Apr 22 19:23:45.200950 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.200932 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 19:23:45.201047 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.200974 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 19:23:45.201047 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.201029 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-v5nhj\""
Apr 22 19:23:45.201047 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.201039 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 19:23:45.201973 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.201957 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-ns824"
Apr 22 19:23:45.202735 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.202718 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 19:23:45.203805 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.203724 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 19:23:45.204054 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.203888 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 19:23:45.204054 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.203995 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-wbbcm\""
Apr 22 19:23:45.204054 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.204048 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 19:23:45.204288 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.204267 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-wm46v\""
Apr 22 19:23:45.204389 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.204318 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 19:23:45.204573 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.204550 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 19:23:45.205167 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.204713 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2rzjr"
Apr 22 19:23:45.205167 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.204816 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 19:23:45.207384 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.207366 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 19:23:45.207850 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.207833 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 19:23:45.207961 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.207864 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-97wkg\""
Apr 22 19:23:45.208064 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.208047 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 19:23:45.208878 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.208859 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-gkjzf"
Apr 22 19:23:45.210105 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.210089 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jzltp"
Apr 22 19:23:45.210205 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:45.210162 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jzltp" podUID="9bbee64a-2154-4ea2-9299-c15d3614e769"
Apr 22 19:23:45.211226 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.211208 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 22 19:23:45.213677 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.213660 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 19:23:45.213933 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.213912 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-vqgbg\""
Apr 22 19:23:45.222125 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.222101 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2b0ddba0-98a1-4870-acbb-832572b5d62b-lib-modules\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr"
Apr 22 19:23:45.222246 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.222136 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2b0ddba0-98a1-4870-acbb-832572b5d62b-var-lib-kubelet\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr"
Apr 22 19:23:45.222246 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.222170 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-etc-openvswitch\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7"
Apr 22 19:23:45.222246 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.222199 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-hostroot\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf"
Apr 22 19:23:45.222246 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.222222 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/20682e94-9131-4a8d-a325-b9f45d2fd64f-os-release\") pod \"multus-additional-cni-plugins-z6hh4\" (UID: \"20682e94-9131-4a8d-a325-b9f45d2fd64f\") " pod="openshift-multus/multus-additional-cni-plugins-z6hh4"
Apr 22 19:23:45.222246 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.222244 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-multus-cni-dir\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf"
Apr 22 19:23:45.222537 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.222268 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-var-lib-openvswitch\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7"
Apr 22 19:23:45.222537 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.222321 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-log-socket\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7"
Apr 22 19:23:45.222537 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.222350 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-run-systemd\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7"
Apr 22 19:23:45.222537 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.222383 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-host-cni-netd\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7"
Apr 22 19:23:45.222537 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.222406 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93109160-9bbe-497f-9b25-d7fa7e08508f-metrics-certs\") pod \"network-metrics-daemon-mdzdp\" (UID: \"93109160-9bbe-497f-9b25-d7fa7e08508f\") " pod="openshift-multus/network-metrics-daemon-mdzdp"
Apr 22 19:23:45.222537 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.222427 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4b878ced-b265-4741-a08e-35a9b9c87d5a-socket-dir\") pod \"aws-ebs-csi-driver-node-2rzjr\" (UID: \"4b878ced-b265-4741-a08e-35a9b9c87d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2rzjr"
Apr 22 19:23:45.222537 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.222470 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq7bf\" (UniqueName: \"kubernetes.io/projected/b9527658-a6b8-4270-ae5d-f451e61ca79f-kube-api-access-kq7bf\") pod \"iptables-alerter-284wh\" (UID: \"b9527658-a6b8-4270-ae5d-f451e61ca79f\") " pod="openshift-network-operator/iptables-alerter-284wh"
Apr 22 19:23:45.222537 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.222524 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4blq2\" (UniqueName: \"kubernetes.io/projected/2b0ddba0-98a1-4870-acbb-832572b5d62b-kube-api-access-4blq2\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr"
Apr 22 19:23:45.222886 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.222553 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/06227928-4a6e-4e0e-b991-1f9a395b21c4-ovnkube-config\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7"
Apr 22 19:23:45.222886 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.222640 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/20682e94-9131-4a8d-a325-b9f45d2fd64f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-z6hh4\" (UID: \"20682e94-9131-4a8d-a325-b9f45d2fd64f\") " pod="openshift-multus/multus-additional-cni-plugins-z6hh4"
Apr 22 19:23:45.222886 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.222675 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sckkq\" (UniqueName: \"kubernetes.io/projected/4b878ced-b265-4741-a08e-35a9b9c87d5a-kube-api-access-sckkq\") pod \"aws-ebs-csi-driver-node-2rzjr\" (UID: \"4b878ced-b265-4741-a08e-35a9b9c87d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2rzjr"
Apr 22 19:23:45.222886 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.222697 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2b0ddba0-98a1-4870-acbb-832572b5d62b-etc-kubernetes\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr"
Apr 22 19:23:45.222886 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.222715 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/4b878ced-b265-4741-a08e-35a9b9c87d5a-sys-fs\") pod \"aws-ebs-csi-driver-node-2rzjr\" (UID: \"4b878ced-b265-4741-a08e-35a9b9c87d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2rzjr"
Apr 22 19:23:45.222886 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.222739 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2b0ddba0-98a1-4870-acbb-832572b5d62b-etc-sysctl-conf\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr"
Apr 22 19:23:45.222886 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.222764 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b0ddba0-98a1-4870-acbb-832572b5d62b-host\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr"
Apr 22 19:23:45.222886 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.222804 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-system-cni-dir\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf"
Apr 22 19:23:45.222886 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.222829 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-host-cni-bin\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7"
Apr 22 19:23:45.222886 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.222853 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/06227928-4a6e-4e0e-b991-1f9a395b21c4-ovn-node-metrics-cert\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7"
Apr 22 19:23:45.223323 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.222897 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/4b878ced-b265-4741-a08e-35a9b9c87d5a-etc-selinux\") pod \"aws-ebs-csi-driver-node-2rzjr\" (UID: \"4b878ced-b265-4741-a08e-35a9b9c87d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2rzjr"
Apr 22 19:23:45.223323 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.222931 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/20682e94-9131-4a8d-a325-b9f45d2fd64f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-z6hh4\" (UID: \"20682e94-9131-4a8d-a325-b9f45d2fd64f\") " pod="openshift-multus/multus-additional-cni-plugins-z6hh4"
Apr 22 19:23:45.223323 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.222964 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cfg7\" (UniqueName: \"kubernetes.io/projected/93109160-9bbe-497f-9b25-d7fa7e08508f-kube-api-access-4cfg7\") pod \"network-metrics-daemon-mdzdp\" (UID: \"93109160-9bbe-497f-9b25-d7fa7e08508f\") " pod="openshift-multus/network-metrics-daemon-mdzdp"
Apr 22 19:23:45.223323 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.222993 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/68be7f81-7c86-4929-92a2-0347981c9140-multus-daemon-config\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf"
Apr 22 19:23:45.223323 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.223018 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-host-kubelet\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7"
Apr 22 19:23:45.223323 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.223057 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-run-ovn\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7"
Apr 22 19:23:45.223323 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.223092 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/20682e94-9131-4a8d-a325-b9f45d2fd64f-system-cni-dir\") pod \"multus-additional-cni-plugins-z6hh4\" (UID: \"20682e94-9131-4a8d-a325-b9f45d2fd64f\") " pod="openshift-multus/multus-additional-cni-plugins-z6hh4"
Apr 22 19:23:45.223323 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.223119 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b9527658-a6b8-4270-ae5d-f451e61ca79f-host-slash\") pod \"iptables-alerter-284wh\" (UID: \"b9527658-a6b8-4270-ae5d-f451e61ca79f\") " pod="openshift-network-operator/iptables-alerter-284wh"
Apr 22 19:23:45.223323 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.223143 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2b0ddba0-98a1-4870-acbb-832572b5d62b-etc-sysconfig\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr"
Apr 22 19:23:45.223323 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.223165 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e586a32a-1d89-4ae4-a0b6-0667215a50e4-konnectivity-ca\") pod \"konnectivity-agent-ns824\" (UID: \"e586a32a-1d89-4ae4-a0b6-0667215a50e4\") " pod="kube-system/konnectivity-agent-ns824"
Apr 22 19:23:45.223323 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.223187 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-multus-socket-dir-parent\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf"
Apr 22 19:23:45.223323 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.223208 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-host-var-lib-kubelet\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf"
Apr 22 19:23:45.223323 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.223245 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-etc-kubernetes\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf"
Apr 22 19:23:45.223870 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.223466 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-host-run-ovn-kubernetes\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7"
Apr 22 19:23:45.223870 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.223553 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/20682e94-9131-4a8d-a325-b9f45d2fd64f-cni-binary-copy\") pod \"multus-additional-cni-plugins-z6hh4\" (UID: \"20682e94-9131-4a8d-a325-b9f45d2fd64f\") " pod="openshift-multus/multus-additional-cni-plugins-z6hh4"
Apr 22 19:23:45.223870 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.223596 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2b0ddba0-98a1-4870-acbb-832572b5d62b-etc-modprobe-d\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr"
Apr 22 19:23:45.223870 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.223693 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2b0ddba0-98a1-4870-acbb-832572b5d62b-etc-systemd\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr"
Apr 22 19:23:45.223870 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.223730 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-host-var-lib-cni-multus\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf"
Apr 22 19:23:45.223870 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.223765 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-host-slash\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7"
Apr 22 19:23:45.223870 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.223854 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpcz4\" (UniqueName: \"kubernetes.io/projected/06227928-4a6e-4e0e-b991-1f9a395b21c4-kube-api-access-bpcz4\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7"
Apr 22 19:23:45.224313 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.223897 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fb2a25eb-d7fc-4fe6-a965-be2f517c03ab-host\") pod \"node-ca-95pbb\" (UID: \"fb2a25eb-d7fc-4fe6-a965-be2f517c03ab\") " pod="openshift-image-registry/node-ca-95pbb" Apr 22
19:23:45.224313 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.223958 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/2b0ddba0-98a1-4870-acbb-832572b5d62b-etc-tuned\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr" Apr 22 19:23:45.224313 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.224005 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/20682e94-9131-4a8d-a325-b9f45d2fd64f-cnibin\") pod \"multus-additional-cni-plugins-z6hh4\" (UID: \"20682e94-9131-4a8d-a325-b9f45d2fd64f\") " pod="openshift-multus/multus-additional-cni-plugins-z6hh4" Apr 22 19:23:45.224313 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.224033 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2b0ddba0-98a1-4870-acbb-832572b5d62b-run\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr" Apr 22 19:23:45.224313 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.224074 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2b0ddba0-98a1-4870-acbb-832572b5d62b-sys\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr" Apr 22 19:23:45.224313 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.224110 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/68be7f81-7c86-4929-92a2-0347981c9140-cni-binary-copy\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " 
pod="openshift-multus/multus-gkjzf" Apr 22 19:23:45.224313 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.224145 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-host-run-netns\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf" Apr 22 19:23:45.224313 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.224172 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8wln\" (UniqueName: \"kubernetes.io/projected/9bbee64a-2154-4ea2-9299-c15d3614e769-kube-api-access-k8wln\") pod \"network-check-target-jzltp\" (UID: \"9bbee64a-2154-4ea2-9299-c15d3614e769\") " pod="openshift-network-diagnostics/network-check-target-jzltp" Apr 22 19:23:45.224313 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.224201 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2b0ddba0-98a1-4870-acbb-832572b5d62b-tmp\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr" Apr 22 19:23:45.224313 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.224234 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e586a32a-1d89-4ae4-a0b6-0667215a50e4-agent-certs\") pod \"konnectivity-agent-ns824\" (UID: \"e586a32a-1d89-4ae4-a0b6-0667215a50e4\") " pod="kube-system/konnectivity-agent-ns824" Apr 22 19:23:45.224766 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.224270 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/b9527658-a6b8-4270-ae5d-f451e61ca79f-iptables-alerter-script\") pod \"iptables-alerter-284wh\" (UID: \"b9527658-a6b8-4270-ae5d-f451e61ca79f\") " pod="openshift-network-operator/iptables-alerter-284wh" Apr 22 19:23:45.224766 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.224458 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-os-release\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf" Apr 22 19:23:45.224766 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.224491 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-multus-conf-dir\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf" Apr 22 19:23:45.224766 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.224520 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-host-run-multus-certs\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf" Apr 22 19:23:45.224766 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.224553 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-systemd-units\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" Apr 22 19:23:45.224766 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.224583 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-host-run-netns\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" Apr 22 19:23:45.224766 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.224615 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-run-openvswitch\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" Apr 22 19:23:45.224766 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.224647 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/06227928-4a6e-4e0e-b991-1f9a395b21c4-env-overrides\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" Apr 22 19:23:45.224766 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.224673 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjzhq\" (UniqueName: \"kubernetes.io/projected/fb2a25eb-d7fc-4fe6-a965-be2f517c03ab-kube-api-access-cjzhq\") pod \"node-ca-95pbb\" (UID: \"fb2a25eb-d7fc-4fe6-a965-be2f517c03ab\") " pod="openshift-image-registry/node-ca-95pbb" Apr 22 19:23:45.224766 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.224703 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-cnibin\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf" Apr 22 19:23:45.224766 ip-10-0-134-22 
kubenswrapper[2574]: I0422 19:23:45.224739 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcs8z\" (UniqueName: \"kubernetes.io/projected/68be7f81-7c86-4929-92a2-0347981c9140-kube-api-access-rcs8z\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf" Apr 22 19:23:45.225626 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.224790 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/20682e94-9131-4a8d-a325-b9f45d2fd64f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-z6hh4\" (UID: \"20682e94-9131-4a8d-a325-b9f45d2fd64f\") " pod="openshift-multus/multus-additional-cni-plugins-z6hh4" Apr 22 19:23:45.225626 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.224823 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4b878ced-b265-4741-a08e-35a9b9c87d5a-registration-dir\") pod \"aws-ebs-csi-driver-node-2rzjr\" (UID: \"4b878ced-b265-4741-a08e-35a9b9c87d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2rzjr" Apr 22 19:23:45.225626 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.224852 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/4b878ced-b265-4741-a08e-35a9b9c87d5a-device-dir\") pod \"aws-ebs-csi-driver-node-2rzjr\" (UID: \"4b878ced-b265-4741-a08e-35a9b9c87d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2rzjr" Apr 22 19:23:45.225626 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.224879 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/2b0ddba0-98a1-4870-acbb-832572b5d62b-etc-sysctl-d\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr" Apr 22 19:23:45.225626 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.224914 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5xqp\" (UniqueName: \"kubernetes.io/projected/20682e94-9131-4a8d-a325-b9f45d2fd64f-kube-api-access-q5xqp\") pod \"multus-additional-cni-plugins-z6hh4\" (UID: \"20682e94-9131-4a8d-a325-b9f45d2fd64f\") " pod="openshift-multus/multus-additional-cni-plugins-z6hh4" Apr 22 19:23:45.225626 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.224948 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-host-run-k8s-cni-cncf-io\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf" Apr 22 19:23:45.225626 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.224978 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-node-log\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" Apr 22 19:23:45.225626 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.225019 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" Apr 22 19:23:45.225626 
ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.225046 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fb2a25eb-d7fc-4fe6-a965-be2f517c03ab-serviceca\") pod \"node-ca-95pbb\" (UID: \"fb2a25eb-d7fc-4fe6-a965-be2f517c03ab\") " pod="openshift-image-registry/node-ca-95pbb" Apr 22 19:23:45.225626 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.225103 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4b878ced-b265-4741-a08e-35a9b9c87d5a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2rzjr\" (UID: \"4b878ced-b265-4741-a08e-35a9b9c87d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2rzjr" Apr 22 19:23:45.225626 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.225283 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-host-var-lib-cni-bin\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf" Apr 22 19:23:45.225626 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.225335 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/06227928-4a6e-4e0e-b991-1f9a395b21c4-ovnkube-script-lib\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" Apr 22 19:23:45.257143 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.256984 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:18:44 +0000 UTC" deadline="2028-01-05 11:26:06.025018998 +0000 UTC" Apr 22 19:23:45.257143 ip-10-0-134-22 
kubenswrapper[2574]: I0422 19:23:45.257013 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14944h2m20.768010065s" Apr 22 19:23:45.326254 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.326225 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/68be7f81-7c86-4929-92a2-0347981c9140-cni-binary-copy\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf" Apr 22 19:23:45.326254 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.326257 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-host-run-netns\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf" Apr 22 19:23:45.326470 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.326278 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8wln\" (UniqueName: \"kubernetes.io/projected/9bbee64a-2154-4ea2-9299-c15d3614e769-kube-api-access-k8wln\") pod \"network-check-target-jzltp\" (UID: \"9bbee64a-2154-4ea2-9299-c15d3614e769\") " pod="openshift-network-diagnostics/network-check-target-jzltp" Apr 22 19:23:45.326470 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.326374 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-host-run-netns\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf" Apr 22 19:23:45.326470 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.326402 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/2b0ddba0-98a1-4870-acbb-832572b5d62b-tmp\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr" Apr 22 19:23:45.326470 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.326454 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e586a32a-1d89-4ae4-a0b6-0667215a50e4-agent-certs\") pod \"konnectivity-agent-ns824\" (UID: \"e586a32a-1d89-4ae4-a0b6-0667215a50e4\") " pod="kube-system/konnectivity-agent-ns824" Apr 22 19:23:45.326654 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.326482 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b9527658-a6b8-4270-ae5d-f451e61ca79f-iptables-alerter-script\") pod \"iptables-alerter-284wh\" (UID: \"b9527658-a6b8-4270-ae5d-f451e61ca79f\") " pod="openshift-network-operator/iptables-alerter-284wh" Apr 22 19:23:45.326654 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.326507 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-os-release\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf" Apr 22 19:23:45.326654 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.326530 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-multus-conf-dir\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf" Apr 22 19:23:45.326654 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.326559 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-host-run-multus-certs\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf" Apr 22 19:23:45.326654 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.326640 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-os-release\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf" Apr 22 19:23:45.326912 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.326602 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-systemd-units\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" Apr 22 19:23:45.326912 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.326692 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-host-run-netns\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" Apr 22 19:23:45.326912 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.326725 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-run-openvswitch\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" Apr 22 19:23:45.326912 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.326751 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/06227928-4a6e-4e0e-b991-1f9a395b21c4-env-overrides\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" Apr 22 19:23:45.326912 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.326751 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-multus-conf-dir\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf" Apr 22 19:23:45.326912 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.326747 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 19:23:45.326912 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.326795 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cjzhq\" (UniqueName: \"kubernetes.io/projected/fb2a25eb-d7fc-4fe6-a965-be2f517c03ab-kube-api-access-cjzhq\") pod \"node-ca-95pbb\" (UID: \"fb2a25eb-d7fc-4fe6-a965-be2f517c03ab\") " pod="openshift-image-registry/node-ca-95pbb" Apr 22 19:23:45.326912 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.326822 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-cnibin\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf" Apr 22 19:23:45.326912 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.326826 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-host-run-multus-certs\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " 
pod="openshift-multus/multus-gkjzf" Apr 22 19:23:45.326912 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.326873 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-host-run-netns\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" Apr 22 19:23:45.326912 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.326868 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-systemd-units\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" Apr 22 19:23:45.326912 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.326882 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-run-openvswitch\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" Apr 22 19:23:45.326912 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.326896 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rcs8z\" (UniqueName: \"kubernetes.io/projected/68be7f81-7c86-4929-92a2-0347981c9140-kube-api-access-rcs8z\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf" Apr 22 19:23:45.326912 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.326894 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/68be7f81-7c86-4929-92a2-0347981c9140-cni-binary-copy\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf" Apr 22 
19:23:45.327545 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.326925 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-cnibin\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf" Apr 22 19:23:45.327545 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327070 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/20682e94-9131-4a8d-a325-b9f45d2fd64f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-z6hh4\" (UID: \"20682e94-9131-4a8d-a325-b9f45d2fd64f\") " pod="openshift-multus/multus-additional-cni-plugins-z6hh4" Apr 22 19:23:45.327545 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327112 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4b878ced-b265-4741-a08e-35a9b9c87d5a-registration-dir\") pod \"aws-ebs-csi-driver-node-2rzjr\" (UID: \"4b878ced-b265-4741-a08e-35a9b9c87d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2rzjr" Apr 22 19:23:45.327545 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327141 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/4b878ced-b265-4741-a08e-35a9b9c87d5a-device-dir\") pod \"aws-ebs-csi-driver-node-2rzjr\" (UID: \"4b878ced-b265-4741-a08e-35a9b9c87d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2rzjr" Apr 22 19:23:45.327545 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327166 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2b0ddba0-98a1-4870-acbb-832572b5d62b-etc-sysctl-d\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " 
pod="openshift-cluster-node-tuning-operator/tuned-89wlr" Apr 22 19:23:45.327545 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327194 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q5xqp\" (UniqueName: \"kubernetes.io/projected/20682e94-9131-4a8d-a325-b9f45d2fd64f-kube-api-access-q5xqp\") pod \"multus-additional-cni-plugins-z6hh4\" (UID: \"20682e94-9131-4a8d-a325-b9f45d2fd64f\") " pod="openshift-multus/multus-additional-cni-plugins-z6hh4" Apr 22 19:23:45.327545 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327221 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-host-run-k8s-cni-cncf-io\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf" Apr 22 19:23:45.327545 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327246 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-node-log\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" Apr 22 19:23:45.327545 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327253 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/20682e94-9131-4a8d-a325-b9f45d2fd64f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-z6hh4\" (UID: \"20682e94-9131-4a8d-a325-b9f45d2fd64f\") " pod="openshift-multus/multus-additional-cni-plugins-z6hh4" Apr 22 19:23:45.327545 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327272 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" Apr 22 19:23:45.327545 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327300 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fb2a25eb-d7fc-4fe6-a965-be2f517c03ab-serviceca\") pod \"node-ca-95pbb\" (UID: \"fb2a25eb-d7fc-4fe6-a965-be2f517c03ab\") " pod="openshift-image-registry/node-ca-95pbb" Apr 22 19:23:45.327545 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327328 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4b878ced-b265-4741-a08e-35a9b9c87d5a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2rzjr\" (UID: \"4b878ced-b265-4741-a08e-35a9b9c87d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2rzjr" Apr 22 19:23:45.327545 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327351 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/06227928-4a6e-4e0e-b991-1f9a395b21c4-env-overrides\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" Apr 22 19:23:45.327545 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327354 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-host-var-lib-cni-bin\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf" Apr 22 19:23:45.327545 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327397 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" 
(UniqueName: \"kubernetes.io/host-path/2b0ddba0-98a1-4870-acbb-832572b5d62b-etc-sysctl-d\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr" Apr 22 19:23:45.327545 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327405 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/06227928-4a6e-4e0e-b991-1f9a395b21c4-ovnkube-script-lib\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" Apr 22 19:23:45.327545 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327412 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b9527658-a6b8-4270-ae5d-f451e61ca79f-iptables-alerter-script\") pod \"iptables-alerter-284wh\" (UID: \"b9527658-a6b8-4270-ae5d-f451e61ca79f\") " pod="openshift-network-operator/iptables-alerter-284wh" Apr 22 19:23:45.328330 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327435 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2b0ddba0-98a1-4870-acbb-832572b5d62b-lib-modules\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr" Apr 22 19:23:45.328330 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327438 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-host-run-k8s-cni-cncf-io\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf" Apr 22 19:23:45.328330 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327461 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2b0ddba0-98a1-4870-acbb-832572b5d62b-var-lib-kubelet\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr" Apr 22 19:23:45.328330 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327488 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-node-log\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" Apr 22 19:23:45.328330 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327510 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-etc-openvswitch\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" Apr 22 19:23:45.328330 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327530 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" Apr 22 19:23:45.328330 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327548 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-hostroot\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf" Apr 22 19:23:45.328330 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327573 2574 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/20682e94-9131-4a8d-a325-b9f45d2fd64f-os-release\") pod \"multus-additional-cni-plugins-z6hh4\" (UID: \"20682e94-9131-4a8d-a325-b9f45d2fd64f\") " pod="openshift-multus/multus-additional-cni-plugins-z6hh4" Apr 22 19:23:45.328330 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327605 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-multus-cni-dir\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf" Apr 22 19:23:45.328330 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327629 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-var-lib-openvswitch\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" Apr 22 19:23:45.328330 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327655 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-log-socket\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" Apr 22 19:23:45.328330 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327646 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4b878ced-b265-4741-a08e-35a9b9c87d5a-registration-dir\") pod \"aws-ebs-csi-driver-node-2rzjr\" (UID: \"4b878ced-b265-4741-a08e-35a9b9c87d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2rzjr" Apr 22 19:23:45.328330 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327681 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-run-systemd\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" Apr 22 19:23:45.328330 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327705 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-host-cni-netd\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" Apr 22 19:23:45.328330 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327709 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/4b878ced-b265-4741-a08e-35a9b9c87d5a-device-dir\") pod \"aws-ebs-csi-driver-node-2rzjr\" (UID: \"4b878ced-b265-4741-a08e-35a9b9c87d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2rzjr" Apr 22 19:23:45.328330 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327731 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93109160-9bbe-497f-9b25-d7fa7e08508f-metrics-certs\") pod \"network-metrics-daemon-mdzdp\" (UID: \"93109160-9bbe-497f-9b25-d7fa7e08508f\") " pod="openshift-multus/network-metrics-daemon-mdzdp" Apr 22 19:23:45.328330 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327399 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-host-var-lib-cni-bin\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf" Apr 22 19:23:45.328866 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327793 
2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4b878ced-b265-4741-a08e-35a9b9c87d5a-socket-dir\") pod \"aws-ebs-csi-driver-node-2rzjr\" (UID: \"4b878ced-b265-4741-a08e-35a9b9c87d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2rzjr" Apr 22 19:23:45.328866 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327827 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4b878ced-b265-4741-a08e-35a9b9c87d5a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2rzjr\" (UID: \"4b878ced-b265-4741-a08e-35a9b9c87d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2rzjr" Apr 22 19:23:45.328866 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327835 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kq7bf\" (UniqueName: \"kubernetes.io/projected/b9527658-a6b8-4270-ae5d-f451e61ca79f-kube-api-access-kq7bf\") pod \"iptables-alerter-284wh\" (UID: \"b9527658-a6b8-4270-ae5d-f451e61ca79f\") " pod="openshift-network-operator/iptables-alerter-284wh" Apr 22 19:23:45.328866 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327865 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4blq2\" (UniqueName: \"kubernetes.io/projected/2b0ddba0-98a1-4870-acbb-832572b5d62b-kube-api-access-4blq2\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr" Apr 22 19:23:45.328866 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327869 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-multus-cni-dir\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf" Apr 22 19:23:45.328866 
ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327878 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-etc-openvswitch\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" Apr 22 19:23:45.328866 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327890 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/06227928-4a6e-4e0e-b991-1f9a395b21c4-ovnkube-config\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" Apr 22 19:23:45.328866 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327916 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-hostroot\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf" Apr 22 19:23:45.328866 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327916 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/20682e94-9131-4a8d-a325-b9f45d2fd64f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-z6hh4\" (UID: \"20682e94-9131-4a8d-a325-b9f45d2fd64f\") " pod="openshift-multus/multus-additional-cni-plugins-z6hh4" Apr 22 19:23:45.328866 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327953 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fb2a25eb-d7fc-4fe6-a965-be2f517c03ab-serviceca\") pod \"node-ca-95pbb\" (UID: \"fb2a25eb-d7fc-4fe6-a965-be2f517c03ab\") " pod="openshift-image-registry/node-ca-95pbb" Apr 22 19:23:45.328866 ip-10-0-134-22 
kubenswrapper[2574]: I0422 19:23:45.327958 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sckkq\" (UniqueName: \"kubernetes.io/projected/4b878ced-b265-4741-a08e-35a9b9c87d5a-kube-api-access-sckkq\") pod \"aws-ebs-csi-driver-node-2rzjr\" (UID: \"4b878ced-b265-4741-a08e-35a9b9c87d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2rzjr" Apr 22 19:23:45.328866 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327986 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2b0ddba0-98a1-4870-acbb-832572b5d62b-lib-modules\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr" Apr 22 19:23:45.328866 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.327995 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2b0ddba0-98a1-4870-acbb-832572b5d62b-etc-kubernetes\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr" Apr 22 19:23:45.328866 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.328031 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/4b878ced-b265-4741-a08e-35a9b9c87d5a-sys-fs\") pod \"aws-ebs-csi-driver-node-2rzjr\" (UID: \"4b878ced-b265-4741-a08e-35a9b9c87d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2rzjr" Apr 22 19:23:45.328866 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.328055 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2b0ddba0-98a1-4870-acbb-832572b5d62b-etc-sysctl-conf\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " 
pod="openshift-cluster-node-tuning-operator/tuned-89wlr" Apr 22 19:23:45.328866 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.328073 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-var-lib-openvswitch\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" Apr 22 19:23:45.328866 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.328076 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b0ddba0-98a1-4870-acbb-832572b5d62b-host\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr" Apr 22 19:23:45.329628 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.328108 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-system-cni-dir\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf" Apr 22 19:23:45.329628 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.328115 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b0ddba0-98a1-4870-acbb-832572b5d62b-host\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr" Apr 22 19:23:45.329628 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.328161 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-host-cni-bin\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" 
Apr 22 19:23:45.329628 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.328170 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/4b878ced-b265-4741-a08e-35a9b9c87d5a-sys-fs\") pod \"aws-ebs-csi-driver-node-2rzjr\" (UID: \"4b878ced-b265-4741-a08e-35a9b9c87d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2rzjr" Apr 22 19:23:45.329628 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.328033 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2b0ddba0-98a1-4870-acbb-832572b5d62b-var-lib-kubelet\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr" Apr 22 19:23:45.329628 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.328188 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/06227928-4a6e-4e0e-b991-1f9a395b21c4-ovn-node-metrics-cert\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" Apr 22 19:23:45.329628 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.328205 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/20682e94-9131-4a8d-a325-b9f45d2fd64f-os-release\") pod \"multus-additional-cni-plugins-z6hh4\" (UID: \"20682e94-9131-4a8d-a325-b9f45d2fd64f\") " pod="openshift-multus/multus-additional-cni-plugins-z6hh4" Apr 22 19:23:45.329628 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.328216 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/4b878ced-b265-4741-a08e-35a9b9c87d5a-etc-selinux\") pod \"aws-ebs-csi-driver-node-2rzjr\" (UID: \"4b878ced-b265-4741-a08e-35a9b9c87d5a\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2rzjr" Apr 22 19:23:45.329628 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:45.328249 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:45.329628 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.328249 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/20682e94-9131-4a8d-a325-b9f45d2fd64f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-z6hh4\" (UID: \"20682e94-9131-4a8d-a325-b9f45d2fd64f\") " pod="openshift-multus/multus-additional-cni-plugins-z6hh4" Apr 22 19:23:45.329628 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.328276 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4cfg7\" (UniqueName: \"kubernetes.io/projected/93109160-9bbe-497f-9b25-d7fa7e08508f-kube-api-access-4cfg7\") pod \"network-metrics-daemon-mdzdp\" (UID: \"93109160-9bbe-497f-9b25-d7fa7e08508f\") " pod="openshift-multus/network-metrics-daemon-mdzdp" Apr 22 19:23:45.329628 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:45.328324 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93109160-9bbe-497f-9b25-d7fa7e08508f-metrics-certs podName:93109160-9bbe-497f-9b25-d7fa7e08508f nodeName:}" failed. No retries permitted until 2026-04-22 19:23:45.828291581 +0000 UTC m=+3.111962032 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93109160-9bbe-497f-9b25-d7fa7e08508f-metrics-certs") pod "network-metrics-daemon-mdzdp" (UID: "93109160-9bbe-497f-9b25-d7fa7e08508f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:45.329628 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.328326 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4b878ced-b265-4741-a08e-35a9b9c87d5a-socket-dir\") pod \"aws-ebs-csi-driver-node-2rzjr\" (UID: \"4b878ced-b265-4741-a08e-35a9b9c87d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2rzjr" Apr 22 19:23:45.329628 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.328340 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/68be7f81-7c86-4929-92a2-0347981c9140-multus-daemon-config\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf" Apr 22 19:23:45.329628 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.328364 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-host-kubelet\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" Apr 22 19:23:45.329628 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.328385 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-run-ovn\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" Apr 22 19:23:45.329628 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.328405 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/20682e94-9131-4a8d-a325-b9f45d2fd64f-system-cni-dir\") pod \"multus-additional-cni-plugins-z6hh4\" (UID: \"20682e94-9131-4a8d-a325-b9f45d2fd64f\") " pod="openshift-multus/multus-additional-cni-plugins-z6hh4" Apr 22 19:23:45.330429 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.328425 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b9527658-a6b8-4270-ae5d-f451e61ca79f-host-slash\") pod \"iptables-alerter-284wh\" (UID: \"b9527658-a6b8-4270-ae5d-f451e61ca79f\") " pod="openshift-network-operator/iptables-alerter-284wh" Apr 22 19:23:45.330429 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.328447 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2b0ddba0-98a1-4870-acbb-832572b5d62b-etc-sysconfig\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr" Apr 22 19:23:45.330429 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.328446 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2b0ddba0-98a1-4870-acbb-832572b5d62b-etc-sysctl-conf\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr" Apr 22 19:23:45.330429 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.328485 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2b0ddba0-98a1-4870-acbb-832572b5d62b-etc-kubernetes\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr" Apr 22 19:23:45.330429 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.328506 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-host-cni-bin\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" Apr 22 19:23:45.330429 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.328518 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e586a32a-1d89-4ae4-a0b6-0667215a50e4-konnectivity-ca\") pod \"konnectivity-agent-ns824\" (UID: \"e586a32a-1d89-4ae4-a0b6-0667215a50e4\") " pod="kube-system/konnectivity-agent-ns824" Apr 22 19:23:45.330429 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.328947 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/20682e94-9131-4a8d-a325-b9f45d2fd64f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-z6hh4\" (UID: \"20682e94-9131-4a8d-a325-b9f45d2fd64f\") " pod="openshift-multus/multus-additional-cni-plugins-z6hh4" Apr 22 19:23:45.330429 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.329044 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e586a32a-1d89-4ae4-a0b6-0667215a50e4-konnectivity-ca\") pod \"konnectivity-agent-ns824\" (UID: \"e586a32a-1d89-4ae4-a0b6-0667215a50e4\") " pod="kube-system/konnectivity-agent-ns824" Apr 22 19:23:45.330429 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.329104 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/68be7f81-7c86-4929-92a2-0347981c9140-multus-daemon-config\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf" Apr 22 19:23:45.330429 ip-10-0-134-22 kubenswrapper[2574]: I0422 
19:23:45.329112 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-log-socket\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" Apr 22 19:23:45.330429 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.329163 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-run-systemd\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" Apr 22 19:23:45.330429 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.329172 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-host-kubelet\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" Apr 22 19:23:45.330429 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.329203 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-host-cni-netd\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" Apr 22 19:23:45.330429 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.329210 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b9527658-a6b8-4270-ae5d-f451e61ca79f-host-slash\") pod \"iptables-alerter-284wh\" (UID: \"b9527658-a6b8-4270-ae5d-f451e61ca79f\") " pod="openshift-network-operator/iptables-alerter-284wh" Apr 22 19:23:45.330429 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.329232 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2b0ddba0-98a1-4870-acbb-832572b5d62b-etc-sysconfig\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr" Apr 22 19:23:45.330429 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.329251 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-system-cni-dir\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf" Apr 22 19:23:45.330429 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.329286 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-run-ovn\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" Apr 22 19:23:45.330429 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.329331 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/20682e94-9131-4a8d-a325-b9f45d2fd64f-system-cni-dir\") pod \"multus-additional-cni-plugins-z6hh4\" (UID: \"20682e94-9131-4a8d-a325-b9f45d2fd64f\") " pod="openshift-multus/multus-additional-cni-plugins-z6hh4" Apr 22 19:23:45.331276 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.329527 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/4b878ced-b265-4741-a08e-35a9b9c87d5a-etc-selinux\") pod \"aws-ebs-csi-driver-node-2rzjr\" (UID: \"4b878ced-b265-4741-a08e-35a9b9c87d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2rzjr" Apr 22 19:23:45.331276 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.329568 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-multus-socket-dir-parent\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf"
Apr 22 19:23:45.331276 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.329597 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-host-var-lib-kubelet\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf"
Apr 22 19:23:45.331276 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.329624 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-etc-kubernetes\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf"
Apr 22 19:23:45.331276 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.329650 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-host-run-ovn-kubernetes\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7"
Apr 22 19:23:45.331276 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.329667 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/06227928-4a6e-4e0e-b991-1f9a395b21c4-ovnkube-config\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7"
Apr 22 19:23:45.331276 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.329675 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/20682e94-9131-4a8d-a325-b9f45d2fd64f-cni-binary-copy\") pod \"multus-additional-cni-plugins-z6hh4\" (UID: \"20682e94-9131-4a8d-a325-b9f45d2fd64f\") " pod="openshift-multus/multus-additional-cni-plugins-z6hh4"
Apr 22 19:23:45.331276 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.329699 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2b0ddba0-98a1-4870-acbb-832572b5d62b-etc-modprobe-d\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr"
Apr 22 19:23:45.331276 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.329725 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2b0ddba0-98a1-4870-acbb-832572b5d62b-etc-systemd\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr"
Apr 22 19:23:45.331276 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.329750 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/20682e94-9131-4a8d-a325-b9f45d2fd64f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-z6hh4\" (UID: \"20682e94-9131-4a8d-a325-b9f45d2fd64f\") " pod="openshift-multus/multus-additional-cni-plugins-z6hh4"
Apr 22 19:23:45.331276 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.329768 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-multus-socket-dir-parent\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf"
Apr 22 19:23:45.331276 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.329753 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-host-var-lib-cni-multus\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf"
Apr 22 19:23:45.331276 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.329802 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-host-var-lib-cni-multus\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf"
Apr 22 19:23:45.331276 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.329824 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-host-slash\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7"
Apr 22 19:23:45.331276 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.329831 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-host-var-lib-kubelet\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf"
Apr 22 19:23:45.331276 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.329847 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-host-run-ovn-kubernetes\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7"
Apr 22 19:23:45.331276 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.329852 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bpcz4\" (UniqueName: \"kubernetes.io/projected/06227928-4a6e-4e0e-b991-1f9a395b21c4-kube-api-access-bpcz4\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7"
Apr 22 19:23:45.332086 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.329874 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fb2a25eb-d7fc-4fe6-a965-be2f517c03ab-host\") pod \"node-ca-95pbb\" (UID: \"fb2a25eb-d7fc-4fe6-a965-be2f517c03ab\") " pod="openshift-image-registry/node-ca-95pbb"
Apr 22 19:23:45.332086 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.329890 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2b0ddba0-98a1-4870-acbb-832572b5d62b-etc-systemd\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr"
Apr 22 19:23:45.332086 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.329895 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/2b0ddba0-98a1-4870-acbb-832572b5d62b-etc-tuned\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr"
Apr 22 19:23:45.332086 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.329727 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/68be7f81-7c86-4929-92a2-0347981c9140-etc-kubernetes\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf"
Apr 22 19:23:45.332086 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.329915 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/20682e94-9131-4a8d-a325-b9f45d2fd64f-cnibin\") pod \"multus-additional-cni-plugins-z6hh4\" (UID: \"20682e94-9131-4a8d-a325-b9f45d2fd64f\") " pod="openshift-multus/multus-additional-cni-plugins-z6hh4"
Apr 22 19:23:45.332086 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.329919 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2b0ddba0-98a1-4870-acbb-832572b5d62b-etc-modprobe-d\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr"
Apr 22 19:23:45.332086 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.329935 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2b0ddba0-98a1-4870-acbb-832572b5d62b-run\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr"
Apr 22 19:23:45.332086 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.329954 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2b0ddba0-98a1-4870-acbb-832572b5d62b-sys\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr"
Apr 22 19:23:45.332086 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.329959 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fb2a25eb-d7fc-4fe6-a965-be2f517c03ab-host\") pod \"node-ca-95pbb\" (UID: \"fb2a25eb-d7fc-4fe6-a965-be2f517c03ab\") " pod="openshift-image-registry/node-ca-95pbb"
Apr 22 19:23:45.332086 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.330018 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/20682e94-9131-4a8d-a325-b9f45d2fd64f-cnibin\") pod \"multus-additional-cni-plugins-z6hh4\" (UID: \"20682e94-9131-4a8d-a325-b9f45d2fd64f\") " pod="openshift-multus/multus-additional-cni-plugins-z6hh4"
Apr 22 19:23:45.332086 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.330061 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2b0ddba0-98a1-4870-acbb-832572b5d62b-run\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr"
Apr 22 19:23:45.332086 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.330099 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2b0ddba0-98a1-4870-acbb-832572b5d62b-sys\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr"
Apr 22 19:23:45.332086 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.330134 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/06227928-4a6e-4e0e-b991-1f9a395b21c4-host-slash\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7"
Apr 22 19:23:45.332086 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.330329 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/20682e94-9131-4a8d-a325-b9f45d2fd64f-cni-binary-copy\") pod \"multus-additional-cni-plugins-z6hh4\" (UID: \"20682e94-9131-4a8d-a325-b9f45d2fd64f\") " pod="openshift-multus/multus-additional-cni-plugins-z6hh4"
Apr 22 19:23:45.332086 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.330477 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/06227928-4a6e-4e0e-b991-1f9a395b21c4-ovnkube-script-lib\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7"
Apr 22 19:23:45.332086 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.331156 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2b0ddba0-98a1-4870-acbb-832572b5d62b-tmp\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr"
Apr 22 19:23:45.332086 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.331302 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/06227928-4a6e-4e0e-b991-1f9a395b21c4-ovn-node-metrics-cert\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7"
Apr 22 19:23:45.332634 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.332259 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e586a32a-1d89-4ae4-a0b6-0667215a50e4-agent-certs\") pod \"konnectivity-agent-ns824\" (UID: \"e586a32a-1d89-4ae4-a0b6-0667215a50e4\") " pod="kube-system/konnectivity-agent-ns824"
Apr 22 19:23:45.332634 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.332413 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/2b0ddba0-98a1-4870-acbb-832572b5d62b-etc-tuned\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr"
Apr 22 19:23:45.333007 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:45.332983 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:23:45.333007 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:45.333010 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:23:45.333297 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:45.333023 2574 projected.go:194] Error preparing data for projected volume kube-api-access-k8wln for pod openshift-network-diagnostics/network-check-target-jzltp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:23:45.333297 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:45.333086 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9bbee64a-2154-4ea2-9299-c15d3614e769-kube-api-access-k8wln podName:9bbee64a-2154-4ea2-9299-c15d3614e769 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:45.833068746 +0000 UTC m=+3.116739207 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-k8wln" (UniqueName: "kubernetes.io/projected/9bbee64a-2154-4ea2-9299-c15d3614e769-kube-api-access-k8wln") pod "network-check-target-jzltp" (UID: "9bbee64a-2154-4ea2-9299-c15d3614e769") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:23:45.334764 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.334741 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjzhq\" (UniqueName: \"kubernetes.io/projected/fb2a25eb-d7fc-4fe6-a965-be2f517c03ab-kube-api-access-cjzhq\") pod \"node-ca-95pbb\" (UID: \"fb2a25eb-d7fc-4fe6-a965-be2f517c03ab\") " pod="openshift-image-registry/node-ca-95pbb"
Apr 22 19:23:45.335506 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.335478 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcs8z\" (UniqueName: \"kubernetes.io/projected/68be7f81-7c86-4929-92a2-0347981c9140-kube-api-access-rcs8z\") pod \"multus-gkjzf\" (UID: \"68be7f81-7c86-4929-92a2-0347981c9140\") " pod="openshift-multus/multus-gkjzf"
Apr 22 19:23:45.335794 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.335764 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5xqp\" (UniqueName: \"kubernetes.io/projected/20682e94-9131-4a8d-a325-b9f45d2fd64f-kube-api-access-q5xqp\") pod \"multus-additional-cni-plugins-z6hh4\" (UID: \"20682e94-9131-4a8d-a325-b9f45d2fd64f\") " pod="openshift-multus/multus-additional-cni-plugins-z6hh4"
Apr 22 19:23:45.337350 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.337328 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cfg7\" (UniqueName: \"kubernetes.io/projected/93109160-9bbe-497f-9b25-d7fa7e08508f-kube-api-access-4cfg7\") pod \"network-metrics-daemon-mdzdp\" (UID: \"93109160-9bbe-497f-9b25-d7fa7e08508f\") " pod="openshift-multus/network-metrics-daemon-mdzdp"
Apr 22 19:23:45.337449 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.337350 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sckkq\" (UniqueName: \"kubernetes.io/projected/4b878ced-b265-4741-a08e-35a9b9c87d5a-kube-api-access-sckkq\") pod \"aws-ebs-csi-driver-node-2rzjr\" (UID: \"4b878ced-b265-4741-a08e-35a9b9c87d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2rzjr"
Apr 22 19:23:45.337931 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.337900 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq7bf\" (UniqueName: \"kubernetes.io/projected/b9527658-a6b8-4270-ae5d-f451e61ca79f-kube-api-access-kq7bf\") pod \"iptables-alerter-284wh\" (UID: \"b9527658-a6b8-4270-ae5d-f451e61ca79f\") " pod="openshift-network-operator/iptables-alerter-284wh"
Apr 22 19:23:45.338658 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.338636 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpcz4\" (UniqueName: \"kubernetes.io/projected/06227928-4a6e-4e0e-b991-1f9a395b21c4-kube-api-access-bpcz4\") pod \"ovnkube-node-pqkn7\" (UID: \"06227928-4a6e-4e0e-b991-1f9a395b21c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7"
Apr 22 19:23:45.341538 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.341518 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4blq2\" (UniqueName: \"kubernetes.io/projected/2b0ddba0-98a1-4870-acbb-832572b5d62b-kube-api-access-4blq2\") pod \"tuned-89wlr\" (UID: \"2b0ddba0-98a1-4870-acbb-832572b5d62b\") " pod="openshift-cluster-node-tuning-operator/tuned-89wlr"
Apr 22 19:23:45.433044 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.432971 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:23:45.508482 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.508436 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-284wh"
Apr 22 19:23:45.520483 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.520458 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7"
Apr 22 19:23:45.528864 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.528840 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-89wlr"
Apr 22 19:23:45.535533 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.535507 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-95pbb"
Apr 22 19:23:45.541117 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.541098 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-z6hh4"
Apr 22 19:23:45.547675 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.547657 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-ns824"
Apr 22 19:23:45.555265 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.555243 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2rzjr"
Apr 22 19:23:45.560081 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.560064 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-gkjzf"
Apr 22 19:23:45.833482 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.833452 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93109160-9bbe-497f-9b25-d7fa7e08508f-metrics-certs\") pod \"network-metrics-daemon-mdzdp\" (UID: \"93109160-9bbe-497f-9b25-d7fa7e08508f\") " pod="openshift-multus/network-metrics-daemon-mdzdp"
Apr 22 19:23:45.833655 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:45.833512 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8wln\" (UniqueName: \"kubernetes.io/projected/9bbee64a-2154-4ea2-9299-c15d3614e769-kube-api-access-k8wln\") pod \"network-check-target-jzltp\" (UID: \"9bbee64a-2154-4ea2-9299-c15d3614e769\") " pod="openshift-network-diagnostics/network-check-target-jzltp"
Apr 22 19:23:45.833655 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:45.833594 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:23:45.833655 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:45.833654 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93109160-9bbe-497f-9b25-d7fa7e08508f-metrics-certs podName:93109160-9bbe-497f-9b25-d7fa7e08508f nodeName:}" failed. No retries permitted until 2026-04-22 19:23:46.833639241 +0000 UTC m=+4.117309692 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93109160-9bbe-497f-9b25-d7fa7e08508f-metrics-certs") pod "network-metrics-daemon-mdzdp" (UID: "93109160-9bbe-497f-9b25-d7fa7e08508f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:23:45.833837 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:45.833657 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:23:45.833837 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:45.833675 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:23:45.833837 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:45.833687 2574 projected.go:194] Error preparing data for projected volume kube-api-access-k8wln for pod openshift-network-diagnostics/network-check-target-jzltp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:23:45.833837 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:45.833744 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9bbee64a-2154-4ea2-9299-c15d3614e769-kube-api-access-k8wln podName:9bbee64a-2154-4ea2-9299-c15d3614e769 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:46.833726888 +0000 UTC m=+4.117397358 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-k8wln" (UniqueName: "kubernetes.io/projected/9bbee64a-2154-4ea2-9299-c15d3614e769-kube-api-access-k8wln") pod "network-check-target-jzltp" (UID: "9bbee64a-2154-4ea2-9299-c15d3614e769") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:23:45.872727 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:45.872699 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b878ced_b265_4741_a08e_35a9b9c87d5a.slice/crio-ca7378ab4a04e933e435a2aefe15f0793805105f7771370b5cd47c3116a422d9 WatchSource:0}: Error finding container ca7378ab4a04e933e435a2aefe15f0793805105f7771370b5cd47c3116a422d9: Status 404 returned error can't find the container with id ca7378ab4a04e933e435a2aefe15f0793805105f7771370b5cd47c3116a422d9
Apr 22 19:23:45.875350 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:45.875324 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06227928_4a6e_4e0e_b991_1f9a395b21c4.slice/crio-9ec5875c1ea72bc4e36d6fb385b4c4e47f469143e07490b0d38547933cb7b09c WatchSource:0}: Error finding container 9ec5875c1ea72bc4e36d6fb385b4c4e47f469143e07490b0d38547933cb7b09c: Status 404 returned error can't find the container with id 9ec5875c1ea72bc4e36d6fb385b4c4e47f469143e07490b0d38547933cb7b09c
Apr 22 19:23:45.877025 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:45.877003 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9527658_a6b8_4270_ae5d_f451e61ca79f.slice/crio-a2046552ddda71dab49f6f5a917e58b6e6a56c12c6850f7597c95bfb0b5ce018 WatchSource:0}: Error finding container a2046552ddda71dab49f6f5a917e58b6e6a56c12c6850f7597c95bfb0b5ce018: Status 404 returned error can't find the container with id a2046552ddda71dab49f6f5a917e58b6e6a56c12c6850f7597c95bfb0b5ce018
Apr 22 19:23:45.879253 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:45.879231 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode586a32a_1d89_4ae4_a0b6_0667215a50e4.slice/crio-af8200d95092a89a6597f5e28d6ba7407321581700f1b2eb84fa5da0c11197a5 WatchSource:0}: Error finding container af8200d95092a89a6597f5e28d6ba7407321581700f1b2eb84fa5da0c11197a5: Status 404 returned error can't find the container with id af8200d95092a89a6597f5e28d6ba7407321581700f1b2eb84fa5da0c11197a5
Apr 22 19:23:45.880602 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:45.880572 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb2a25eb_d7fc_4fe6_a965_be2f517c03ab.slice/crio-be6b984b76e75831eb7b2740d44068c18e35b58a81f78a0a653540ad30222244 WatchSource:0}: Error finding container be6b984b76e75831eb7b2740d44068c18e35b58a81f78a0a653540ad30222244: Status 404 returned error can't find the container with id be6b984b76e75831eb7b2740d44068c18e35b58a81f78a0a653540ad30222244
Apr 22 19:23:45.881263 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:23:45.881238 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b0ddba0_98a1_4870_acbb_832572b5d62b.slice/crio-4c86a1fc1c8d8c77f5e6cf10433203dcc1680fe7e49c0b067217be68bf945fa3 WatchSource:0}: Error finding container 4c86a1fc1c8d8c77f5e6cf10433203dcc1680fe7e49c0b067217be68bf945fa3: Status 404 returned error can't find the container with id 4c86a1fc1c8d8c77f5e6cf10433203dcc1680fe7e49c0b067217be68bf945fa3
Apr 22 19:23:46.258406 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:46.258110 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:18:44 +0000 UTC" deadline="2028-01-07 11:15:21.79824187 +0000 UTC"
Apr 22 19:23:46.258406 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:46.258376 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14991h51m35.539872583s"
Apr 22 19:23:46.346352 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:46.345876 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jzltp"
Apr 22 19:23:46.346352 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:46.345995 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jzltp" podUID="9bbee64a-2154-4ea2-9299-c15d3614e769"
Apr 22 19:23:46.359007 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:46.358950 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-ns824" event={"ID":"e586a32a-1d89-4ae4-a0b6-0667215a50e4","Type":"ContainerStarted","Data":"af8200d95092a89a6597f5e28d6ba7407321581700f1b2eb84fa5da0c11197a5"}
Apr 22 19:23:46.362843 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:46.362759 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-95pbb" event={"ID":"fb2a25eb-d7fc-4fe6-a965-be2f517c03ab","Type":"ContainerStarted","Data":"be6b984b76e75831eb7b2740d44068c18e35b58a81f78a0a653540ad30222244"}
Apr 22 19:23:46.367504 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:46.367443 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-284wh" event={"ID":"b9527658-a6b8-4270-ae5d-f451e61ca79f","Type":"ContainerStarted","Data":"a2046552ddda71dab49f6f5a917e58b6e6a56c12c6850f7597c95bfb0b5ce018"}
Apr 22 19:23:46.369705 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:46.369662 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2rzjr" event={"ID":"4b878ced-b265-4741-a08e-35a9b9c87d5a","Type":"ContainerStarted","Data":"ca7378ab4a04e933e435a2aefe15f0793805105f7771370b5cd47c3116a422d9"}
Apr 22 19:23:46.377475 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:46.377445 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-89wlr" event={"ID":"2b0ddba0-98a1-4870-acbb-832572b5d62b","Type":"ContainerStarted","Data":"4c86a1fc1c8d8c77f5e6cf10433203dcc1680fe7e49c0b067217be68bf945fa3"}
Apr 22 19:23:46.379089 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:46.379027 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" event={"ID":"06227928-4a6e-4e0e-b991-1f9a395b21c4","Type":"ContainerStarted","Data":"9ec5875c1ea72bc4e36d6fb385b4c4e47f469143e07490b0d38547933cb7b09c"}
Apr 22 19:23:46.382801 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:46.382083 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-22.ec2.internal" event={"ID":"1b90ee820fd4186f1e6cd40d24ef3276","Type":"ContainerStarted","Data":"feabd544acc293f7e5254c1be80946b0a0067a05331f445bb92ea62db62e0129"}
Apr 22 19:23:46.388998 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:46.388969 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gkjzf" event={"ID":"68be7f81-7c86-4929-92a2-0347981c9140","Type":"ContainerStarted","Data":"e1f418670902e8c35a2ac812a98ee839d61f9e5008c0d2168794e8c30f8569e6"}
Apr 22 19:23:46.395473 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:46.395434 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z6hh4" event={"ID":"20682e94-9131-4a8d-a325-b9f45d2fd64f","Type":"ContainerStarted","Data":"07a85e65619a64ab6ed090ce3bcef6b05b86749f9672e0f9f24798d8beca3882"}
Apr 22 19:23:46.399230 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:46.399166 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-22.ec2.internal" podStartSLOduration=2.399151838 podStartE2EDuration="2.399151838s" podCreationTimestamp="2026-04-22 19:23:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:23:46.399120625 +0000 UTC m=+3.682791099" watchObservedRunningTime="2026-04-22 19:23:46.399151838 +0000 UTC m=+3.682822312"
Apr 22 19:23:46.840129 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:46.840091 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8wln\" (UniqueName: \"kubernetes.io/projected/9bbee64a-2154-4ea2-9299-c15d3614e769-kube-api-access-k8wln\") pod \"network-check-target-jzltp\" (UID: \"9bbee64a-2154-4ea2-9299-c15d3614e769\") " pod="openshift-network-diagnostics/network-check-target-jzltp"
Apr 22 19:23:46.840316 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:46.840164 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93109160-9bbe-497f-9b25-d7fa7e08508f-metrics-certs\") pod \"network-metrics-daemon-mdzdp\" (UID: \"93109160-9bbe-497f-9b25-d7fa7e08508f\") " pod="openshift-multus/network-metrics-daemon-mdzdp"
Apr 22 19:23:46.840316 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:46.840290 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:23:46.840430 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:46.840353 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93109160-9bbe-497f-9b25-d7fa7e08508f-metrics-certs podName:93109160-9bbe-497f-9b25-d7fa7e08508f nodeName:}" failed. No retries permitted until 2026-04-22 19:23:48.840334851 +0000 UTC m=+6.124005304 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93109160-9bbe-497f-9b25-d7fa7e08508f-metrics-certs") pod "network-metrics-daemon-mdzdp" (UID: "93109160-9bbe-497f-9b25-d7fa7e08508f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:23:46.840818 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:46.840798 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:23:46.840915 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:46.840829 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:23:46.840915 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:46.840842 2574 projected.go:194] Error preparing data for projected volume kube-api-access-k8wln for pod openshift-network-diagnostics/network-check-target-jzltp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:23:46.840915 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:46.840883 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9bbee64a-2154-4ea2-9299-c15d3614e769-kube-api-access-k8wln podName:9bbee64a-2154-4ea2-9299-c15d3614e769 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:48.840869897 +0000 UTC m=+6.124540351 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-k8wln" (UniqueName: "kubernetes.io/projected/9bbee64a-2154-4ea2-9299-c15d3614e769-kube-api-access-k8wln") pod "network-check-target-jzltp" (UID: "9bbee64a-2154-4ea2-9299-c15d3614e769") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:23:47.345630 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:47.345597 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdzdp"
Apr 22 19:23:47.346270 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:47.345737 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdzdp" podUID="93109160-9bbe-497f-9b25-d7fa7e08508f"
Apr 22 19:23:47.404965 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:47.404802 2574 generic.go:358] "Generic (PLEG): container finished" podID="f546dccbfe88d958c8bad79dd015e11c" containerID="719c11a6f5f2c2f9d3d265b841635535f964b7d6885941528295885fc0bc4799" exitCode=0
Apr 22 19:23:47.404965 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:47.404897 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal" event={"ID":"f546dccbfe88d958c8bad79dd015e11c","Type":"ContainerDied","Data":"719c11a6f5f2c2f9d3d265b841635535f964b7d6885941528295885fc0bc4799"}
Apr 22 19:23:48.144048 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:48.143227 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-hcbdv"]
Apr 22 19:23:48.145231 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:48.145206 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hcbdv"
Apr 22 19:23:48.145376 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:48.145285 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hcbdv" podUID="029be2e7-8cf1-404e-bf0d-59ccb446ec17"
Apr 22 19:23:48.254319 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:48.254280 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/029be2e7-8cf1-404e-bf0d-59ccb446ec17-kubelet-config\") pod \"global-pull-secret-syncer-hcbdv\" (UID: \"029be2e7-8cf1-404e-bf0d-59ccb446ec17\") " pod="kube-system/global-pull-secret-syncer-hcbdv"
Apr 22 19:23:48.254499 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:48.254331 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/029be2e7-8cf1-404e-bf0d-59ccb446ec17-original-pull-secret\") pod \"global-pull-secret-syncer-hcbdv\" (UID: \"029be2e7-8cf1-404e-bf0d-59ccb446ec17\") " pod="kube-system/global-pull-secret-syncer-hcbdv"
Apr 22 19:23:48.254499 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:48.254387 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/029be2e7-8cf1-404e-bf0d-59ccb446ec17-dbus\") pod \"global-pull-secret-syncer-hcbdv\" (UID: \"029be2e7-8cf1-404e-bf0d-59ccb446ec17\") " pod="kube-system/global-pull-secret-syncer-hcbdv"
Apr 22 19:23:48.344849 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:48.344821 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jzltp"
Apr 22 19:23:48.345010 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:48.344940 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jzltp" podUID="9bbee64a-2154-4ea2-9299-c15d3614e769"
Apr 22 19:23:48.354860 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:48.354827 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/029be2e7-8cf1-404e-bf0d-59ccb446ec17-kubelet-config\") pod \"global-pull-secret-syncer-hcbdv\" (UID: \"029be2e7-8cf1-404e-bf0d-59ccb446ec17\") " pod="kube-system/global-pull-secret-syncer-hcbdv"
Apr 22 19:23:48.355272 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:48.354871 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/029be2e7-8cf1-404e-bf0d-59ccb446ec17-original-pull-secret\") pod \"global-pull-secret-syncer-hcbdv\" (UID: \"029be2e7-8cf1-404e-bf0d-59ccb446ec17\") " pod="kube-system/global-pull-secret-syncer-hcbdv"
Apr 22 19:23:48.355272 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:48.354926 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/029be2e7-8cf1-404e-bf0d-59ccb446ec17-dbus\") pod \"global-pull-secret-syncer-hcbdv\" (UID: \"029be2e7-8cf1-404e-bf0d-59ccb446ec17\") " pod="kube-system/global-pull-secret-syncer-hcbdv"
Apr 22 19:23:48.355272 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:48.354997 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume
\"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/029be2e7-8cf1-404e-bf0d-59ccb446ec17-kubelet-config\") pod \"global-pull-secret-syncer-hcbdv\" (UID: \"029be2e7-8cf1-404e-bf0d-59ccb446ec17\") " pod="kube-system/global-pull-secret-syncer-hcbdv" Apr 22 19:23:48.355272 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:48.355091 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:48.355272 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:48.355104 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/029be2e7-8cf1-404e-bf0d-59ccb446ec17-dbus\") pod \"global-pull-secret-syncer-hcbdv\" (UID: \"029be2e7-8cf1-404e-bf0d-59ccb446ec17\") " pod="kube-system/global-pull-secret-syncer-hcbdv" Apr 22 19:23:48.355272 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:48.355142 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/029be2e7-8cf1-404e-bf0d-59ccb446ec17-original-pull-secret podName:029be2e7-8cf1-404e-bf0d-59ccb446ec17 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:48.855123573 +0000 UTC m=+6.138794037 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/029be2e7-8cf1-404e-bf0d-59ccb446ec17-original-pull-secret") pod "global-pull-secret-syncer-hcbdv" (UID: "029be2e7-8cf1-404e-bf0d-59ccb446ec17") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:48.410746 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:48.410572 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal" event={"ID":"f546dccbfe88d958c8bad79dd015e11c","Type":"ContainerStarted","Data":"9344f18d083303e94d6b357f68ea13169e711f5673b1fc955fd1558690324d1a"} Apr 22 19:23:48.858520 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:48.858488 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93109160-9bbe-497f-9b25-d7fa7e08508f-metrics-certs\") pod \"network-metrics-daemon-mdzdp\" (UID: \"93109160-9bbe-497f-9b25-d7fa7e08508f\") " pod="openshift-multus/network-metrics-daemon-mdzdp" Apr 22 19:23:48.858722 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:48.858548 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/029be2e7-8cf1-404e-bf0d-59ccb446ec17-original-pull-secret\") pod \"global-pull-secret-syncer-hcbdv\" (UID: \"029be2e7-8cf1-404e-bf0d-59ccb446ec17\") " pod="kube-system/global-pull-secret-syncer-hcbdv" Apr 22 19:23:48.858722 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:48.858599 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8wln\" (UniqueName: \"kubernetes.io/projected/9bbee64a-2154-4ea2-9299-c15d3614e769-kube-api-access-k8wln\") pod \"network-check-target-jzltp\" (UID: \"9bbee64a-2154-4ea2-9299-c15d3614e769\") " pod="openshift-network-diagnostics/network-check-target-jzltp" Apr 22 19:23:48.858722 ip-10-0-134-22 
kubenswrapper[2574]: E0422 19:23:48.858648 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:48.858722 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:48.858717 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:48.858960 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:48.858734 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:48.858960 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:48.858746 2574 projected.go:194] Error preparing data for projected volume kube-api-access-k8wln for pod openshift-network-diagnostics/network-check-target-jzltp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:48.858960 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:48.858885 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:48.858960 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:48.858748 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93109160-9bbe-497f-9b25-d7fa7e08508f-metrics-certs podName:93109160-9bbe-497f-9b25-d7fa7e08508f nodeName:}" failed. No retries permitted until 2026-04-22 19:23:52.858694857 +0000 UTC m=+10.142365311 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93109160-9bbe-497f-9b25-d7fa7e08508f-metrics-certs") pod "network-metrics-daemon-mdzdp" (UID: "93109160-9bbe-497f-9b25-d7fa7e08508f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:48.858960 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:48.858941 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9bbee64a-2154-4ea2-9299-c15d3614e769-kube-api-access-k8wln podName:9bbee64a-2154-4ea2-9299-c15d3614e769 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:52.85892504 +0000 UTC m=+10.142595498 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-k8wln" (UniqueName: "kubernetes.io/projected/9bbee64a-2154-4ea2-9299-c15d3614e769-kube-api-access-k8wln") pod "network-check-target-jzltp" (UID: "9bbee64a-2154-4ea2-9299-c15d3614e769") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:48.858960 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:48.858956 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/029be2e7-8cf1-404e-bf0d-59ccb446ec17-original-pull-secret podName:029be2e7-8cf1-404e-bf0d-59ccb446ec17 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:49.858947125 +0000 UTC m=+7.142617577 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/029be2e7-8cf1-404e-bf0d-59ccb446ec17-original-pull-secret") pod "global-pull-secret-syncer-hcbdv" (UID: "029be2e7-8cf1-404e-bf0d-59ccb446ec17") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:49.347142 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:49.347095 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdzdp" Apr 22 19:23:49.347306 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:49.347206 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdzdp" podUID="93109160-9bbe-497f-9b25-d7fa7e08508f" Apr 22 19:23:49.347613 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:49.347536 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hcbdv" Apr 22 19:23:49.347728 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:49.347629 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-hcbdv" podUID="029be2e7-8cf1-404e-bf0d-59ccb446ec17" Apr 22 19:23:49.867901 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:49.867852 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/029be2e7-8cf1-404e-bf0d-59ccb446ec17-original-pull-secret\") pod \"global-pull-secret-syncer-hcbdv\" (UID: \"029be2e7-8cf1-404e-bf0d-59ccb446ec17\") " pod="kube-system/global-pull-secret-syncer-hcbdv" Apr 22 19:23:49.868343 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:49.868057 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:49.868343 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:49.868128 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/029be2e7-8cf1-404e-bf0d-59ccb446ec17-original-pull-secret podName:029be2e7-8cf1-404e-bf0d-59ccb446ec17 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:51.868110024 +0000 UTC m=+9.151780482 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/029be2e7-8cf1-404e-bf0d-59ccb446ec17-original-pull-secret") pod "global-pull-secret-syncer-hcbdv" (UID: "029be2e7-8cf1-404e-bf0d-59ccb446ec17") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:50.344924 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:50.344888 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jzltp" Apr 22 19:23:50.345101 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:50.345023 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jzltp" podUID="9bbee64a-2154-4ea2-9299-c15d3614e769" Apr 22 19:23:51.346211 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:51.345681 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdzdp" Apr 22 19:23:51.346211 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:51.345841 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdzdp" podUID="93109160-9bbe-497f-9b25-d7fa7e08508f" Apr 22 19:23:51.346852 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:51.346824 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hcbdv" Apr 22 19:23:51.346956 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:51.346936 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-hcbdv" podUID="029be2e7-8cf1-404e-bf0d-59ccb446ec17" Apr 22 19:23:51.886040 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:51.885995 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/029be2e7-8cf1-404e-bf0d-59ccb446ec17-original-pull-secret\") pod \"global-pull-secret-syncer-hcbdv\" (UID: \"029be2e7-8cf1-404e-bf0d-59ccb446ec17\") " pod="kube-system/global-pull-secret-syncer-hcbdv" Apr 22 19:23:51.886257 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:51.886223 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:51.886328 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:51.886290 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/029be2e7-8cf1-404e-bf0d-59ccb446ec17-original-pull-secret podName:029be2e7-8cf1-404e-bf0d-59ccb446ec17 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:55.886275191 +0000 UTC m=+13.169945645 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/029be2e7-8cf1-404e-bf0d-59ccb446ec17-original-pull-secret") pod "global-pull-secret-syncer-hcbdv" (UID: "029be2e7-8cf1-404e-bf0d-59ccb446ec17") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:52.345179 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:52.345135 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jzltp" Apr 22 19:23:52.345364 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:52.345268 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jzltp" podUID="9bbee64a-2154-4ea2-9299-c15d3614e769" Apr 22 19:23:52.895365 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:52.895324 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8wln\" (UniqueName: \"kubernetes.io/projected/9bbee64a-2154-4ea2-9299-c15d3614e769-kube-api-access-k8wln\") pod \"network-check-target-jzltp\" (UID: \"9bbee64a-2154-4ea2-9299-c15d3614e769\") " pod="openshift-network-diagnostics/network-check-target-jzltp" Apr 22 19:23:52.895848 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:52.895400 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93109160-9bbe-497f-9b25-d7fa7e08508f-metrics-certs\") pod \"network-metrics-daemon-mdzdp\" (UID: \"93109160-9bbe-497f-9b25-d7fa7e08508f\") " pod="openshift-multus/network-metrics-daemon-mdzdp" Apr 22 19:23:52.895848 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:52.895515 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:52.895848 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:52.895525 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:52.895848 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:52.895552 2574 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:52.895848 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:52.895567 2574 projected.go:194] Error preparing data for projected volume kube-api-access-k8wln for pod openshift-network-diagnostics/network-check-target-jzltp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:52.895848 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:52.895588 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93109160-9bbe-497f-9b25-d7fa7e08508f-metrics-certs podName:93109160-9bbe-497f-9b25-d7fa7e08508f nodeName:}" failed. No retries permitted until 2026-04-22 19:24:00.895567626 +0000 UTC m=+18.179238077 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93109160-9bbe-497f-9b25-d7fa7e08508f-metrics-certs") pod "network-metrics-daemon-mdzdp" (UID: "93109160-9bbe-497f-9b25-d7fa7e08508f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:52.895848 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:52.895619 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9bbee64a-2154-4ea2-9299-c15d3614e769-kube-api-access-k8wln podName:9bbee64a-2154-4ea2-9299-c15d3614e769 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:00.895602501 +0000 UTC m=+18.179272957 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-k8wln" (UniqueName: "kubernetes.io/projected/9bbee64a-2154-4ea2-9299-c15d3614e769-kube-api-access-k8wln") pod "network-check-target-jzltp" (UID: "9bbee64a-2154-4ea2-9299-c15d3614e769") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:53.346520 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:53.346482 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hcbdv" Apr 22 19:23:53.346708 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:53.346588 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hcbdv" podUID="029be2e7-8cf1-404e-bf0d-59ccb446ec17" Apr 22 19:23:53.347033 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:53.346864 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdzdp" Apr 22 19:23:53.347033 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:53.347009 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdzdp" podUID="93109160-9bbe-497f-9b25-d7fa7e08508f" Apr 22 19:23:54.345194 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:54.345121 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jzltp" Apr 22 19:23:54.345635 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:54.345249 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jzltp" podUID="9bbee64a-2154-4ea2-9299-c15d3614e769" Apr 22 19:23:55.345428 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:55.345363 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hcbdv" Apr 22 19:23:55.345428 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:55.345387 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdzdp" Apr 22 19:23:55.345925 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:55.345544 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdzdp" podUID="93109160-9bbe-497f-9b25-d7fa7e08508f" Apr 22 19:23:55.345925 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:55.345638 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-hcbdv" podUID="029be2e7-8cf1-404e-bf0d-59ccb446ec17" Apr 22 19:23:55.779652 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:55.779595 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal" podStartSLOduration=11.779578699 podStartE2EDuration="11.779578699s" podCreationTimestamp="2026-04-22 19:23:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:23:48.429693934 +0000 UTC m=+5.713364408" watchObservedRunningTime="2026-04-22 19:23:55.779578699 +0000 UTC m=+13.063249173" Apr 22 19:23:55.780114 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:55.780090 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-zsm2s"] Apr 22 19:23:55.786059 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:55.786033 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-zsm2s" Apr 22 19:23:55.788634 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:55.788615 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 19:23:55.788974 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:55.788953 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-g6d9q\"" Apr 22 19:23:55.789888 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:55.789870 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 19:23:55.919503 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:55.919470 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e794b825-b003-4cc7-9af6-8dd829fbea84-tmp-dir\") pod \"node-resolver-zsm2s\" (UID: \"e794b825-b003-4cc7-9af6-8dd829fbea84\") " pod="openshift-dns/node-resolver-zsm2s" Apr 22 19:23:55.919684 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:55.919520 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/029be2e7-8cf1-404e-bf0d-59ccb446ec17-original-pull-secret\") pod \"global-pull-secret-syncer-hcbdv\" (UID: \"029be2e7-8cf1-404e-bf0d-59ccb446ec17\") " pod="kube-system/global-pull-secret-syncer-hcbdv" Apr 22 19:23:55.919684 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:55.919599 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e794b825-b003-4cc7-9af6-8dd829fbea84-hosts-file\") pod \"node-resolver-zsm2s\" (UID: \"e794b825-b003-4cc7-9af6-8dd829fbea84\") " pod="openshift-dns/node-resolver-zsm2s" Apr 22 19:23:55.919684 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:55.919620 
2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:55.919684 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:55.919672 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2mvz\" (UniqueName: \"kubernetes.io/projected/e794b825-b003-4cc7-9af6-8dd829fbea84-kube-api-access-j2mvz\") pod \"node-resolver-zsm2s\" (UID: \"e794b825-b003-4cc7-9af6-8dd829fbea84\") " pod="openshift-dns/node-resolver-zsm2s" Apr 22 19:23:55.919882 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:55.919735 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/029be2e7-8cf1-404e-bf0d-59ccb446ec17-original-pull-secret podName:029be2e7-8cf1-404e-bf0d-59ccb446ec17 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:03.919719105 +0000 UTC m=+21.203389563 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/029be2e7-8cf1-404e-bf0d-59ccb446ec17-original-pull-secret") pod "global-pull-secret-syncer-hcbdv" (UID: "029be2e7-8cf1-404e-bf0d-59ccb446ec17") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:56.020091 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:56.020050 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e794b825-b003-4cc7-9af6-8dd829fbea84-tmp-dir\") pod \"node-resolver-zsm2s\" (UID: \"e794b825-b003-4cc7-9af6-8dd829fbea84\") " pod="openshift-dns/node-resolver-zsm2s" Apr 22 19:23:56.020251 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:56.020128 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e794b825-b003-4cc7-9af6-8dd829fbea84-hosts-file\") pod \"node-resolver-zsm2s\" (UID: \"e794b825-b003-4cc7-9af6-8dd829fbea84\") " 
pod="openshift-dns/node-resolver-zsm2s" Apr 22 19:23:56.020251 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:56.020177 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j2mvz\" (UniqueName: \"kubernetes.io/projected/e794b825-b003-4cc7-9af6-8dd829fbea84-kube-api-access-j2mvz\") pod \"node-resolver-zsm2s\" (UID: \"e794b825-b003-4cc7-9af6-8dd829fbea84\") " pod="openshift-dns/node-resolver-zsm2s" Apr 22 19:23:56.020365 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:56.020264 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e794b825-b003-4cc7-9af6-8dd829fbea84-hosts-file\") pod \"node-resolver-zsm2s\" (UID: \"e794b825-b003-4cc7-9af6-8dd829fbea84\") " pod="openshift-dns/node-resolver-zsm2s" Apr 22 19:23:56.020436 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:56.020420 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e794b825-b003-4cc7-9af6-8dd829fbea84-tmp-dir\") pod \"node-resolver-zsm2s\" (UID: \"e794b825-b003-4cc7-9af6-8dd829fbea84\") " pod="openshift-dns/node-resolver-zsm2s" Apr 22 19:23:56.029505 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:56.029483 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2mvz\" (UniqueName: \"kubernetes.io/projected/e794b825-b003-4cc7-9af6-8dd829fbea84-kube-api-access-j2mvz\") pod \"node-resolver-zsm2s\" (UID: \"e794b825-b003-4cc7-9af6-8dd829fbea84\") " pod="openshift-dns/node-resolver-zsm2s" Apr 22 19:23:56.095588 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:56.095515 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zsm2s" Apr 22 19:23:56.345825 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:56.345739 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jzltp" Apr 22 19:23:56.346254 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:56.345872 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jzltp" podUID="9bbee64a-2154-4ea2-9299-c15d3614e769" Apr 22 19:23:57.348029 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:57.348000 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdzdp" Apr 22 19:23:57.348029 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:57.348011 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hcbdv" Apr 22 19:23:57.348612 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:57.348128 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdzdp" podUID="93109160-9bbe-497f-9b25-d7fa7e08508f" Apr 22 19:23:57.348612 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:57.348257 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-hcbdv" podUID="029be2e7-8cf1-404e-bf0d-59ccb446ec17" Apr 22 19:23:58.345142 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:58.345103 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jzltp" Apr 22 19:23:58.345313 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:58.345239 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jzltp" podUID="9bbee64a-2154-4ea2-9299-c15d3614e769" Apr 22 19:23:59.345098 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:59.345063 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdzdp" Apr 22 19:23:59.345473 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:59.345202 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdzdp" podUID="93109160-9bbe-497f-9b25-d7fa7e08508f" Apr 22 19:23:59.345473 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:23:59.345271 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hcbdv" Apr 22 19:23:59.345473 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:23:59.345385 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hcbdv" podUID="029be2e7-8cf1-404e-bf0d-59ccb446ec17" Apr 22 19:24:00.345215 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:00.345184 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jzltp" Apr 22 19:24:00.345654 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:00.345297 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jzltp" podUID="9bbee64a-2154-4ea2-9299-c15d3614e769" Apr 22 19:24:00.957413 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:00.957373 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8wln\" (UniqueName: \"kubernetes.io/projected/9bbee64a-2154-4ea2-9299-c15d3614e769-kube-api-access-k8wln\") pod \"network-check-target-jzltp\" (UID: \"9bbee64a-2154-4ea2-9299-c15d3614e769\") " pod="openshift-network-diagnostics/network-check-target-jzltp" Apr 22 19:24:00.957591 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:00.957437 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93109160-9bbe-497f-9b25-d7fa7e08508f-metrics-certs\") pod \"network-metrics-daemon-mdzdp\" (UID: \"93109160-9bbe-497f-9b25-d7fa7e08508f\") " pod="openshift-multus/network-metrics-daemon-mdzdp" Apr 22 19:24:00.957591 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:00.957562 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:24:00.957724 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:00.957563 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:24:00.957724 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:00.957618 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93109160-9bbe-497f-9b25-d7fa7e08508f-metrics-certs podName:93109160-9bbe-497f-9b25-d7fa7e08508f nodeName:}" failed. No retries permitted until 2026-04-22 19:24:16.957604977 +0000 UTC m=+34.241275427 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93109160-9bbe-497f-9b25-d7fa7e08508f-metrics-certs") pod "network-metrics-daemon-mdzdp" (UID: "93109160-9bbe-497f-9b25-d7fa7e08508f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:24:00.957724 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:00.957644 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:24:00.957724 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:00.957662 2574 projected.go:194] Error preparing data for projected volume kube-api-access-k8wln for pod openshift-network-diagnostics/network-check-target-jzltp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:24:00.957724 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:00.957721 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9bbee64a-2154-4ea2-9299-c15d3614e769-kube-api-access-k8wln podName:9bbee64a-2154-4ea2-9299-c15d3614e769 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:16.957703337 +0000 UTC m=+34.241373805 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-k8wln" (UniqueName: "kubernetes.io/projected/9bbee64a-2154-4ea2-9299-c15d3614e769-kube-api-access-k8wln") pod "network-check-target-jzltp" (UID: "9bbee64a-2154-4ea2-9299-c15d3614e769") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:24:01.345074 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:01.345041 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdzdp" Apr 22 19:24:01.345243 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:01.345050 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hcbdv" Apr 22 19:24:01.345243 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:01.345173 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdzdp" podUID="93109160-9bbe-497f-9b25-d7fa7e08508f" Apr 22 19:24:01.345243 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:01.345220 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hcbdv" podUID="029be2e7-8cf1-404e-bf0d-59ccb446ec17" Apr 22 19:24:02.261516 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:24:02.261316 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode794b825_b003_4cc7_9af6_8dd829fbea84.slice/crio-86ee6d86ac19ee236b8a34b3f76c4f4bf65a6a9f5704373171d6cb721b161ac0 WatchSource:0}: Error finding container 86ee6d86ac19ee236b8a34b3f76c4f4bf65a6a9f5704373171d6cb721b161ac0: Status 404 returned error can't find the container with id 86ee6d86ac19ee236b8a34b3f76c4f4bf65a6a9f5704373171d6cb721b161ac0 Apr 22 19:24:02.345825 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:02.345800 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jzltp" Apr 22 19:24:02.346211 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:02.345934 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jzltp" podUID="9bbee64a-2154-4ea2-9299-c15d3614e769" Apr 22 19:24:02.434654 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:02.434626 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zsm2s" event={"ID":"e794b825-b003-4cc7-9af6-8dd829fbea84","Type":"ContainerStarted","Data":"86ee6d86ac19ee236b8a34b3f76c4f4bf65a6a9f5704373171d6cb721b161ac0"} Apr 22 19:24:03.346048 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:03.345628 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdzdp" Apr 22 19:24:03.346900 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:03.345685 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hcbdv" Apr 22 19:24:03.346900 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:03.346159 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mdzdp" podUID="93109160-9bbe-497f-9b25-d7fa7e08508f" Apr 22 19:24:03.346900 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:03.346200 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hcbdv" podUID="029be2e7-8cf1-404e-bf0d-59ccb446ec17" Apr 22 19:24:03.410027 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:03.410007 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 19:24:03.437569 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:03.437544 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gkjzf" event={"ID":"68be7f81-7c86-4929-92a2-0347981c9140","Type":"ContainerStarted","Data":"60249fc9394daea9a85537c661e8db061e2493ddc318313612bff993dbed7571"} Apr 22 19:24:03.438885 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:03.438862 2574 generic.go:358] "Generic (PLEG): container finished" podID="20682e94-9131-4a8d-a325-b9f45d2fd64f" containerID="99dfa271658ffb78f724df5e378f75f0e09927c55cbd7850e034e7027ad51c5a" exitCode=0 Apr 22 19:24:03.438981 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:03.438924 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z6hh4" event={"ID":"20682e94-9131-4a8d-a325-b9f45d2fd64f","Type":"ContainerDied","Data":"99dfa271658ffb78f724df5e378f75f0e09927c55cbd7850e034e7027ad51c5a"} Apr 22 19:24:03.440175 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:03.440153 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-ns824" 
event={"ID":"e586a32a-1d89-4ae4-a0b6-0667215a50e4","Type":"ContainerStarted","Data":"de94eaea71d417210a86c6380de475ee9208edfdf80f50691c9b08ff709a0281"} Apr 22 19:24:03.441454 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:03.441351 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-95pbb" event={"ID":"fb2a25eb-d7fc-4fe6-a965-be2f517c03ab","Type":"ContainerStarted","Data":"e0756a46ffb7f66b80a7267d59b32ee11ec1ec3bd446fb95423286754fdb0d24"} Apr 22 19:24:03.442878 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:03.442853 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2rzjr" event={"ID":"4b878ced-b265-4741-a08e-35a9b9c87d5a","Type":"ContainerStarted","Data":"56d3b6ec7cc424ce42aa427e1420342541cf43a0957fc9dc208d470fd3874dcd"} Apr 22 19:24:03.442953 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:03.442880 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2rzjr" event={"ID":"4b878ced-b265-4741-a08e-35a9b9c87d5a","Type":"ContainerStarted","Data":"c235f4dd73852add97e12c131d5f70efb2d427eab7efda1678c3e7611cc72bd9"} Apr 22 19:24:03.444013 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:03.443989 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-89wlr" event={"ID":"2b0ddba0-98a1-4870-acbb-832572b5d62b","Type":"ContainerStarted","Data":"d9c300794a0fef2b53c2f441849bb6399e84b061189b634165778e743e4ce154"} Apr 22 19:24:03.446414 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:03.446395 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" event={"ID":"06227928-4a6e-4e0e-b991-1f9a395b21c4","Type":"ContainerStarted","Data":"df40e72b397f220ad7adccb7b5a6d5d4ffc0be466ed3c15a0aa4ff3bb944ee9f"} Apr 22 19:24:03.446496 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:03.446421 2574 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" event={"ID":"06227928-4a6e-4e0e-b991-1f9a395b21c4","Type":"ContainerStarted","Data":"715651921c1329aab605071b07db8973633823070dda496e50de08a333fe5673"} Apr 22 19:24:03.446496 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:03.446434 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" event={"ID":"06227928-4a6e-4e0e-b991-1f9a395b21c4","Type":"ContainerStarted","Data":"e0733fdbd726b563f3c75b78c47ccec4f9ecb1c79f1ba9b45f0a1d3c66e516de"} Apr 22 19:24:03.446496 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:03.446447 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" event={"ID":"06227928-4a6e-4e0e-b991-1f9a395b21c4","Type":"ContainerStarted","Data":"356acf449002beccb68d80cdef5a2fb3f142559418bb9198f2ce3031ee822bbf"} Apr 22 19:24:03.446496 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:03.446459 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" event={"ID":"06227928-4a6e-4e0e-b991-1f9a395b21c4","Type":"ContainerStarted","Data":"a2c5ec71029963526b34ff8b09c46fb67e0952bdbb4603809a02f487a8532389"} Apr 22 19:24:03.446496 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:03.446470 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" event={"ID":"06227928-4a6e-4e0e-b991-1f9a395b21c4","Type":"ContainerStarted","Data":"0be067c4da1ee9bcacbb2d6cafdc5615ed953f4b06ecf1675ef93be0ba661431"} Apr 22 19:24:03.447479 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:03.447463 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zsm2s" event={"ID":"e794b825-b003-4cc7-9af6-8dd829fbea84","Type":"ContainerStarted","Data":"f8317a089775ed1ce4ddc9c5047958f5c56fdd9870c5f153267ebeb9f44025e2"} Apr 22 19:24:03.452316 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:03.452283 2574 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-gkjzf" podStartSLOduration=4.041949206 podStartE2EDuration="20.452273532s" podCreationTimestamp="2026-04-22 19:23:43 +0000 UTC" firstStartedPulling="2026-04-22 19:23:45.886987977 +0000 UTC m=+3.170658429" lastFinishedPulling="2026-04-22 19:24:02.297312288 +0000 UTC m=+19.580982755" observedRunningTime="2026-04-22 19:24:03.452189674 +0000 UTC m=+20.735860153" watchObservedRunningTime="2026-04-22 19:24:03.452273532 +0000 UTC m=+20.735944004" Apr 22 19:24:03.465653 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:03.465617 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-89wlr" podStartSLOduration=4.086232352 podStartE2EDuration="20.4656076s" podCreationTimestamp="2026-04-22 19:23:43 +0000 UTC" firstStartedPulling="2026-04-22 19:23:45.883754932 +0000 UTC m=+3.167425384" lastFinishedPulling="2026-04-22 19:24:02.263130168 +0000 UTC m=+19.546800632" observedRunningTime="2026-04-22 19:24:03.465366143 +0000 UTC m=+20.749036615" watchObservedRunningTime="2026-04-22 19:24:03.4656076 +0000 UTC m=+20.749278073" Apr 22 19:24:03.496198 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:03.496162 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-zsm2s" podStartSLOduration=8.496151357 podStartE2EDuration="8.496151357s" podCreationTimestamp="2026-04-22 19:23:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:24:03.495715963 +0000 UTC m=+20.779386433" watchObservedRunningTime="2026-04-22 19:24:03.496151357 +0000 UTC m=+20.779821830" Apr 22 19:24:03.517402 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:03.517317 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-95pbb" podStartSLOduration=4.157457411 
podStartE2EDuration="20.517306791s" podCreationTimestamp="2026-04-22 19:23:43 +0000 UTC" firstStartedPulling="2026-04-22 19:23:45.882816814 +0000 UTC m=+3.166487269" lastFinishedPulling="2026-04-22 19:24:02.242666193 +0000 UTC m=+19.526336649" observedRunningTime="2026-04-22 19:24:03.516915872 +0000 UTC m=+20.800586345" watchObservedRunningTime="2026-04-22 19:24:03.517306791 +0000 UTC m=+20.800977263" Apr 22 19:24:03.531506 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:03.531463 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-ns824" podStartSLOduration=4.154930253 podStartE2EDuration="20.531451739s" podCreationTimestamp="2026-04-22 19:23:43 +0000 UTC" firstStartedPulling="2026-04-22 19:23:45.881758252 +0000 UTC m=+3.165428707" lastFinishedPulling="2026-04-22 19:24:02.258279737 +0000 UTC m=+19.541950193" observedRunningTime="2026-04-22 19:24:03.53104535 +0000 UTC m=+20.814715824" watchObservedRunningTime="2026-04-22 19:24:03.531451739 +0000 UTC m=+20.815122212" Apr 22 19:24:03.978990 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:03.978904 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/029be2e7-8cf1-404e-bf0d-59ccb446ec17-original-pull-secret\") pod \"global-pull-secret-syncer-hcbdv\" (UID: \"029be2e7-8cf1-404e-bf0d-59ccb446ec17\") " pod="kube-system/global-pull-secret-syncer-hcbdv" Apr 22 19:24:03.979158 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:03.979088 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:24:03.979216 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:03.979173 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/029be2e7-8cf1-404e-bf0d-59ccb446ec17-original-pull-secret podName:029be2e7-8cf1-404e-bf0d-59ccb446ec17 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:24:19.97915252 +0000 UTC m=+37.262822985 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/029be2e7-8cf1-404e-bf0d-59ccb446ec17-original-pull-secret") pod "global-pull-secret-syncer-hcbdv" (UID: "029be2e7-8cf1-404e-bf0d-59ccb446ec17") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:24:04.220409 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:04.220379 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-ns824" Apr 22 19:24:04.220949 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:04.220932 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-ns824" Apr 22 19:24:04.282756 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:04.282650 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T19:24:03.410023122Z","UUID":"51741bfc-65db-4d32-9757-03388ce223ad","Handler":null,"Name":"","Endpoint":""} Apr 22 19:24:04.284626 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:04.284606 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 19:24:04.284805 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:04.284635 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 19:24:04.345647 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:04.345622 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jzltp" Apr 22 19:24:04.345770 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:04.345735 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jzltp" podUID="9bbee64a-2154-4ea2-9299-c15d3614e769" Apr 22 19:24:04.451196 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:04.450945 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-284wh" event={"ID":"b9527658-a6b8-4270-ae5d-f451e61ca79f","Type":"ContainerStarted","Data":"a191ff71bb14dc0b7982d1daea5f2df425ea7e0a1f42abf81dff55166a101e26"} Apr 22 19:24:04.452857 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:04.452827 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2rzjr" event={"ID":"4b878ced-b265-4741-a08e-35a9b9c87d5a","Type":"ContainerStarted","Data":"028ec8cacd60cb09070fc7649b26586519f31fa804bacb05bc5703951ff3c364"} Apr 22 19:24:04.453402 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:04.453386 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-ns824" Apr 22 19:24:04.453860 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:04.453845 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-ns824" Apr 22 19:24:04.499631 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:04.499582 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2rzjr" podStartSLOduration=3.119346034 podStartE2EDuration="21.499564137s" podCreationTimestamp="2026-04-22 
19:23:43 +0000 UTC" firstStartedPulling="2026-04-22 19:23:45.87425587 +0000 UTC m=+3.157926321" lastFinishedPulling="2026-04-22 19:24:04.254473968 +0000 UTC m=+21.538144424" observedRunningTime="2026-04-22 19:24:04.499439871 +0000 UTC m=+21.783110344" watchObservedRunningTime="2026-04-22 19:24:04.499564137 +0000 UTC m=+21.783234612" Apr 22 19:24:04.499915 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:04.499875 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-284wh" podStartSLOduration=5.122905419 podStartE2EDuration="21.499865799s" podCreationTimestamp="2026-04-22 19:23:43 +0000 UTC" firstStartedPulling="2026-04-22 19:23:45.879797602 +0000 UTC m=+3.163468067" lastFinishedPulling="2026-04-22 19:24:02.256757992 +0000 UTC m=+19.540428447" observedRunningTime="2026-04-22 19:24:04.472861284 +0000 UTC m=+21.756531758" watchObservedRunningTime="2026-04-22 19:24:04.499865799 +0000 UTC m=+21.783536303" Apr 22 19:24:05.345542 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:05.345458 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdzdp" Apr 22 19:24:05.345542 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:05.345470 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hcbdv" Apr 22 19:24:05.345808 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:05.345600 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mdzdp" podUID="93109160-9bbe-497f-9b25-d7fa7e08508f" Apr 22 19:24:05.345808 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:05.345699 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hcbdv" podUID="029be2e7-8cf1-404e-bf0d-59ccb446ec17" Apr 22 19:24:05.457717 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:05.457683 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" event={"ID":"06227928-4a6e-4e0e-b991-1f9a395b21c4","Type":"ContainerStarted","Data":"50a8a741cf94ac1b801a548eeb34aafab86727b27a5ee797781ea898c5dfdbe5"} Apr 22 19:24:06.344921 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:06.344889 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jzltp" Apr 22 19:24:06.345149 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:06.345009 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jzltp" podUID="9bbee64a-2154-4ea2-9299-c15d3614e769" Apr 22 19:24:07.345179 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:07.345138 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hcbdv" Apr 22 19:24:07.345179 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:07.345178 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdzdp" Apr 22 19:24:07.345842 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:07.345280 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hcbdv" podUID="029be2e7-8cf1-404e-bf0d-59ccb446ec17" Apr 22 19:24:07.345842 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:07.345450 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdzdp" podUID="93109160-9bbe-497f-9b25-d7fa7e08508f" Apr 22 19:24:08.345878 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:08.345668 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jzltp" Apr 22 19:24:08.346544 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:08.345960 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jzltp" podUID="9bbee64a-2154-4ea2-9299-c15d3614e769"
Apr 22 19:24:08.465831 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:08.465794 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" event={"ID":"06227928-4a6e-4e0e-b991-1f9a395b21c4","Type":"ContainerStarted","Data":"9740b1887a43649868d61fccefce625da23d652ef25af2e558f156d962a438cd"}
Apr 22 19:24:08.466211 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:08.466168 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7"
Apr 22 19:24:08.467508 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:08.467486 2574 generic.go:358] "Generic (PLEG): container finished" podID="20682e94-9131-4a8d-a325-b9f45d2fd64f" containerID="dc0701b320b25f81ab4350b43a79d779806e4a77591b18b1379ed4c78081d73b" exitCode=0
Apr 22 19:24:08.467616 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:08.467519 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z6hh4" event={"ID":"20682e94-9131-4a8d-a325-b9f45d2fd64f","Type":"ContainerDied","Data":"dc0701b320b25f81ab4350b43a79d779806e4a77591b18b1379ed4c78081d73b"}
Apr 22 19:24:08.480968 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:08.480940 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7"
Apr 22 19:24:08.497582 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:08.497545 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" podStartSLOduration=8.756477137 podStartE2EDuration="25.497535997s" podCreationTimestamp="2026-04-22 19:23:43 +0000 UTC" firstStartedPulling="2026-04-22 19:23:45.878091491 +0000 UTC m=+3.161761945" lastFinishedPulling="2026-04-22 19:24:02.619150341 +0000 UTC m=+19.902820805" observedRunningTime="2026-04-22 19:24:08.497240058 +0000 UTC m=+25.780910532" watchObservedRunningTime="2026-04-22 19:24:08.497535997 +0000 UTC m=+25.781206466"
Apr 22 19:24:09.345637 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:09.345608 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdzdp"
Apr 22 19:24:09.345902 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:09.345606 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hcbdv"
Apr 22 19:24:09.345902 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:09.345714 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdzdp" podUID="93109160-9bbe-497f-9b25-d7fa7e08508f"
Apr 22 19:24:09.345902 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:09.345825 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hcbdv" podUID="029be2e7-8cf1-404e-bf0d-59ccb446ec17"
Apr 22 19:24:09.469844 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:09.469815 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7"
Apr 22 19:24:09.470037 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:09.469853 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7"
Apr 22 19:24:09.484102 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:09.484076 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7"
Apr 22 19:24:09.646007 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:09.645967 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hcbdv"]
Apr 22 19:24:09.646155 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:09.646122 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hcbdv"
Apr 22 19:24:09.646289 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:09.646261 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hcbdv" podUID="029be2e7-8cf1-404e-bf0d-59ccb446ec17"
Apr 22 19:24:09.649907 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:09.649866 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jzltp"]
Apr 22 19:24:09.650029 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:09.649996 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jzltp"
Apr 22 19:24:09.650125 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:09.650101 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jzltp" podUID="9bbee64a-2154-4ea2-9299-c15d3614e769"
Apr 22 19:24:09.652726 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:09.652580 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mdzdp"]
Apr 22 19:24:09.652726 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:09.652670 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdzdp"
Apr 22 19:24:09.652901 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:09.652772 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdzdp" podUID="93109160-9bbe-497f-9b25-d7fa7e08508f"
Apr 22 19:24:10.472954 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:10.472917 2574 generic.go:358] "Generic (PLEG): container finished" podID="20682e94-9131-4a8d-a325-b9f45d2fd64f" containerID="b806375392a7589915d00a96a6956287764410d4ea64add3f395def2c09f7213" exitCode=0
Apr 22 19:24:10.473380 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:10.473000 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z6hh4" event={"ID":"20682e94-9131-4a8d-a325-b9f45d2fd64f","Type":"ContainerDied","Data":"b806375392a7589915d00a96a6956287764410d4ea64add3f395def2c09f7213"}
Apr 22 19:24:11.345299 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:11.345272 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdzdp"
Apr 22 19:24:11.345475 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:11.345272 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jzltp"
Apr 22 19:24:11.345475 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:11.345376 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdzdp" podUID="93109160-9bbe-497f-9b25-d7fa7e08508f"
Apr 22 19:24:11.345475 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:11.345394 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hcbdv"
Apr 22 19:24:11.345642 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:11.345476 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jzltp" podUID="9bbee64a-2154-4ea2-9299-c15d3614e769"
Apr 22 19:24:11.345642 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:11.345544 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hcbdv" podUID="029be2e7-8cf1-404e-bf0d-59ccb446ec17"
Apr 22 19:24:12.477930 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:12.477717 2574 generic.go:358] "Generic (PLEG): container finished" podID="20682e94-9131-4a8d-a325-b9f45d2fd64f" containerID="f12ca6a5855f820701dc21fad1706e776898314978f6df643e3c8c564da38d33" exitCode=0
Apr 22 19:24:12.478406 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:12.477808 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z6hh4" event={"ID":"20682e94-9131-4a8d-a325-b9f45d2fd64f","Type":"ContainerDied","Data":"f12ca6a5855f820701dc21fad1706e776898314978f6df643e3c8c564da38d33"}
Apr 22 19:24:13.347698 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:13.347612 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jzltp"
Apr 22 19:24:13.347866 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:13.347741 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jzltp" podUID="9bbee64a-2154-4ea2-9299-c15d3614e769"
Apr 22 19:24:13.348171 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:13.348149 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hcbdv"
Apr 22 19:24:13.348297 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:13.348274 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hcbdv" podUID="029be2e7-8cf1-404e-bf0d-59ccb446ec17"
Apr 22 19:24:13.348364 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:13.348348 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdzdp"
Apr 22 19:24:13.348477 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:13.348460 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdzdp" podUID="93109160-9bbe-497f-9b25-d7fa7e08508f"
Apr 22 19:24:14.988025 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:14.987999 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeReady"
Apr 22 19:24:14.988663 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:14.988135 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 22 19:24:15.033711 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.033679 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-658bf49c4d-qxl4w"]
Apr 22 19:24:15.056834 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.056801 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-658bf49c4d-qxl4w"
Apr 22 19:24:15.061116 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.060945 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 22 19:24:15.061583 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.061329 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 22 19:24:15.061583 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.061381 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-lccrt\""
Apr 22 19:24:15.061583 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.061441 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 22 19:24:15.062115 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.062093 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-658bf49c4d-qxl4w"]
Apr 22 19:24:15.067979 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.067910 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-qsnbz"]
Apr 22 19:24:15.080092 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.079913 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 22 19:24:15.088016 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.087496 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-lnt5s"]
Apr 22 19:24:15.088016 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.087668 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qsnbz"
Apr 22 19:24:15.092721 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.090863 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 22 19:24:15.092721 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.091125 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 22 19:24:15.092721 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.091306 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 22 19:24:15.092721 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.091492 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-76gtl\""
Apr 22 19:24:15.108619 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.108559 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-nms75"]
Apr 22 19:24:15.108928 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.108897 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lnt5s"
Apr 22 19:24:15.112081 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.112043 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 22 19:24:15.112388 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.112373 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 22 19:24:15.112469 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.112375 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-jd9zz\""
Apr 22 19:24:15.126408 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.126385 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qsnbz"]
Apr 22 19:24:15.126408 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.126408 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-nms75"]
Apr 22 19:24:15.126545 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.126418 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lnt5s"]
Apr 22 19:24:15.126545 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.126518 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nms75"
Apr 22 19:24:15.130628 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.130605 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 19:24:15.130866 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.130849 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-tk64m\""
Apr 22 19:24:15.130967 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.130889 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 19:24:15.169993 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.169970 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4htqr\" (UniqueName: \"kubernetes.io/projected/8207bbac-ad34-4f98-a8af-1a2daaa6ea59-kube-api-access-4htqr\") pod \"ingress-canary-qsnbz\" (UID: \"8207bbac-ad34-4f98-a8af-1a2daaa6ea59\") " pod="openshift-ingress-canary/ingress-canary-qsnbz"
Apr 22 19:24:15.170158 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.170008 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-image-registry-private-configuration\") pod \"image-registry-658bf49c4d-qxl4w\" (UID: \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\") " pod="openshift-image-registry/image-registry-658bf49c4d-qxl4w"
Apr 22 19:24:15.170158 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.170051 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-ca-trust-extracted\") pod \"image-registry-658bf49c4d-qxl4w\" (UID: \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\") " pod="openshift-image-registry/image-registry-658bf49c4d-qxl4w"
Apr 22 19:24:15.170158 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.170094 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-trusted-ca\") pod \"image-registry-658bf49c4d-qxl4w\" (UID: \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\") " pod="openshift-image-registry/image-registry-658bf49c4d-qxl4w"
Apr 22 19:24:15.170318 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.170155 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-installation-pull-secrets\") pod \"image-registry-658bf49c4d-qxl4w\" (UID: \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\") " pod="openshift-image-registry/image-registry-658bf49c4d-qxl4w"
Apr 22 19:24:15.170318 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.170250 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-registry-certificates\") pod \"image-registry-658bf49c4d-qxl4w\" (UID: \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\") " pod="openshift-image-registry/image-registry-658bf49c4d-qxl4w"
Apr 22 19:24:15.170318 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.170277 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-bound-sa-token\") pod \"image-registry-658bf49c4d-qxl4w\" (UID: \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\") " pod="openshift-image-registry/image-registry-658bf49c4d-qxl4w"
Apr 22 19:24:15.170318 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.170307 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-registry-tls\") pod \"image-registry-658bf49c4d-qxl4w\" (UID: \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\") " pod="openshift-image-registry/image-registry-658bf49c4d-qxl4w"
Apr 22 19:24:15.170496 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.170332 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8207bbac-ad34-4f98-a8af-1a2daaa6ea59-cert\") pod \"ingress-canary-qsnbz\" (UID: \"8207bbac-ad34-4f98-a8af-1a2daaa6ea59\") " pod="openshift-ingress-canary/ingress-canary-qsnbz"
Apr 22 19:24:15.170496 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.170370 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwhj8\" (UniqueName: \"kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-kube-api-access-wwhj8\") pod \"image-registry-658bf49c4d-qxl4w\" (UID: \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\") " pod="openshift-image-registry/image-registry-658bf49c4d-qxl4w"
Apr 22 19:24:15.271092 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.271057 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4htqr\" (UniqueName: \"kubernetes.io/projected/8207bbac-ad34-4f98-a8af-1a2daaa6ea59-kube-api-access-4htqr\") pod \"ingress-canary-qsnbz\" (UID: \"8207bbac-ad34-4f98-a8af-1a2daaa6ea59\") " pod="openshift-ingress-canary/ingress-canary-qsnbz"
Apr 22 19:24:15.271255 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.271103 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be5b5131-1460-4746-96a8-4720ed712cf1-config-volume\") pod \"dns-default-lnt5s\" (UID: \"be5b5131-1460-4746-96a8-4720ed712cf1\") " pod="openshift-dns/dns-default-lnt5s"
Apr 22 19:24:15.271255 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.271165 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-image-registry-private-configuration\") pod \"image-registry-658bf49c4d-qxl4w\" (UID: \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\") " pod="openshift-image-registry/image-registry-658bf49c4d-qxl4w"
Apr 22 19:24:15.271377 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.271281 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-ca-trust-extracted\") pod \"image-registry-658bf49c4d-qxl4w\" (UID: \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\") " pod="openshift-image-registry/image-registry-658bf49c4d-qxl4w"
Apr 22 19:24:15.271377 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.271308 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-trusted-ca\") pod \"image-registry-658bf49c4d-qxl4w\" (UID: \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\") " pod="openshift-image-registry/image-registry-658bf49c4d-qxl4w"
Apr 22 19:24:15.271377 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.271330 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-installation-pull-secrets\") pod \"image-registry-658bf49c4d-qxl4w\" (UID: \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\") " pod="openshift-image-registry/image-registry-658bf49c4d-qxl4w"
Apr 22 19:24:15.271512 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.271410 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-registry-certificates\") pod \"image-registry-658bf49c4d-qxl4w\" (UID: \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\") " pod="openshift-image-registry/image-registry-658bf49c4d-qxl4w"
Apr 22 19:24:15.271512 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.271437 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-bound-sa-token\") pod \"image-registry-658bf49c4d-qxl4w\" (UID: \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\") " pod="openshift-image-registry/image-registry-658bf49c4d-qxl4w"
Apr 22 19:24:15.271512 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.271466 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-registry-tls\") pod \"image-registry-658bf49c4d-qxl4w\" (UID: \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\") " pod="openshift-image-registry/image-registry-658bf49c4d-qxl4w"
Apr 22 19:24:15.271512 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.271488 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8207bbac-ad34-4f98-a8af-1a2daaa6ea59-cert\") pod \"ingress-canary-qsnbz\" (UID: \"8207bbac-ad34-4f98-a8af-1a2daaa6ea59\") " pod="openshift-ingress-canary/ingress-canary-qsnbz"
Apr 22 19:24:15.271697 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.271515 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/be5b5131-1460-4746-96a8-4720ed712cf1-tmp-dir\") pod \"dns-default-lnt5s\" (UID: \"be5b5131-1460-4746-96a8-4720ed712cf1\") " pod="openshift-dns/dns-default-lnt5s"
Apr 22 19:24:15.271697 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.271557 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wwhj8\" (UniqueName: \"kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-kube-api-access-wwhj8\") pod \"image-registry-658bf49c4d-qxl4w\" (UID: \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\") " pod="openshift-image-registry/image-registry-658bf49c4d-qxl4w"
Apr 22 19:24:15.271697 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.271580 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be5b5131-1460-4746-96a8-4720ed712cf1-metrics-tls\") pod \"dns-default-lnt5s\" (UID: \"be5b5131-1460-4746-96a8-4720ed712cf1\") " pod="openshift-dns/dns-default-lnt5s"
Apr 22 19:24:15.271697 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.271600 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvlh4\" (UniqueName: \"kubernetes.io/projected/be5b5131-1460-4746-96a8-4720ed712cf1-kube-api-access-rvlh4\") pod \"dns-default-lnt5s\" (UID: \"be5b5131-1460-4746-96a8-4720ed712cf1\") " pod="openshift-dns/dns-default-lnt5s"
Apr 22 19:24:15.271697 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.271641 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmfcm\" (UniqueName: \"kubernetes.io/projected/cb58056c-3091-4679-8665-a73d6668e604-kube-api-access-mmfcm\") pod \"network-check-source-8894fc9bd-nms75\" (UID: \"cb58056c-3091-4679-8665-a73d6668e604\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nms75"
Apr 22 19:24:15.272022 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:15.271768 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 19:24:15.272022 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:15.271799 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-658bf49c4d-qxl4w: secret "image-registry-tls" not found
Apr 22 19:24:15.272022 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.271807 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-ca-trust-extracted\") pod \"image-registry-658bf49c4d-qxl4w\" (UID: \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\") " pod="openshift-image-registry/image-registry-658bf49c4d-qxl4w"
Apr 22 19:24:15.272022 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:15.271862 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-registry-tls podName:4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e nodeName:}" failed. No retries permitted until 2026-04-22 19:24:15.771843157 +0000 UTC m=+33.055513609 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-registry-tls") pod "image-registry-658bf49c4d-qxl4w" (UID: "4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e") : secret "image-registry-tls" not found
Apr 22 19:24:15.272022 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:15.271938 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:24:15.272022 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:15.271997 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8207bbac-ad34-4f98-a8af-1a2daaa6ea59-cert podName:8207bbac-ad34-4f98-a8af-1a2daaa6ea59 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:15.77198022 +0000 UTC m=+33.055650674 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8207bbac-ad34-4f98-a8af-1a2daaa6ea59-cert") pod "ingress-canary-qsnbz" (UID: "8207bbac-ad34-4f98-a8af-1a2daaa6ea59") : secret "canary-serving-cert" not found
Apr 22 19:24:15.272324 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.272236 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-registry-certificates\") pod \"image-registry-658bf49c4d-qxl4w\" (UID: \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\") " pod="openshift-image-registry/image-registry-658bf49c4d-qxl4w"
Apr 22 19:24:15.272324 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.272309 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-trusted-ca\") pod \"image-registry-658bf49c4d-qxl4w\" (UID: \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\") " pod="openshift-image-registry/image-registry-658bf49c4d-qxl4w"
Apr 22 19:24:15.275661 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.275639 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-image-registry-private-configuration\") pod \"image-registry-658bf49c4d-qxl4w\" (UID: \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\") " pod="openshift-image-registry/image-registry-658bf49c4d-qxl4w"
Apr 22 19:24:15.275805 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.275639 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-installation-pull-secrets\") pod \"image-registry-658bf49c4d-qxl4w\" (UID: \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\") " pod="openshift-image-registry/image-registry-658bf49c4d-qxl4w"
Apr 22 19:24:15.281238 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.281213 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-bound-sa-token\") pod \"image-registry-658bf49c4d-qxl4w\" (UID: \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\") " pod="openshift-image-registry/image-registry-658bf49c4d-qxl4w"
Apr 22 19:24:15.281357 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.281326 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwhj8\" (UniqueName: \"kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-kube-api-access-wwhj8\") pod \"image-registry-658bf49c4d-qxl4w\" (UID: \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\") " pod="openshift-image-registry/image-registry-658bf49c4d-qxl4w"
Apr 22 19:24:15.281357 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.281347 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4htqr\" (UniqueName: \"kubernetes.io/projected/8207bbac-ad34-4f98-a8af-1a2daaa6ea59-kube-api-access-4htqr\") pod \"ingress-canary-qsnbz\" (UID: \"8207bbac-ad34-4f98-a8af-1a2daaa6ea59\") " pod="openshift-ingress-canary/ingress-canary-qsnbz"
Apr 22 19:24:15.345070 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.345039 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jzltp"
Apr 22 19:24:15.345238 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.345153 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdzdp"
Apr 22 19:24:15.345441 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.345310 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hcbdv"
Apr 22 19:24:15.348487 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.348426 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 19:24:15.348487 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.348468 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tlsqf\""
Apr 22 19:24:15.348487 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.348481 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wrgld\""
Apr 22 19:24:15.348740 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.348720 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 22 19:24:15.372508 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.372476 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/be5b5131-1460-4746-96a8-4720ed712cf1-tmp-dir\") pod \"dns-default-lnt5s\" (UID: \"be5b5131-1460-4746-96a8-4720ed712cf1\") " pod="openshift-dns/dns-default-lnt5s"
Apr 22 19:24:15.372622 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.372517 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be5b5131-1460-4746-96a8-4720ed712cf1-metrics-tls\") pod \"dns-default-lnt5s\" (UID: \"be5b5131-1460-4746-96a8-4720ed712cf1\") " pod="openshift-dns/dns-default-lnt5s"
Apr 22 19:24:15.372622 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.372537 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvlh4\" (UniqueName: \"kubernetes.io/projected/be5b5131-1460-4746-96a8-4720ed712cf1-kube-api-access-rvlh4\") pod \"dns-default-lnt5s\" (UID: \"be5b5131-1460-4746-96a8-4720ed712cf1\") " pod="openshift-dns/dns-default-lnt5s"
Apr 22 19:24:15.372622 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.372562 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mmfcm\" (UniqueName: \"kubernetes.io/projected/cb58056c-3091-4679-8665-a73d6668e604-kube-api-access-mmfcm\") pod \"network-check-source-8894fc9bd-nms75\" (UID: \"cb58056c-3091-4679-8665-a73d6668e604\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nms75"
Apr 22 19:24:15.372622 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.372584 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be5b5131-1460-4746-96a8-4720ed712cf1-config-volume\") pod \"dns-default-lnt5s\" (UID: \"be5b5131-1460-4746-96a8-4720ed712cf1\") " pod="openshift-dns/dns-default-lnt5s"
Apr 22 19:24:15.372946 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:15.372926 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:24:15.373059 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:15.373002 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be5b5131-1460-4746-96a8-4720ed712cf1-metrics-tls podName:be5b5131-1460-4746-96a8-4720ed712cf1 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:15.872981609 +0000 UTC m=+33.156652066 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/be5b5131-1460-4746-96a8-4720ed712cf1-metrics-tls") pod "dns-default-lnt5s" (UID: "be5b5131-1460-4746-96a8-4720ed712cf1") : secret "dns-default-metrics-tls" not found
Apr 22 19:24:15.373130 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.373087 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/be5b5131-1460-4746-96a8-4720ed712cf1-tmp-dir\") pod \"dns-default-lnt5s\" (UID: \"be5b5131-1460-4746-96a8-4720ed712cf1\") " pod="openshift-dns/dns-default-lnt5s"
Apr 22 19:24:15.373175 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.373151 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be5b5131-1460-4746-96a8-4720ed712cf1-config-volume\") pod \"dns-default-lnt5s\" (UID: \"be5b5131-1460-4746-96a8-4720ed712cf1\") " pod="openshift-dns/dns-default-lnt5s"
Apr 22 19:24:15.381790 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.381755 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmfcm\" (UniqueName: \"kubernetes.io/projected/cb58056c-3091-4679-8665-a73d6668e604-kube-api-access-mmfcm\") pod \"network-check-source-8894fc9bd-nms75\" (UID: \"cb58056c-3091-4679-8665-a73d6668e604\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nms75"
Apr 22 19:24:15.382260 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.382241 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvlh4\" (UniqueName: \"kubernetes.io/projected/be5b5131-1460-4746-96a8-4720ed712cf1-kube-api-access-rvlh4\") pod \"dns-default-lnt5s\" (UID: \"be5b5131-1460-4746-96a8-4720ed712cf1\") " pod="openshift-dns/dns-default-lnt5s"
Apr 22 19:24:15.436652 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.436624 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nms75" Apr 22 19:24:15.595860 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.595660 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-nms75"] Apr 22 19:24:15.599794 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:24:15.599747 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb58056c_3091_4679_8665_a73d6668e604.slice/crio-f20f6cbe3a55f833bbb78a173ecad4b5920835bb0947a8ac31ca27187ada78f8 WatchSource:0}: Error finding container f20f6cbe3a55f833bbb78a173ecad4b5920835bb0947a8ac31ca27187ada78f8: Status 404 returned error can't find the container with id f20f6cbe3a55f833bbb78a173ecad4b5920835bb0947a8ac31ca27187ada78f8 Apr 22 19:24:15.776160 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.776129 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-registry-tls\") pod \"image-registry-658bf49c4d-qxl4w\" (UID: \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\") " pod="openshift-image-registry/image-registry-658bf49c4d-qxl4w" Apr 22 19:24:15.776357 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.776171 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8207bbac-ad34-4f98-a8af-1a2daaa6ea59-cert\") pod \"ingress-canary-qsnbz\" (UID: \"8207bbac-ad34-4f98-a8af-1a2daaa6ea59\") " pod="openshift-ingress-canary/ingress-canary-qsnbz" Apr 22 19:24:15.776357 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:15.776286 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:24:15.776357 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:15.776311 2574 projected.go:194] Error preparing 
data for projected volume registry-tls for pod openshift-image-registry/image-registry-658bf49c4d-qxl4w: secret "image-registry-tls" not found Apr 22 19:24:15.776357 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:15.776292 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:24:15.776578 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:15.776371 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-registry-tls podName:4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e nodeName:}" failed. No retries permitted until 2026-04-22 19:24:16.77635602 +0000 UTC m=+34.060026471 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-registry-tls") pod "image-registry-658bf49c4d-qxl4w" (UID: "4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e") : secret "image-registry-tls" not found Apr 22 19:24:15.776578 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:15.776385 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8207bbac-ad34-4f98-a8af-1a2daaa6ea59-cert podName:8207bbac-ad34-4f98-a8af-1a2daaa6ea59 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:16.776379296 +0000 UTC m=+34.060049748 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8207bbac-ad34-4f98-a8af-1a2daaa6ea59-cert") pod "ingress-canary-qsnbz" (UID: "8207bbac-ad34-4f98-a8af-1a2daaa6ea59") : secret "canary-serving-cert" not found Apr 22 19:24:15.837063 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.837031 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-2cwpp"] Apr 22 19:24:15.858563 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.858531 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-2cwpp"] Apr 22 19:24:15.858729 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.858598 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-2cwpp" Apr 22 19:24:15.861183 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.861162 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 22 19:24:15.861282 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.861168 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-l8htc\"" Apr 22 19:24:15.861282 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.861214 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 22 19:24:15.876920 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.876898 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be5b5131-1460-4746-96a8-4720ed712cf1-metrics-tls\") pod \"dns-default-lnt5s\" (UID: \"be5b5131-1460-4746-96a8-4720ed712cf1\") " pod="openshift-dns/dns-default-lnt5s" Apr 22 19:24:15.877058 ip-10-0-134-22 kubenswrapper[2574]: 
E0422 19:24:15.877018 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:24:15.877104 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:15.877062 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be5b5131-1460-4746-96a8-4720ed712cf1-metrics-tls podName:be5b5131-1460-4746-96a8-4720ed712cf1 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:16.87704999 +0000 UTC m=+34.160720441 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/be5b5131-1460-4746-96a8-4720ed712cf1-metrics-tls") pod "dns-default-lnt5s" (UID: "be5b5131-1460-4746-96a8-4720ed712cf1") : secret "dns-default-metrics-tls" not found Apr 22 19:24:15.977683 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.977648 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/76e16566-d9dd-4f0b-9cc5-a72e27e4518a-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-2cwpp\" (UID: \"76e16566-d9dd-4f0b-9cc5-a72e27e4518a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2cwpp" Apr 22 19:24:15.977875 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:15.977792 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/76e16566-d9dd-4f0b-9cc5-a72e27e4518a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-2cwpp\" (UID: \"76e16566-d9dd-4f0b-9cc5-a72e27e4518a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2cwpp" Apr 22 19:24:16.078488 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:16.078392 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/76e16566-d9dd-4f0b-9cc5-a72e27e4518a-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-2cwpp\" (UID: \"76e16566-d9dd-4f0b-9cc5-a72e27e4518a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2cwpp" Apr 22 19:24:16.078488 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:16.078451 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/76e16566-d9dd-4f0b-9cc5-a72e27e4518a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-2cwpp\" (UID: \"76e16566-d9dd-4f0b-9cc5-a72e27e4518a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2cwpp" Apr 22 19:24:16.079163 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:16.078580 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 19:24:16.079163 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:16.078639 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76e16566-d9dd-4f0b-9cc5-a72e27e4518a-networking-console-plugin-cert podName:76e16566-d9dd-4f0b-9cc5-a72e27e4518a nodeName:}" failed. No retries permitted until 2026-04-22 19:24:16.578625078 +0000 UTC m=+33.862295528 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/76e16566-d9dd-4f0b-9cc5-a72e27e4518a-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-2cwpp" (UID: "76e16566-d9dd-4f0b-9cc5-a72e27e4518a") : secret "networking-console-plugin-cert" not found Apr 22 19:24:16.079163 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:16.079066 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/76e16566-d9dd-4f0b-9cc5-a72e27e4518a-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-2cwpp\" (UID: \"76e16566-d9dd-4f0b-9cc5-a72e27e4518a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2cwpp" Apr 22 19:24:16.485746 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:16.485651 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nms75" event={"ID":"cb58056c-3091-4679-8665-a73d6668e604","Type":"ContainerStarted","Data":"f20f6cbe3a55f833bbb78a173ecad4b5920835bb0947a8ac31ca27187ada78f8"} Apr 22 19:24:16.582068 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:16.582019 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/76e16566-d9dd-4f0b-9cc5-a72e27e4518a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-2cwpp\" (UID: \"76e16566-d9dd-4f0b-9cc5-a72e27e4518a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2cwpp" Apr 22 19:24:16.582319 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:16.582151 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 19:24:16.582319 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:16.582237 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/76e16566-d9dd-4f0b-9cc5-a72e27e4518a-networking-console-plugin-cert podName:76e16566-d9dd-4f0b-9cc5-a72e27e4518a nodeName:}" failed. No retries permitted until 2026-04-22 19:24:17.582213991 +0000 UTC m=+34.865884445 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/76e16566-d9dd-4f0b-9cc5-a72e27e4518a-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-2cwpp" (UID: "76e16566-d9dd-4f0b-9cc5-a72e27e4518a") : secret "networking-console-plugin-cert" not found Apr 22 19:24:16.783727 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:16.783689 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-registry-tls\") pod \"image-registry-658bf49c4d-qxl4w\" (UID: \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\") " pod="openshift-image-registry/image-registry-658bf49c4d-qxl4w" Apr 22 19:24:16.783727 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:16.783727 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8207bbac-ad34-4f98-a8af-1a2daaa6ea59-cert\") pod \"ingress-canary-qsnbz\" (UID: \"8207bbac-ad34-4f98-a8af-1a2daaa6ea59\") " pod="openshift-ingress-canary/ingress-canary-qsnbz" Apr 22 19:24:16.784074 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:16.783852 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:24:16.784074 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:16.783869 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:24:16.784074 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:16.783875 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-658bf49c4d-qxl4w: secret "image-registry-tls" not found Apr 22 19:24:16.784074 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:16.783942 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-registry-tls podName:4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e nodeName:}" failed. No retries permitted until 2026-04-22 19:24:18.783924616 +0000 UTC m=+36.067595070 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-registry-tls") pod "image-registry-658bf49c4d-qxl4w" (UID: "4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e") : secret "image-registry-tls" not found Apr 22 19:24:16.784074 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:16.783958 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8207bbac-ad34-4f98-a8af-1a2daaa6ea59-cert podName:8207bbac-ad34-4f98-a8af-1a2daaa6ea59 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:18.783951775 +0000 UTC m=+36.067622226 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8207bbac-ad34-4f98-a8af-1a2daaa6ea59-cert") pod "ingress-canary-qsnbz" (UID: "8207bbac-ad34-4f98-a8af-1a2daaa6ea59") : secret "canary-serving-cert" not found Apr 22 19:24:16.884661 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:16.884622 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be5b5131-1460-4746-96a8-4720ed712cf1-metrics-tls\") pod \"dns-default-lnt5s\" (UID: \"be5b5131-1460-4746-96a8-4720ed712cf1\") " pod="openshift-dns/dns-default-lnt5s" Apr 22 19:24:16.884837 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:16.884760 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:24:16.884901 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:16.884845 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be5b5131-1460-4746-96a8-4720ed712cf1-metrics-tls podName:be5b5131-1460-4746-96a8-4720ed712cf1 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:18.88483022 +0000 UTC m=+36.168500670 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/be5b5131-1460-4746-96a8-4720ed712cf1-metrics-tls") pod "dns-default-lnt5s" (UID: "be5b5131-1460-4746-96a8-4720ed712cf1") : secret "dns-default-metrics-tls" not found Apr 22 19:24:16.985760 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:16.985727 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8wln\" (UniqueName: \"kubernetes.io/projected/9bbee64a-2154-4ea2-9299-c15d3614e769-kube-api-access-k8wln\") pod \"network-check-target-jzltp\" (UID: \"9bbee64a-2154-4ea2-9299-c15d3614e769\") " pod="openshift-network-diagnostics/network-check-target-jzltp" Apr 22 19:24:16.985971 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:16.985831 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93109160-9bbe-497f-9b25-d7fa7e08508f-metrics-certs\") pod \"network-metrics-daemon-mdzdp\" (UID: \"93109160-9bbe-497f-9b25-d7fa7e08508f\") " pod="openshift-multus/network-metrics-daemon-mdzdp" Apr 22 19:24:16.985971 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:16.985931 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 19:24:16.986090 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:16.985992 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93109160-9bbe-497f-9b25-d7fa7e08508f-metrics-certs podName:93109160-9bbe-497f-9b25-d7fa7e08508f nodeName:}" failed. No retries permitted until 2026-04-22 19:24:48.985977575 +0000 UTC m=+66.269648043 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93109160-9bbe-497f-9b25-d7fa7e08508f-metrics-certs") pod "network-metrics-daemon-mdzdp" (UID: "93109160-9bbe-497f-9b25-d7fa7e08508f") : secret "metrics-daemon-secret" not found Apr 22 19:24:16.989511 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:16.989485 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8wln\" (UniqueName: \"kubernetes.io/projected/9bbee64a-2154-4ea2-9299-c15d3614e769-kube-api-access-k8wln\") pod \"network-check-target-jzltp\" (UID: \"9bbee64a-2154-4ea2-9299-c15d3614e769\") " pod="openshift-network-diagnostics/network-check-target-jzltp" Apr 22 19:24:17.159384 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:17.159287 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jzltp" Apr 22 19:24:17.590537 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:17.590502 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/76e16566-d9dd-4f0b-9cc5-a72e27e4518a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-2cwpp\" (UID: \"76e16566-d9dd-4f0b-9cc5-a72e27e4518a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2cwpp" Apr 22 19:24:17.590713 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:17.590654 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 19:24:17.590761 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:17.590717 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76e16566-d9dd-4f0b-9cc5-a72e27e4518a-networking-console-plugin-cert podName:76e16566-d9dd-4f0b-9cc5-a72e27e4518a nodeName:}" failed. 
No retries permitted until 2026-04-22 19:24:19.590699756 +0000 UTC m=+36.874370218 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/76e16566-d9dd-4f0b-9cc5-a72e27e4518a-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-2cwpp" (UID: "76e16566-d9dd-4f0b-9cc5-a72e27e4518a") : secret "networking-console-plugin-cert" not found Apr 22 19:24:18.145375 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:18.145344 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jzltp"] Apr 22 19:24:18.224980 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:24:18.224936 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bbee64a_2154_4ea2_9299_c15d3614e769.slice/crio-82885d224d6d5d634ac7cb5316868af74d2387fe3a64ea876107e5a8139793dc WatchSource:0}: Error finding container 82885d224d6d5d634ac7cb5316868af74d2387fe3a64ea876107e5a8139793dc: Status 404 returned error can't find the container with id 82885d224d6d5d634ac7cb5316868af74d2387fe3a64ea876107e5a8139793dc Apr 22 19:24:18.491491 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:18.491248 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jzltp" event={"ID":"9bbee64a-2154-4ea2-9299-c15d3614e769","Type":"ContainerStarted","Data":"82885d224d6d5d634ac7cb5316868af74d2387fe3a64ea876107e5a8139793dc"} Apr 22 19:24:18.494126 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:18.494088 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z6hh4" event={"ID":"20682e94-9131-4a8d-a325-b9f45d2fd64f","Type":"ContainerStarted","Data":"bc5a48380b16c50299884472e0a9863bbeec1155bfaa565384c0a1e7dc80bf4b"} Apr 22 19:24:18.801554 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:18.801512 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-registry-tls\") pod \"image-registry-658bf49c4d-qxl4w\" (UID: \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\") " pod="openshift-image-registry/image-registry-658bf49c4d-qxl4w" Apr 22 19:24:18.801738 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:18.801561 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8207bbac-ad34-4f98-a8af-1a2daaa6ea59-cert\") pod \"ingress-canary-qsnbz\" (UID: \"8207bbac-ad34-4f98-a8af-1a2daaa6ea59\") " pod="openshift-ingress-canary/ingress-canary-qsnbz" Apr 22 19:24:18.801738 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:18.801662 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:24:18.801738 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:18.801688 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-658bf49c4d-qxl4w: secret "image-registry-tls" not found Apr 22 19:24:18.801738 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:18.801690 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:24:18.802003 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:18.801758 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-registry-tls podName:4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e nodeName:}" failed. No retries permitted until 2026-04-22 19:24:22.801736966 +0000 UTC m=+40.085407423 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-registry-tls") pod "image-registry-658bf49c4d-qxl4w" (UID: "4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e") : secret "image-registry-tls" not found Apr 22 19:24:18.802003 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:18.801809 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8207bbac-ad34-4f98-a8af-1a2daaa6ea59-cert podName:8207bbac-ad34-4f98-a8af-1a2daaa6ea59 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:22.801797864 +0000 UTC m=+40.085468315 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8207bbac-ad34-4f98-a8af-1a2daaa6ea59-cert") pod "ingress-canary-qsnbz" (UID: "8207bbac-ad34-4f98-a8af-1a2daaa6ea59") : secret "canary-serving-cert" not found Apr 22 19:24:18.902497 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:18.902413 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be5b5131-1460-4746-96a8-4720ed712cf1-metrics-tls\") pod \"dns-default-lnt5s\" (UID: \"be5b5131-1460-4746-96a8-4720ed712cf1\") " pod="openshift-dns/dns-default-lnt5s" Apr 22 19:24:18.902651 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:18.902594 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:24:18.902712 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:18.902676 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be5b5131-1460-4746-96a8-4720ed712cf1-metrics-tls podName:be5b5131-1460-4746-96a8-4720ed712cf1 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:22.902651656 +0000 UTC m=+40.186322118 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/be5b5131-1460-4746-96a8-4720ed712cf1-metrics-tls") pod "dns-default-lnt5s" (UID: "be5b5131-1460-4746-96a8-4720ed712cf1") : secret "dns-default-metrics-tls" not found Apr 22 19:24:19.499288 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:19.499249 2574 generic.go:358] "Generic (PLEG): container finished" podID="20682e94-9131-4a8d-a325-b9f45d2fd64f" containerID="bc5a48380b16c50299884472e0a9863bbeec1155bfaa565384c0a1e7dc80bf4b" exitCode=0 Apr 22 19:24:19.499809 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:19.499333 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z6hh4" event={"ID":"20682e94-9131-4a8d-a325-b9f45d2fd64f","Type":"ContainerDied","Data":"bc5a48380b16c50299884472e0a9863bbeec1155bfaa565384c0a1e7dc80bf4b"} Apr 22 19:24:19.610402 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:19.610361 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/76e16566-d9dd-4f0b-9cc5-a72e27e4518a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-2cwpp\" (UID: \"76e16566-d9dd-4f0b-9cc5-a72e27e4518a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2cwpp" Apr 22 19:24:19.610960 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:19.610934 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 19:24:19.611082 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:19.611005 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76e16566-d9dd-4f0b-9cc5-a72e27e4518a-networking-console-plugin-cert podName:76e16566-d9dd-4f0b-9cc5-a72e27e4518a nodeName:}" failed. No retries permitted until 2026-04-22 19:24:23.610983862 +0000 UTC m=+40.894654319 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/76e16566-d9dd-4f0b-9cc5-a72e27e4518a-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-2cwpp" (UID: "76e16566-d9dd-4f0b-9cc5-a72e27e4518a") : secret "networking-console-plugin-cert" not found Apr 22 19:24:20.013061 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:20.013020 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/029be2e7-8cf1-404e-bf0d-59ccb446ec17-original-pull-secret\") pod \"global-pull-secret-syncer-hcbdv\" (UID: \"029be2e7-8cf1-404e-bf0d-59ccb446ec17\") " pod="kube-system/global-pull-secret-syncer-hcbdv" Apr 22 19:24:20.017307 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:20.017275 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/029be2e7-8cf1-404e-bf0d-59ccb446ec17-original-pull-secret\") pod \"global-pull-secret-syncer-hcbdv\" (UID: \"029be2e7-8cf1-404e-bf0d-59ccb446ec17\") " pod="kube-system/global-pull-secret-syncer-hcbdv" Apr 22 19:24:20.171245 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:20.171209 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hcbdv" Apr 22 19:24:20.504094 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:20.504059 2574 generic.go:358] "Generic (PLEG): container finished" podID="20682e94-9131-4a8d-a325-b9f45d2fd64f" containerID="ac0fd84cf02bdd4c54b250b1d9963ea106a1a740aff641ad45190d138ae42d74" exitCode=0 Apr 22 19:24:20.504612 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:20.504112 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z6hh4" event={"ID":"20682e94-9131-4a8d-a325-b9f45d2fd64f","Type":"ContainerDied","Data":"ac0fd84cf02bdd4c54b250b1d9963ea106a1a740aff641ad45190d138ae42d74"} Apr 22 19:24:20.555316 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:20.555282 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f6fb44f9-n2844"] Apr 22 19:24:20.597826 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:20.597771 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f6fb44f9-n2844"] Apr 22 19:24:20.597995 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:20.597940 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f6fb44f9-n2844" Apr 22 19:24:20.600584 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:20.600561 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 22 19:24:20.600584 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:20.600562 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 22 19:24:20.600759 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:20.600609 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 22 19:24:20.600759 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:20.600587 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 22 19:24:20.720164 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:20.720123 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cx2h\" (UniqueName: \"kubernetes.io/projected/09d7293f-3362-40d6-8f2d-17b0e9e06c9f-kube-api-access-5cx2h\") pod \"klusterlet-addon-workmgr-86f6fb44f9-n2844\" (UID: \"09d7293f-3362-40d6-8f2d-17b0e9e06c9f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f6fb44f9-n2844" Apr 22 19:24:20.720351 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:20.720226 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/09d7293f-3362-40d6-8f2d-17b0e9e06c9f-tmp\") pod \"klusterlet-addon-workmgr-86f6fb44f9-n2844\" (UID: \"09d7293f-3362-40d6-8f2d-17b0e9e06c9f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f6fb44f9-n2844" Apr 
22 19:24:20.720423 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:20.720390 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/09d7293f-3362-40d6-8f2d-17b0e9e06c9f-klusterlet-config\") pod \"klusterlet-addon-workmgr-86f6fb44f9-n2844\" (UID: \"09d7293f-3362-40d6-8f2d-17b0e9e06c9f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f6fb44f9-n2844" Apr 22 19:24:20.821762 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:20.821724 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/09d7293f-3362-40d6-8f2d-17b0e9e06c9f-tmp\") pod \"klusterlet-addon-workmgr-86f6fb44f9-n2844\" (UID: \"09d7293f-3362-40d6-8f2d-17b0e9e06c9f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f6fb44f9-n2844" Apr 22 19:24:20.821920 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:20.821900 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/09d7293f-3362-40d6-8f2d-17b0e9e06c9f-klusterlet-config\") pod \"klusterlet-addon-workmgr-86f6fb44f9-n2844\" (UID: \"09d7293f-3362-40d6-8f2d-17b0e9e06c9f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f6fb44f9-n2844" Apr 22 19:24:20.821920 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:20.821956 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5cx2h\" (UniqueName: \"kubernetes.io/projected/09d7293f-3362-40d6-8f2d-17b0e9e06c9f-kube-api-access-5cx2h\") pod \"klusterlet-addon-workmgr-86f6fb44f9-n2844\" (UID: \"09d7293f-3362-40d6-8f2d-17b0e9e06c9f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f6fb44f9-n2844" Apr 22 19:24:20.822102 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:20.822055 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/09d7293f-3362-40d6-8f2d-17b0e9e06c9f-tmp\") pod \"klusterlet-addon-workmgr-86f6fb44f9-n2844\" (UID: \"09d7293f-3362-40d6-8f2d-17b0e9e06c9f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f6fb44f9-n2844" Apr 22 19:24:20.824399 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:20.824372 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/09d7293f-3362-40d6-8f2d-17b0e9e06c9f-klusterlet-config\") pod \"klusterlet-addon-workmgr-86f6fb44f9-n2844\" (UID: \"09d7293f-3362-40d6-8f2d-17b0e9e06c9f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f6fb44f9-n2844" Apr 22 19:24:20.830488 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:20.830440 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cx2h\" (UniqueName: \"kubernetes.io/projected/09d7293f-3362-40d6-8f2d-17b0e9e06c9f-kube-api-access-5cx2h\") pod \"klusterlet-addon-workmgr-86f6fb44f9-n2844\" (UID: \"09d7293f-3362-40d6-8f2d-17b0e9e06c9f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f6fb44f9-n2844" Apr 22 19:24:20.908237 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:20.908202 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f6fb44f9-n2844" Apr 22 19:24:20.929043 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:20.929018 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hcbdv"] Apr 22 19:24:20.932798 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:24:20.932757 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod029be2e7_8cf1_404e_bf0d_59ccb446ec17.slice/crio-1cebb5cb6bb35d0fcf6fd254adf188b3f5963e6eb3864c43c29710522d602c63 WatchSource:0}: Error finding container 1cebb5cb6bb35d0fcf6fd254adf188b3f5963e6eb3864c43c29710522d602c63: Status 404 returned error can't find the container with id 1cebb5cb6bb35d0fcf6fd254adf188b3f5963e6eb3864c43c29710522d602c63 Apr 22 19:24:21.021710 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:21.021651 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f6fb44f9-n2844"] Apr 22 19:24:21.174858 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:24:21.174770 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09d7293f_3362_40d6_8f2d_17b0e9e06c9f.slice/crio-8998437422c0b3e3a09146164a22f26ff6a6e0787a9aadef31fc54a96efa29a0 WatchSource:0}: Error finding container 8998437422c0b3e3a09146164a22f26ff6a6e0787a9aadef31fc54a96efa29a0: Status 404 returned error can't find the container with id 8998437422c0b3e3a09146164a22f26ff6a6e0787a9aadef31fc54a96efa29a0 Apr 22 19:24:21.507887 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:21.507845 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nms75" event={"ID":"cb58056c-3091-4679-8665-a73d6668e604","Type":"ContainerStarted","Data":"9b823a96e7ba7fe55005d9c725ffb732dc988bdbe9f3ae00c40b3c4d563c9af3"} Apr 22 
19:24:21.509153 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:21.509122 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hcbdv" event={"ID":"029be2e7-8cf1-404e-bf0d-59ccb446ec17","Type":"ContainerStarted","Data":"1cebb5cb6bb35d0fcf6fd254adf188b3f5963e6eb3864c43c29710522d602c63"} Apr 22 19:24:21.512374 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:21.512345 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z6hh4" event={"ID":"20682e94-9131-4a8d-a325-b9f45d2fd64f","Type":"ContainerStarted","Data":"1ba85c209af9052a583a79bbcf19c486e089444fe4aae1e0f1c7c6f27cb0b097"} Apr 22 19:24:21.513794 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:21.513757 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jzltp" event={"ID":"9bbee64a-2154-4ea2-9299-c15d3614e769","Type":"ContainerStarted","Data":"5ab5e2915a6dc29692c70da14cea630ea9191f4b8b7c897bbd70b8b35f2c8a01"} Apr 22 19:24:21.513920 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:21.513902 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-jzltp" Apr 22 19:24:21.514966 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:21.514945 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f6fb44f9-n2844" event={"ID":"09d7293f-3362-40d6-8f2d-17b0e9e06c9f","Type":"ContainerStarted","Data":"8998437422c0b3e3a09146164a22f26ff6a6e0787a9aadef31fc54a96efa29a0"} Apr 22 19:24:21.528989 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:21.528951 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nms75" podStartSLOduration=27.86843292 podStartE2EDuration="33.528939245s" podCreationTimestamp="2026-04-22 19:23:48 +0000 UTC" 
firstStartedPulling="2026-04-22 19:24:15.602032794 +0000 UTC m=+32.885703245" lastFinishedPulling="2026-04-22 19:24:21.262539106 +0000 UTC m=+38.546209570" observedRunningTime="2026-04-22 19:24:21.52823651 +0000 UTC m=+38.811906984" watchObservedRunningTime="2026-04-22 19:24:21.528939245 +0000 UTC m=+38.812609717" Apr 22 19:24:21.550995 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:21.550952 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-z6hh4" podStartSLOduration=6.166331042 podStartE2EDuration="38.550938449s" podCreationTimestamp="2026-04-22 19:23:43 +0000 UTC" firstStartedPulling="2026-04-22 19:23:45.884902816 +0000 UTC m=+3.168573273" lastFinishedPulling="2026-04-22 19:24:18.269510216 +0000 UTC m=+35.553180680" observedRunningTime="2026-04-22 19:24:21.550408138 +0000 UTC m=+38.834078613" watchObservedRunningTime="2026-04-22 19:24:21.550938449 +0000 UTC m=+38.834608926" Apr 22 19:24:22.841406 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:22.840604 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-registry-tls\") pod \"image-registry-658bf49c4d-qxl4w\" (UID: \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\") " pod="openshift-image-registry/image-registry-658bf49c4d-qxl4w" Apr 22 19:24:22.841406 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:22.840656 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8207bbac-ad34-4f98-a8af-1a2daaa6ea59-cert\") pod \"ingress-canary-qsnbz\" (UID: \"8207bbac-ad34-4f98-a8af-1a2daaa6ea59\") " pod="openshift-ingress-canary/ingress-canary-qsnbz" Apr 22 19:24:22.841406 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:22.840856 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 
19:24:22.841406 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:22.840916 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8207bbac-ad34-4f98-a8af-1a2daaa6ea59-cert podName:8207bbac-ad34-4f98-a8af-1a2daaa6ea59 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:30.840897317 +0000 UTC m=+48.124567772 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8207bbac-ad34-4f98-a8af-1a2daaa6ea59-cert") pod "ingress-canary-qsnbz" (UID: "8207bbac-ad34-4f98-a8af-1a2daaa6ea59") : secret "canary-serving-cert" not found Apr 22 19:24:22.841406 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:22.841316 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:24:22.841406 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:22.841331 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-658bf49c4d-qxl4w: secret "image-registry-tls" not found Apr 22 19:24:22.841406 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:22.841374 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-registry-tls podName:4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e nodeName:}" failed. No retries permitted until 2026-04-22 19:24:30.841358815 +0000 UTC m=+48.125029280 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-registry-tls") pod "image-registry-658bf49c4d-qxl4w" (UID: "4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e") : secret "image-registry-tls" not found Apr 22 19:24:22.942069 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:22.942032 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be5b5131-1460-4746-96a8-4720ed712cf1-metrics-tls\") pod \"dns-default-lnt5s\" (UID: \"be5b5131-1460-4746-96a8-4720ed712cf1\") " pod="openshift-dns/dns-default-lnt5s" Apr 22 19:24:22.942233 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:22.942182 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:24:22.942286 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:22.942257 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be5b5131-1460-4746-96a8-4720ed712cf1-metrics-tls podName:be5b5131-1460-4746-96a8-4720ed712cf1 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:30.942237135 +0000 UTC m=+48.225907592 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/be5b5131-1460-4746-96a8-4720ed712cf1-metrics-tls") pod "dns-default-lnt5s" (UID: "be5b5131-1460-4746-96a8-4720ed712cf1") : secret "dns-default-metrics-tls" not found Apr 22 19:24:23.381652 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:23.381604 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-jzltp" podStartSLOduration=37.353942126 podStartE2EDuration="40.38159043s" podCreationTimestamp="2026-04-22 19:23:43 +0000 UTC" firstStartedPulling="2026-04-22 19:24:18.243095998 +0000 UTC m=+35.526766449" lastFinishedPulling="2026-04-22 19:24:21.270744301 +0000 UTC m=+38.554414753" observedRunningTime="2026-04-22 19:24:21.566488476 +0000 UTC m=+38.850158949" watchObservedRunningTime="2026-04-22 19:24:23.38159043 +0000 UTC m=+40.665260903" Apr 22 19:24:23.647760 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:23.647675 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/76e16566-d9dd-4f0b-9cc5-a72e27e4518a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-2cwpp\" (UID: \"76e16566-d9dd-4f0b-9cc5-a72e27e4518a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2cwpp" Apr 22 19:24:23.647932 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:23.647845 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 19:24:23.647932 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:23.647918 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76e16566-d9dd-4f0b-9cc5-a72e27e4518a-networking-console-plugin-cert podName:76e16566-d9dd-4f0b-9cc5-a72e27e4518a nodeName:}" failed. 
No retries permitted until 2026-04-22 19:24:31.647903479 +0000 UTC m=+48.931573942 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/76e16566-d9dd-4f0b-9cc5-a72e27e4518a-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-2cwpp" (UID: "76e16566-d9dd-4f0b-9cc5-a72e27e4518a") : secret "networking-console-plugin-cert" not found Apr 22 19:24:27.528305 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:27.528266 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hcbdv" event={"ID":"029be2e7-8cf1-404e-bf0d-59ccb446ec17","Type":"ContainerStarted","Data":"9abdc4577bc009c2d915a6cd374b7f2fb7c51334a30f07a78d69fe8d14ebbeb4"} Apr 22 19:24:27.529504 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:27.529478 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f6fb44f9-n2844" event={"ID":"09d7293f-3362-40d6-8f2d-17b0e9e06c9f","Type":"ContainerStarted","Data":"c175bc8a80d8a19c535fbd177fe3c6193afd79274069b24362ae7ac94e10c164"} Apr 22 19:24:27.529717 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:27.529698 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f6fb44f9-n2844" Apr 22 19:24:27.531244 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:27.531227 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f6fb44f9-n2844" Apr 22 19:24:27.547590 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:27.547551 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-hcbdv" podStartSLOduration=33.258091663 podStartE2EDuration="39.54753994s" podCreationTimestamp="2026-04-22 19:23:48 +0000 UTC" firstStartedPulling="2026-04-22 
19:24:20.935012234 +0000 UTC m=+38.218682688" lastFinishedPulling="2026-04-22 19:24:27.224460514 +0000 UTC m=+44.508130965" observedRunningTime="2026-04-22 19:24:27.547291405 +0000 UTC m=+44.830961879" watchObservedRunningTime="2026-04-22 19:24:27.54753994 +0000 UTC m=+44.831210412" Apr 22 19:24:27.566310 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:27.566273 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f6fb44f9-n2844" podStartSLOduration=1.537388236 podStartE2EDuration="7.566263475s" podCreationTimestamp="2026-04-22 19:24:20 +0000 UTC" firstStartedPulling="2026-04-22 19:24:21.210297468 +0000 UTC m=+38.493967920" lastFinishedPulling="2026-04-22 19:24:27.239172693 +0000 UTC m=+44.522843159" observedRunningTime="2026-04-22 19:24:27.565445265 +0000 UTC m=+44.849115761" watchObservedRunningTime="2026-04-22 19:24:27.566263475 +0000 UTC m=+44.849933945" Apr 22 19:24:30.904488 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:30.904446 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-registry-tls\") pod \"image-registry-658bf49c4d-qxl4w\" (UID: \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\") " pod="openshift-image-registry/image-registry-658bf49c4d-qxl4w" Apr 22 19:24:30.904488 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:30.904490 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8207bbac-ad34-4f98-a8af-1a2daaa6ea59-cert\") pod \"ingress-canary-qsnbz\" (UID: \"8207bbac-ad34-4f98-a8af-1a2daaa6ea59\") " pod="openshift-ingress-canary/ingress-canary-qsnbz" Apr 22 19:24:30.905034 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:30.904626 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:24:30.905034 
ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:30.904651 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:24:30.905034 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:30.904677 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-658bf49c4d-qxl4w: secret "image-registry-tls" not found Apr 22 19:24:30.905034 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:30.904701 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8207bbac-ad34-4f98-a8af-1a2daaa6ea59-cert podName:8207bbac-ad34-4f98-a8af-1a2daaa6ea59 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:46.904686238 +0000 UTC m=+64.188356689 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8207bbac-ad34-4f98-a8af-1a2daaa6ea59-cert") pod "ingress-canary-qsnbz" (UID: "8207bbac-ad34-4f98-a8af-1a2daaa6ea59") : secret "canary-serving-cert" not found Apr 22 19:24:30.905034 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:30.904747 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-registry-tls podName:4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e nodeName:}" failed. No retries permitted until 2026-04-22 19:24:46.90472705 +0000 UTC m=+64.188397502 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-registry-tls") pod "image-registry-658bf49c4d-qxl4w" (UID: "4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e") : secret "image-registry-tls" not found Apr 22 19:24:31.005063 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:31.005030 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be5b5131-1460-4746-96a8-4720ed712cf1-metrics-tls\") pod \"dns-default-lnt5s\" (UID: \"be5b5131-1460-4746-96a8-4720ed712cf1\") " pod="openshift-dns/dns-default-lnt5s" Apr 22 19:24:31.005225 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:31.005177 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:24:31.005264 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:31.005236 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be5b5131-1460-4746-96a8-4720ed712cf1-metrics-tls podName:be5b5131-1460-4746-96a8-4720ed712cf1 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:47.005219478 +0000 UTC m=+64.288889931 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/be5b5131-1460-4746-96a8-4720ed712cf1-metrics-tls") pod "dns-default-lnt5s" (UID: "be5b5131-1460-4746-96a8-4720ed712cf1") : secret "dns-default-metrics-tls" not found Apr 22 19:24:31.710044 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:31.710006 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/76e16566-d9dd-4f0b-9cc5-a72e27e4518a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-2cwpp\" (UID: \"76e16566-d9dd-4f0b-9cc5-a72e27e4518a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2cwpp" Apr 22 19:24:31.710231 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:31.710156 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 19:24:31.710231 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:31.710218 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76e16566-d9dd-4f0b-9cc5-a72e27e4518a-networking-console-plugin-cert podName:76e16566-d9dd-4f0b-9cc5-a72e27e4518a nodeName:}" failed. No retries permitted until 2026-04-22 19:24:47.710200535 +0000 UTC m=+64.993871004 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/76e16566-d9dd-4f0b-9cc5-a72e27e4518a-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-2cwpp" (UID: "76e16566-d9dd-4f0b-9cc5-a72e27e4518a") : secret "networking-console-plugin-cert" not found Apr 22 19:24:41.491173 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:41.491140 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pqkn7" Apr 22 19:24:46.920835 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:46.920796 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-registry-tls\") pod \"image-registry-658bf49c4d-qxl4w\" (UID: \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\") " pod="openshift-image-registry/image-registry-658bf49c4d-qxl4w" Apr 22 19:24:46.920835 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:46.920833 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8207bbac-ad34-4f98-a8af-1a2daaa6ea59-cert\") pod \"ingress-canary-qsnbz\" (UID: \"8207bbac-ad34-4f98-a8af-1a2daaa6ea59\") " pod="openshift-ingress-canary/ingress-canary-qsnbz" Apr 22 19:24:46.921279 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:46.920973 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:24:46.921279 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:46.921038 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8207bbac-ad34-4f98-a8af-1a2daaa6ea59-cert podName:8207bbac-ad34-4f98-a8af-1a2daaa6ea59 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:18.921023842 +0000 UTC m=+96.204694293 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8207bbac-ad34-4f98-a8af-1a2daaa6ea59-cert") pod "ingress-canary-qsnbz" (UID: "8207bbac-ad34-4f98-a8af-1a2daaa6ea59") : secret "canary-serving-cert" not found Apr 22 19:24:46.921279 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:46.920976 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:24:46.921279 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:46.921072 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-658bf49c4d-qxl4w: secret "image-registry-tls" not found Apr 22 19:24:46.921279 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:46.921142 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-registry-tls podName:4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e nodeName:}" failed. No retries permitted until 2026-04-22 19:25:18.92112766 +0000 UTC m=+96.204798111 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-registry-tls") pod "image-registry-658bf49c4d-qxl4w" (UID: "4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e") : secret "image-registry-tls" not found Apr 22 19:24:47.022176 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:47.022110 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be5b5131-1460-4746-96a8-4720ed712cf1-metrics-tls\") pod \"dns-default-lnt5s\" (UID: \"be5b5131-1460-4746-96a8-4720ed712cf1\") " pod="openshift-dns/dns-default-lnt5s" Apr 22 19:24:47.022362 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:47.022244 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:24:47.022362 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:47.022311 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be5b5131-1460-4746-96a8-4720ed712cf1-metrics-tls podName:be5b5131-1460-4746-96a8-4720ed712cf1 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:19.022295198 +0000 UTC m=+96.305965668 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/be5b5131-1460-4746-96a8-4720ed712cf1-metrics-tls") pod "dns-default-lnt5s" (UID: "be5b5131-1460-4746-96a8-4720ed712cf1") : secret "dns-default-metrics-tls" not found Apr 22 19:24:47.727989 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:47.727948 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/76e16566-d9dd-4f0b-9cc5-a72e27e4518a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-2cwpp\" (UID: \"76e16566-d9dd-4f0b-9cc5-a72e27e4518a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2cwpp" Apr 22 19:24:47.728169 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:47.728099 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 19:24:47.728169 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:47.728161 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76e16566-d9dd-4f0b-9cc5-a72e27e4518a-networking-console-plugin-cert podName:76e16566-d9dd-4f0b-9cc5-a72e27e4518a nodeName:}" failed. No retries permitted until 2026-04-22 19:25:19.728146925 +0000 UTC m=+97.011817375 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/76e16566-d9dd-4f0b-9cc5-a72e27e4518a-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-2cwpp" (UID: "76e16566-d9dd-4f0b-9cc5-a72e27e4518a") : secret "networking-console-plugin-cert" not found
Apr 22 19:24:49.038876 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:49.038831 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93109160-9bbe-497f-9b25-d7fa7e08508f-metrics-certs\") pod \"network-metrics-daemon-mdzdp\" (UID: \"93109160-9bbe-497f-9b25-d7fa7e08508f\") " pod="openshift-multus/network-metrics-daemon-mdzdp"
Apr 22 19:24:49.039462 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:49.038985 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 19:24:49.039462 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:24:49.039081 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93109160-9bbe-497f-9b25-d7fa7e08508f-metrics-certs podName:93109160-9bbe-497f-9b25-d7fa7e08508f nodeName:}" failed. No retries permitted until 2026-04-22 19:25:53.039065339 +0000 UTC m=+130.322735790 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93109160-9bbe-497f-9b25-d7fa7e08508f-metrics-certs") pod "network-metrics-daemon-mdzdp" (UID: "93109160-9bbe-497f-9b25-d7fa7e08508f") : secret "metrics-daemon-secret" not found
Apr 22 19:24:52.520714 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:24:52.520684 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-jzltp"
Apr 22 19:25:18.960352 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:25:18.960212 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-registry-tls\") pod \"image-registry-658bf49c4d-qxl4w\" (UID: \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\") " pod="openshift-image-registry/image-registry-658bf49c4d-qxl4w"
Apr 22 19:25:18.960352 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:25:18.960250 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8207bbac-ad34-4f98-a8af-1a2daaa6ea59-cert\") pod \"ingress-canary-qsnbz\" (UID: \"8207bbac-ad34-4f98-a8af-1a2daaa6ea59\") " pod="openshift-ingress-canary/ingress-canary-qsnbz"
Apr 22 19:25:18.960352 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:25:18.960338 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:25:18.961054 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:25:18.960364 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 19:25:18.961054 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:25:18.960385 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-658bf49c4d-qxl4w: secret "image-registry-tls" not found
Apr 22 19:25:18.961054 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:25:18.960397 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8207bbac-ad34-4f98-a8af-1a2daaa6ea59-cert podName:8207bbac-ad34-4f98-a8af-1a2daaa6ea59 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:22.960383785 +0000 UTC m=+160.244054236 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8207bbac-ad34-4f98-a8af-1a2daaa6ea59-cert") pod "ingress-canary-qsnbz" (UID: "8207bbac-ad34-4f98-a8af-1a2daaa6ea59") : secret "canary-serving-cert" not found
Apr 22 19:25:18.961054 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:25:18.960891 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-registry-tls podName:4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e nodeName:}" failed. No retries permitted until 2026-04-22 19:26:22.960440752 +0000 UTC m=+160.244111203 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-registry-tls") pod "image-registry-658bf49c4d-qxl4w" (UID: "4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e") : secret "image-registry-tls" not found
Apr 22 19:25:19.060949 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:25:19.060902 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be5b5131-1460-4746-96a8-4720ed712cf1-metrics-tls\") pod \"dns-default-lnt5s\" (UID: \"be5b5131-1460-4746-96a8-4720ed712cf1\") " pod="openshift-dns/dns-default-lnt5s"
Apr 22 19:25:19.061130 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:25:19.061052 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:25:19.061130 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:25:19.061117 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be5b5131-1460-4746-96a8-4720ed712cf1-metrics-tls podName:be5b5131-1460-4746-96a8-4720ed712cf1 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:23.061100711 +0000 UTC m=+160.344771161 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/be5b5131-1460-4746-96a8-4720ed712cf1-metrics-tls") pod "dns-default-lnt5s" (UID: "be5b5131-1460-4746-96a8-4720ed712cf1") : secret "dns-default-metrics-tls" not found
Apr 22 19:25:19.766425 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:25:19.766383 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/76e16566-d9dd-4f0b-9cc5-a72e27e4518a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-2cwpp\" (UID: \"76e16566-d9dd-4f0b-9cc5-a72e27e4518a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2cwpp"
Apr 22 19:25:19.766611 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:25:19.766531 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 19:25:19.766611 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:25:19.766603 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76e16566-d9dd-4f0b-9cc5-a72e27e4518a-networking-console-plugin-cert podName:76e16566-d9dd-4f0b-9cc5-a72e27e4518a nodeName:}" failed. No retries permitted until 2026-04-22 19:26:23.76658121 +0000 UTC m=+161.050251661 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/76e16566-d9dd-4f0b-9cc5-a72e27e4518a-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-2cwpp" (UID: "76e16566-d9dd-4f0b-9cc5-a72e27e4518a") : secret "networking-console-plugin-cert" not found
Apr 22 19:25:53.119516 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:25:53.119471 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93109160-9bbe-497f-9b25-d7fa7e08508f-metrics-certs\") pod \"network-metrics-daemon-mdzdp\" (UID: \"93109160-9bbe-497f-9b25-d7fa7e08508f\") " pod="openshift-multus/network-metrics-daemon-mdzdp"
Apr 22 19:25:53.120023 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:25:53.119624 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 19:25:53.120023 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:25:53.119704 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93109160-9bbe-497f-9b25-d7fa7e08508f-metrics-certs podName:93109160-9bbe-497f-9b25-d7fa7e08508f nodeName:}" failed. No retries permitted until 2026-04-22 19:27:55.119683962 +0000 UTC m=+252.403354419 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93109160-9bbe-497f-9b25-d7fa7e08508f-metrics-certs") pod "network-metrics-daemon-mdzdp" (UID: "93109160-9bbe-497f-9b25-d7fa7e08508f") : secret "metrics-daemon-secret" not found
Apr 22 19:26:18.072376 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:26:18.072328 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-658bf49c4d-qxl4w" podUID="4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e"
Apr 22 19:26:18.099732 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:26:18.099693 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-qsnbz" podUID="8207bbac-ad34-4f98-a8af-1a2daaa6ea59"
Apr 22 19:26:18.119004 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:26:18.118979 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-lnt5s" podUID="be5b5131-1460-4746-96a8-4720ed712cf1"
Apr 22 19:26:18.364711 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:26:18.364620 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-mdzdp" podUID="93109160-9bbe-497f-9b25-d7fa7e08508f"
Apr 22 19:26:18.756723 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:18.756695 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-658bf49c4d-qxl4w"
Apr 22 19:26:18.756955 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:18.756695 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qsnbz"
Apr 22 19:26:18.756955 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:18.756695 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lnt5s"
Apr 22 19:26:18.867671 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:26:18.867632 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-2cwpp" podUID="76e16566-d9dd-4f0b-9cc5-a72e27e4518a"
Apr 22 19:26:19.758237 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:19.758208 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-2cwpp"
Apr 22 19:26:22.301612 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:22.301586 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zsm2s_e794b825-b003-4cc7-9af6-8dd829fbea84/dns-node-resolver/0.log"
Apr 22 19:26:23.041355 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:23.041321 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-registry-tls\") pod \"image-registry-658bf49c4d-qxl4w\" (UID: \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\") " pod="openshift-image-registry/image-registry-658bf49c4d-qxl4w"
Apr 22 19:26:23.041355 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:23.041362 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8207bbac-ad34-4f98-a8af-1a2daaa6ea59-cert\") pod \"ingress-canary-qsnbz\" (UID: \"8207bbac-ad34-4f98-a8af-1a2daaa6ea59\") " pod="openshift-ingress-canary/ingress-canary-qsnbz"
Apr 22 19:26:23.041599 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:26:23.041473 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 19:26:23.041599 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:26:23.041498 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-658bf49c4d-qxl4w: secret "image-registry-tls" not found
Apr 22 19:26:23.041599 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:26:23.041476 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:26:23.041599 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:26:23.041557 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-registry-tls podName:4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e nodeName:}" failed. No retries permitted until 2026-04-22 19:28:25.041539019 +0000 UTC m=+282.325209470 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-registry-tls") pod "image-registry-658bf49c4d-qxl4w" (UID: "4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e") : secret "image-registry-tls" not found
Apr 22 19:26:23.041599 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:26:23.041573 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8207bbac-ad34-4f98-a8af-1a2daaa6ea59-cert podName:8207bbac-ad34-4f98-a8af-1a2daaa6ea59 nodeName:}" failed. No retries permitted until 2026-04-22 19:28:25.04156504 +0000 UTC m=+282.325235491 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8207bbac-ad34-4f98-a8af-1a2daaa6ea59-cert") pod "ingress-canary-qsnbz" (UID: "8207bbac-ad34-4f98-a8af-1a2daaa6ea59") : secret "canary-serving-cert" not found
Apr 22 19:26:23.102403 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:23.102375 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-95pbb_fb2a25eb-d7fc-4fe6-a965-be2f517c03ab/node-ca/0.log"
Apr 22 19:26:23.142200 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:23.142168 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be5b5131-1460-4746-96a8-4720ed712cf1-metrics-tls\") pod \"dns-default-lnt5s\" (UID: \"be5b5131-1460-4746-96a8-4720ed712cf1\") " pod="openshift-dns/dns-default-lnt5s"
Apr 22 19:26:23.142365 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:26:23.142283 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:26:23.142365 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:26:23.142340 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be5b5131-1460-4746-96a8-4720ed712cf1-metrics-tls podName:be5b5131-1460-4746-96a8-4720ed712cf1 nodeName:}" failed. No retries permitted until 2026-04-22 19:28:25.142326167 +0000 UTC m=+282.425996617 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/be5b5131-1460-4746-96a8-4720ed712cf1-metrics-tls") pod "dns-default-lnt5s" (UID: "be5b5131-1460-4746-96a8-4720ed712cf1") : secret "dns-default-metrics-tls" not found
Apr 22 19:26:23.846265 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:23.846209 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/76e16566-d9dd-4f0b-9cc5-a72e27e4518a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-2cwpp\" (UID: \"76e16566-d9dd-4f0b-9cc5-a72e27e4518a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2cwpp"
Apr 22 19:26:23.846750 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:26:23.846358 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 19:26:23.846750 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:26:23.846433 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76e16566-d9dd-4f0b-9cc5-a72e27e4518a-networking-console-plugin-cert podName:76e16566-d9dd-4f0b-9cc5-a72e27e4518a nodeName:}" failed. No retries permitted until 2026-04-22 19:28:25.846416199 +0000 UTC m=+283.130086649 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/76e16566-d9dd-4f0b-9cc5-a72e27e4518a-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-2cwpp" (UID: "76e16566-d9dd-4f0b-9cc5-a72e27e4518a") : secret "networking-console-plugin-cert" not found
Apr 22 19:26:26.677418 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:26.677383 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8fqmb"]
Apr 22 19:26:26.680160 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:26.680144 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8fqmb"
Apr 22 19:26:26.682563 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:26.682537 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 22 19:26:26.682817 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:26.682800 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 22 19:26:26.682873 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:26.682840 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 22 19:26:26.683922 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:26.683905 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-jfc7g\""
Apr 22 19:26:26.684013 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:26.683945 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 22 19:26:26.689423 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:26.689403 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8fqmb"]
Apr 22 19:26:26.767829 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:26.767771 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwmw2\" (UniqueName: \"kubernetes.io/projected/0185be2e-9ff1-4f5d-a6ea-d439437f5421-kube-api-access-gwmw2\") pod \"service-ca-operator-d6fc45fc5-8fqmb\" (UID: \"0185be2e-9ff1-4f5d-a6ea-d439437f5421\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8fqmb"
Apr 22 19:26:26.768004 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:26.767864 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0185be2e-9ff1-4f5d-a6ea-d439437f5421-serving-cert\") pod \"service-ca-operator-d6fc45fc5-8fqmb\" (UID: \"0185be2e-9ff1-4f5d-a6ea-d439437f5421\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8fqmb"
Apr 22 19:26:26.768004 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:26.767894 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0185be2e-9ff1-4f5d-a6ea-d439437f5421-config\") pod \"service-ca-operator-d6fc45fc5-8fqmb\" (UID: \"0185be2e-9ff1-4f5d-a6ea-d439437f5421\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8fqmb"
Apr 22 19:26:26.868547 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:26.868510 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0185be2e-9ff1-4f5d-a6ea-d439437f5421-config\") pod \"service-ca-operator-d6fc45fc5-8fqmb\" (UID: \"0185be2e-9ff1-4f5d-a6ea-d439437f5421\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8fqmb"
Apr 22 19:26:26.868716 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:26.868575 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwmw2\" (UniqueName: \"kubernetes.io/projected/0185be2e-9ff1-4f5d-a6ea-d439437f5421-kube-api-access-gwmw2\") pod \"service-ca-operator-d6fc45fc5-8fqmb\" (UID: \"0185be2e-9ff1-4f5d-a6ea-d439437f5421\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8fqmb"
Apr 22 19:26:26.868716 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:26.868627 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0185be2e-9ff1-4f5d-a6ea-d439437f5421-serving-cert\") pod \"service-ca-operator-d6fc45fc5-8fqmb\" (UID: \"0185be2e-9ff1-4f5d-a6ea-d439437f5421\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8fqmb"
Apr 22 19:26:26.869102 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:26.869073 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0185be2e-9ff1-4f5d-a6ea-d439437f5421-config\") pod \"service-ca-operator-d6fc45fc5-8fqmb\" (UID: \"0185be2e-9ff1-4f5d-a6ea-d439437f5421\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8fqmb"
Apr 22 19:26:26.870847 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:26.870828 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0185be2e-9ff1-4f5d-a6ea-d439437f5421-serving-cert\") pod \"service-ca-operator-d6fc45fc5-8fqmb\" (UID: \"0185be2e-9ff1-4f5d-a6ea-d439437f5421\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8fqmb"
Apr 22 19:26:26.876443 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:26.876416 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwmw2\" (UniqueName: \"kubernetes.io/projected/0185be2e-9ff1-4f5d-a6ea-d439437f5421-kube-api-access-gwmw2\") pod \"service-ca-operator-d6fc45fc5-8fqmb\" (UID: \"0185be2e-9ff1-4f5d-a6ea-d439437f5421\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8fqmb"
Apr 22 19:26:26.989342 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:26.989239 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8fqmb"
Apr 22 19:26:27.107020 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:27.106991 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8fqmb"]
Apr 22 19:26:27.530491 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:27.530421 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f6fb44f9-n2844" podUID="09d7293f-3362-40d6-8f2d-17b0e9e06c9f" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.11:8000/readyz\": dial tcp 10.132.0.11:8000: connect: connection refused"
Apr 22 19:26:27.774620 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:27.774586 2574 generic.go:358] "Generic (PLEG): container finished" podID="09d7293f-3362-40d6-8f2d-17b0e9e06c9f" containerID="c175bc8a80d8a19c535fbd177fe3c6193afd79274069b24362ae7ac94e10c164" exitCode=1
Apr 22 19:26:27.775081 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:27.774666 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f6fb44f9-n2844" event={"ID":"09d7293f-3362-40d6-8f2d-17b0e9e06c9f","Type":"ContainerDied","Data":"c175bc8a80d8a19c535fbd177fe3c6193afd79274069b24362ae7ac94e10c164"}
Apr 22 19:26:27.775081 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:27.775057 2574 scope.go:117] "RemoveContainer" containerID="c175bc8a80d8a19c535fbd177fe3c6193afd79274069b24362ae7ac94e10c164"
Apr 22 19:26:27.775707 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:27.775686 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8fqmb" event={"ID":"0185be2e-9ff1-4f5d-a6ea-d439437f5421","Type":"ContainerStarted","Data":"4c31631bbc953eec724a2e7d12adbcb637d6f379cec0a183a2c859896c2eaa0d"}
Apr 22 19:26:28.779316 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:28.779278 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f6fb44f9-n2844" event={"ID":"09d7293f-3362-40d6-8f2d-17b0e9e06c9f","Type":"ContainerStarted","Data":"66613ceeba893a2193a5e68e4488400fb987fa053f7efcca6cb12f62318c02b9"}
Apr 22 19:26:28.779727 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:28.779538 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f6fb44f9-n2844"
Apr 22 19:26:28.780311 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:28.780292 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f6fb44f9-n2844"
Apr 22 19:26:29.345600 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:29.345520 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdzdp"
Apr 22 19:26:29.783510 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:29.783474 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8fqmb" event={"ID":"0185be2e-9ff1-4f5d-a6ea-d439437f5421","Type":"ContainerStarted","Data":"ce034462fe819e5908ea779ef67b060d17a41e98714ec6771232a5867b15f634"}
Apr 22 19:26:29.800017 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:29.799977 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8fqmb" podStartSLOduration=1.91361073 podStartE2EDuration="3.799961485s" podCreationTimestamp="2026-04-22 19:26:26 +0000 UTC" firstStartedPulling="2026-04-22 19:26:27.111991036 +0000 UTC m=+164.395661486" lastFinishedPulling="2026-04-22 19:26:28.998341777 +0000 UTC m=+166.282012241" observedRunningTime="2026-04-22 19:26:29.799882215 +0000 UTC m=+167.083552710" watchObservedRunningTime="2026-04-22 19:26:29.799961485 +0000 UTC m=+167.083631958"
Apr 22 19:26:30.343097 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:30.343062 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-bwzd9"]
Apr 22 19:26:30.346185 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:30.346171 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bwzd9"
Apr 22 19:26:30.348731 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:30.348705 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 22 19:26:30.348731 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:30.348712 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 22 19:26:30.348949 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:30.348743 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-gmgd2\""
Apr 22 19:26:30.355577 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:30.355553 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-bwzd9"]
Apr 22 19:26:30.397490 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:30.397458 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2nvf\" (UniqueName: \"kubernetes.io/projected/73b8fc2e-2383-453a-813a-8253bb341485-kube-api-access-w2nvf\") pod \"migrator-74bb7799d9-bwzd9\" (UID: \"73b8fc2e-2383-453a-813a-8253bb341485\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bwzd9"
Apr 22 19:26:30.497950 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:30.497914 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w2nvf\" (UniqueName: \"kubernetes.io/projected/73b8fc2e-2383-453a-813a-8253bb341485-kube-api-access-w2nvf\") pod \"migrator-74bb7799d9-bwzd9\" (UID: \"73b8fc2e-2383-453a-813a-8253bb341485\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bwzd9"
Apr 22 19:26:30.509415 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:30.509386 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2nvf\" (UniqueName: \"kubernetes.io/projected/73b8fc2e-2383-453a-813a-8253bb341485-kube-api-access-w2nvf\") pod \"migrator-74bb7799d9-bwzd9\" (UID: \"73b8fc2e-2383-453a-813a-8253bb341485\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bwzd9"
Apr 22 19:26:30.654725 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:30.654643 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bwzd9"
Apr 22 19:26:30.777410 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:30.777375 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-bwzd9"]
Apr 22 19:26:30.779974 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:26:30.779947 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73b8fc2e_2383_453a_813a_8253bb341485.slice/crio-9c6e7cc9d152da8392a496cd594321de9727e7455ab6e5ddd566cad13e1127e4 WatchSource:0}: Error finding container 9c6e7cc9d152da8392a496cd594321de9727e7455ab6e5ddd566cad13e1127e4: Status 404 returned error can't find the container with id 9c6e7cc9d152da8392a496cd594321de9727e7455ab6e5ddd566cad13e1127e4
Apr 22 19:26:30.787369 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:30.787343 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bwzd9" event={"ID":"73b8fc2e-2383-453a-813a-8253bb341485","Type":"ContainerStarted","Data":"9c6e7cc9d152da8392a496cd594321de9727e7455ab6e5ddd566cad13e1127e4"}
Apr 22 19:26:32.793279 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:32.793243 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bwzd9" event={"ID":"73b8fc2e-2383-453a-813a-8253bb341485","Type":"ContainerStarted","Data":"b44f8a6882073597604d2f9401d1c46d13f2b3bf13c8b352bbf143e37cb939bb"}
Apr 22 19:26:32.793279 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:32.793281 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bwzd9" event={"ID":"73b8fc2e-2383-453a-813a-8253bb341485","Type":"ContainerStarted","Data":"49083500bb56d5b08a6a4c6eae1fde496eb94c7e0ea1e3519a7a326ceb4d818e"}
Apr 22 19:26:32.811013 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:32.810969 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bwzd9" podStartSLOduration=1.6960889909999999 podStartE2EDuration="2.810956291s" podCreationTimestamp="2026-04-22 19:26:30 +0000 UTC" firstStartedPulling="2026-04-22 19:26:30.781905759 +0000 UTC m=+168.065576226" lastFinishedPulling="2026-04-22 19:26:31.89677307 +0000 UTC m=+169.180443526" observedRunningTime="2026-04-22 19:26:32.809676589 +0000 UTC m=+170.093347062" watchObservedRunningTime="2026-04-22 19:26:32.810956291 +0000 UTC m=+170.094626764"
Apr 22 19:26:53.316540 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:53.316507 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-8w2jt"]
Apr 22 19:26:53.318445 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:53.318428 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8w2jt"
Apr 22 19:26:53.321350 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:53.321328 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-pr677\""
Apr 22 19:26:53.321488 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:53.321446 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 22 19:26:53.322567 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:53.322544 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 22 19:26:53.322680 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:53.322658 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 22 19:26:53.326876 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:53.326849 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 22 19:26:53.337407 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:53.337385 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8w2jt"]
Apr 22 19:26:53.375055 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:53.375024 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk94s\" (UniqueName: \"kubernetes.io/projected/d5544954-5800-4b59-b1a1-0e15d44a7aeb-kube-api-access-sk94s\") pod \"insights-runtime-extractor-8w2jt\" (UID: \"d5544954-5800-4b59-b1a1-0e15d44a7aeb\") " pod="openshift-insights/insights-runtime-extractor-8w2jt"
Apr 22 19:26:53.375055 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:53.375055 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d5544954-5800-4b59-b1a1-0e15d44a7aeb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8w2jt\" (UID: \"d5544954-5800-4b59-b1a1-0e15d44a7aeb\") " pod="openshift-insights/insights-runtime-extractor-8w2jt"
Apr 22 19:26:53.375222 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:53.375091 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d5544954-5800-4b59-b1a1-0e15d44a7aeb-crio-socket\") pod \"insights-runtime-extractor-8w2jt\" (UID: \"d5544954-5800-4b59-b1a1-0e15d44a7aeb\") " pod="openshift-insights/insights-runtime-extractor-8w2jt"
Apr 22 19:26:53.375222 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:53.375133 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d5544954-5800-4b59-b1a1-0e15d44a7aeb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8w2jt\" (UID: \"d5544954-5800-4b59-b1a1-0e15d44a7aeb\") " pod="openshift-insights/insights-runtime-extractor-8w2jt"
Apr 22 19:26:53.375222 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:53.375202 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d5544954-5800-4b59-b1a1-0e15d44a7aeb-data-volume\") pod \"insights-runtime-extractor-8w2jt\" (UID: \"d5544954-5800-4b59-b1a1-0e15d44a7aeb\") " pod="openshift-insights/insights-runtime-extractor-8w2jt"
Apr 22 19:26:53.403468 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:53.403439 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-77g4t"]
Apr 22 19:26:53.405160 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:53.405144 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-77g4t"
Apr 22 19:26:53.408046 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:53.408025 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 22 19:26:53.408179 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:53.408122 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-nr5xz\""
Apr 22 19:26:53.419985 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:53.419962 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-77g4t"]
Apr 22 19:26:53.475922 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:53.475895 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d5544954-5800-4b59-b1a1-0e15d44a7aeb-crio-socket\") pod \"insights-runtime-extractor-8w2jt\" (UID: \"d5544954-5800-4b59-b1a1-0e15d44a7aeb\") " pod="openshift-insights/insights-runtime-extractor-8w2jt"
Apr 22 19:26:53.476072 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:53.475954 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d5544954-5800-4b59-b1a1-0e15d44a7aeb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8w2jt\" (UID: \"d5544954-5800-4b59-b1a1-0e15d44a7aeb\") " pod="openshift-insights/insights-runtime-extractor-8w2jt"
Apr 22 19:26:53.476072 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:53.475990 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d5544954-5800-4b59-b1a1-0e15d44a7aeb-data-volume\") pod \"insights-runtime-extractor-8w2jt\" (UID: \"d5544954-5800-4b59-b1a1-0e15d44a7aeb\") " pod="openshift-insights/insights-runtime-extractor-8w2jt"
Apr 22 19:26:53.476072 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:53.476015 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d5544954-5800-4b59-b1a1-0e15d44a7aeb-crio-socket\") pod \"insights-runtime-extractor-8w2jt\" (UID: \"d5544954-5800-4b59-b1a1-0e15d44a7aeb\") " pod="openshift-insights/insights-runtime-extractor-8w2jt"
Apr 22 19:26:53.476171 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:53.476124 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sk94s\" (UniqueName: \"kubernetes.io/projected/d5544954-5800-4b59-b1a1-0e15d44a7aeb-kube-api-access-sk94s\") pod \"insights-runtime-extractor-8w2jt\" (UID: \"d5544954-5800-4b59-b1a1-0e15d44a7aeb\") " pod="openshift-insights/insights-runtime-extractor-8w2jt"
Apr 22 19:26:53.476171 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:53.476151 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d5544954-5800-4b59-b1a1-0e15d44a7aeb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8w2jt\" (UID: \"d5544954-5800-4b59-b1a1-0e15d44a7aeb\") " pod="openshift-insights/insights-runtime-extractor-8w2jt"
Apr 22 19:26:53.476260 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:53.476196 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/3b1b56ad-1a25-404b-bb00-1eaca8e5e50b-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-77g4t\" (UID: \"3b1b56ad-1a25-404b-bb00-1eaca8e5e50b\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-77g4t"
Apr 22 19:26:53.476382 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:53.476367 2574 operation_generator.go:615]
"MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d5544954-5800-4b59-b1a1-0e15d44a7aeb-data-volume\") pod \"insights-runtime-extractor-8w2jt\" (UID: \"d5544954-5800-4b59-b1a1-0e15d44a7aeb\") " pod="openshift-insights/insights-runtime-extractor-8w2jt" Apr 22 19:26:53.476667 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:53.476644 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d5544954-5800-4b59-b1a1-0e15d44a7aeb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8w2jt\" (UID: \"d5544954-5800-4b59-b1a1-0e15d44a7aeb\") " pod="openshift-insights/insights-runtime-extractor-8w2jt" Apr 22 19:26:53.478314 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:53.478296 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d5544954-5800-4b59-b1a1-0e15d44a7aeb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8w2jt\" (UID: \"d5544954-5800-4b59-b1a1-0e15d44a7aeb\") " pod="openshift-insights/insights-runtime-extractor-8w2jt" Apr 22 19:26:53.485220 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:53.485194 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk94s\" (UniqueName: \"kubernetes.io/projected/d5544954-5800-4b59-b1a1-0e15d44a7aeb-kube-api-access-sk94s\") pod \"insights-runtime-extractor-8w2jt\" (UID: \"d5544954-5800-4b59-b1a1-0e15d44a7aeb\") " pod="openshift-insights/insights-runtime-extractor-8w2jt" Apr 22 19:26:53.577222 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:53.577125 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/3b1b56ad-1a25-404b-bb00-1eaca8e5e50b-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-77g4t\" (UID: \"3b1b56ad-1a25-404b-bb00-1eaca8e5e50b\") " 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-77g4t" Apr 22 19:26:53.579513 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:53.579490 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/3b1b56ad-1a25-404b-bb00-1eaca8e5e50b-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-77g4t\" (UID: \"3b1b56ad-1a25-404b-bb00-1eaca8e5e50b\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-77g4t" Apr 22 19:26:53.626883 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:53.626856 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8w2jt" Apr 22 19:26:53.714578 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:53.713948 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-77g4t" Apr 22 19:26:53.745231 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:53.745201 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8w2jt"] Apr 22 19:26:53.748503 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:26:53.748465 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5544954_5800_4b59_b1a1_0e15d44a7aeb.slice/crio-225372560a1d7f05b5da34a5ca5726e1a31b7190633b93af50035e23c66bcb8b WatchSource:0}: Error finding container 225372560a1d7f05b5da34a5ca5726e1a31b7190633b93af50035e23c66bcb8b: Status 404 returned error can't find the container with id 225372560a1d7f05b5da34a5ca5726e1a31b7190633b93af50035e23c66bcb8b Apr 22 19:26:53.834518 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:53.834492 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-77g4t"] Apr 22 19:26:53.837397 
ip-10-0-134-22 kubenswrapper[2574]: W0422 19:26:53.837355 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b1b56ad_1a25_404b_bb00_1eaca8e5e50b.slice/crio-04860a192a2e8ef71f1a9839061152c711a13515d02f73aae0fe18284fc9e865 WatchSource:0}: Error finding container 04860a192a2e8ef71f1a9839061152c711a13515d02f73aae0fe18284fc9e865: Status 404 returned error can't find the container with id 04860a192a2e8ef71f1a9839061152c711a13515d02f73aae0fe18284fc9e865 Apr 22 19:26:53.844818 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:53.844790 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-77g4t" event={"ID":"3b1b56ad-1a25-404b-bb00-1eaca8e5e50b","Type":"ContainerStarted","Data":"04860a192a2e8ef71f1a9839061152c711a13515d02f73aae0fe18284fc9e865"} Apr 22 19:26:53.845842 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:53.845826 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8w2jt" event={"ID":"d5544954-5800-4b59-b1a1-0e15d44a7aeb","Type":"ContainerStarted","Data":"dc7d5cea89017a0c99896da1d3ec22fce3df3adc84f7bcc048b53aad22be1886"} Apr 22 19:26:53.845914 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:53.845850 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8w2jt" event={"ID":"d5544954-5800-4b59-b1a1-0e15d44a7aeb","Type":"ContainerStarted","Data":"225372560a1d7f05b5da34a5ca5726e1a31b7190633b93af50035e23c66bcb8b"} Apr 22 19:26:54.849713 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:54.849679 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8w2jt" event={"ID":"d5544954-5800-4b59-b1a1-0e15d44a7aeb","Type":"ContainerStarted","Data":"cc5b788052c170981519332f0e15ed8b84f79ed8a201782b834406b966e2f796"} Apr 22 19:26:55.855996 ip-10-0-134-22 
kubenswrapper[2574]: I0422 19:26:55.855964 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-77g4t" event={"ID":"3b1b56ad-1a25-404b-bb00-1eaca8e5e50b","Type":"ContainerStarted","Data":"d7f53890d5397569269f601f0e3b37325b3e5b56e81bc919d2035e0ba58c83c1"} Apr 22 19:26:55.856411 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:55.856159 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-77g4t" Apr 22 19:26:55.860749 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:55.860726 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-77g4t" Apr 22 19:26:55.882258 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:55.882212 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-77g4t" podStartSLOduration=1.8682525490000002 podStartE2EDuration="2.882192832s" podCreationTimestamp="2026-04-22 19:26:53 +0000 UTC" firstStartedPulling="2026-04-22 19:26:53.838913637 +0000 UTC m=+191.122584088" lastFinishedPulling="2026-04-22 19:26:54.852853916 +0000 UTC m=+192.136524371" observedRunningTime="2026-04-22 19:26:55.881657848 +0000 UTC m=+193.165328338" watchObservedRunningTime="2026-04-22 19:26:55.882192832 +0000 UTC m=+193.165863305" Apr 22 19:26:56.733369 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:56.733332 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-ppqd5"] Apr 22 19:26:56.735211 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:56.735194 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-ppqd5" Apr 22 19:26:56.738720 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:56.738701 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 22 19:26:56.739672 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:56.739654 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 19:26:56.739766 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:56.739700 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 19:26:56.740145 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:56.740129 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-b7qz8\"" Apr 22 19:26:56.740145 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:56.740140 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 22 19:26:56.740261 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:56.740143 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 19:26:56.746520 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:56.746502 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-ppqd5"] Apr 22 19:26:56.803506 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:56.803484 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/172d410c-869f-4053-99a6-d41927b1b8e3-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-ppqd5\" (UID: \"172d410c-869f-4053-99a6-d41927b1b8e3\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-ppqd5" Apr 22 19:26:56.803626 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:56.803530 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d459n\" (UniqueName: \"kubernetes.io/projected/172d410c-869f-4053-99a6-d41927b1b8e3-kube-api-access-d459n\") pod \"prometheus-operator-5676c8c784-ppqd5\" (UID: \"172d410c-869f-4053-99a6-d41927b1b8e3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ppqd5" Apr 22 19:26:56.803626 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:56.803557 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/172d410c-869f-4053-99a6-d41927b1b8e3-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-ppqd5\" (UID: \"172d410c-869f-4053-99a6-d41927b1b8e3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ppqd5" Apr 22 19:26:56.803700 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:56.803619 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/172d410c-869f-4053-99a6-d41927b1b8e3-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-ppqd5\" (UID: \"172d410c-869f-4053-99a6-d41927b1b8e3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ppqd5" Apr 22 19:26:56.860135 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:56.860104 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8w2jt" event={"ID":"d5544954-5800-4b59-b1a1-0e15d44a7aeb","Type":"ContainerStarted","Data":"ca9aa8d9abebecb3001287e63f3d63fadbaf4a05e2237baa7745b13fa44c34dc"} Apr 22 19:26:56.881456 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:56.881407 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-insights/insights-runtime-extractor-8w2jt" podStartSLOduration=1.802176123 podStartE2EDuration="3.881393744s" podCreationTimestamp="2026-04-22 19:26:53 +0000 UTC" firstStartedPulling="2026-04-22 19:26:53.804445808 +0000 UTC m=+191.088116268" lastFinishedPulling="2026-04-22 19:26:55.883663437 +0000 UTC m=+193.167333889" observedRunningTime="2026-04-22 19:26:56.880385005 +0000 UTC m=+194.164055478" watchObservedRunningTime="2026-04-22 19:26:56.881393744 +0000 UTC m=+194.165064270" Apr 22 19:26:56.904536 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:56.904512 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/172d410c-869f-4053-99a6-d41927b1b8e3-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-ppqd5\" (UID: \"172d410c-869f-4053-99a6-d41927b1b8e3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ppqd5" Apr 22 19:26:56.904659 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:56.904552 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d459n\" (UniqueName: \"kubernetes.io/projected/172d410c-869f-4053-99a6-d41927b1b8e3-kube-api-access-d459n\") pod \"prometheus-operator-5676c8c784-ppqd5\" (UID: \"172d410c-869f-4053-99a6-d41927b1b8e3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ppqd5" Apr 22 19:26:56.904659 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:56.904593 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/172d410c-869f-4053-99a6-d41927b1b8e3-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-ppqd5\" (UID: \"172d410c-869f-4053-99a6-d41927b1b8e3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ppqd5" Apr 22 19:26:56.904741 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:26:56.904657 2574 secret.go:189] Couldn't get secret 
openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 22 19:26:56.904741 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:56.904700 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/172d410c-869f-4053-99a6-d41927b1b8e3-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-ppqd5\" (UID: \"172d410c-869f-4053-99a6-d41927b1b8e3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ppqd5" Apr 22 19:26:56.904741 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:26:56.904722 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/172d410c-869f-4053-99a6-d41927b1b8e3-prometheus-operator-tls podName:172d410c-869f-4053-99a6-d41927b1b8e3 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:57.404703081 +0000 UTC m=+194.688373546 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/172d410c-869f-4053-99a6-d41927b1b8e3-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-ppqd5" (UID: "172d410c-869f-4053-99a6-d41927b1b8e3") : secret "prometheus-operator-tls" not found Apr 22 19:26:56.905248 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:56.905230 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/172d410c-869f-4053-99a6-d41927b1b8e3-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-ppqd5\" (UID: \"172d410c-869f-4053-99a6-d41927b1b8e3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ppqd5" Apr 22 19:26:56.907014 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:56.906993 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/172d410c-869f-4053-99a6-d41927b1b8e3-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-ppqd5\" (UID: \"172d410c-869f-4053-99a6-d41927b1b8e3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ppqd5" Apr 22 19:26:56.915762 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:56.915732 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d459n\" (UniqueName: \"kubernetes.io/projected/172d410c-869f-4053-99a6-d41927b1b8e3-kube-api-access-d459n\") pod \"prometheus-operator-5676c8c784-ppqd5\" (UID: \"172d410c-869f-4053-99a6-d41927b1b8e3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ppqd5" Apr 22 19:26:57.409709 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:57.409656 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/172d410c-869f-4053-99a6-d41927b1b8e3-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-ppqd5\" (UID: \"172d410c-869f-4053-99a6-d41927b1b8e3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ppqd5" Apr 22 19:26:57.412004 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:57.411981 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/172d410c-869f-4053-99a6-d41927b1b8e3-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-ppqd5\" (UID: \"172d410c-869f-4053-99a6-d41927b1b8e3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ppqd5" Apr 22 19:26:57.643417 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:57.643381 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-ppqd5" Apr 22 19:26:57.758403 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:57.758377 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-ppqd5"] Apr 22 19:26:57.760751 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:26:57.760723 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod172d410c_869f_4053_99a6_d41927b1b8e3.slice/crio-dd8f52c7fb7c893cddea3ad0036755a23f1e71d12daeb9783e655efb23606947 WatchSource:0}: Error finding container dd8f52c7fb7c893cddea3ad0036755a23f1e71d12daeb9783e655efb23606947: Status 404 returned error can't find the container with id dd8f52c7fb7c893cddea3ad0036755a23f1e71d12daeb9783e655efb23606947 Apr 22 19:26:57.863840 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:57.863803 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-ppqd5" event={"ID":"172d410c-869f-4053-99a6-d41927b1b8e3","Type":"ContainerStarted","Data":"dd8f52c7fb7c893cddea3ad0036755a23f1e71d12daeb9783e655efb23606947"} Apr 22 19:26:59.870695 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:59.870655 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-ppqd5" event={"ID":"172d410c-869f-4053-99a6-d41927b1b8e3","Type":"ContainerStarted","Data":"7bbe962cfcdc531a1b7d252ec01be52722fcd73271c353c0554d46acc937bac0"} Apr 22 19:26:59.870695 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:59.870699 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-ppqd5" event={"ID":"172d410c-869f-4053-99a6-d41927b1b8e3","Type":"ContainerStarted","Data":"64478180a8d5ce8f14cd28fd3e23fb28462edbc1cff310de46ed5c7a6d8760d0"} Apr 22 19:26:59.893041 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:26:59.892963 2574 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-ppqd5" podStartSLOduration=2.695198051 podStartE2EDuration="3.892947794s" podCreationTimestamp="2026-04-22 19:26:56 +0000 UTC" firstStartedPulling="2026-04-22 19:26:57.762539876 +0000 UTC m=+195.046210328" lastFinishedPulling="2026-04-22 19:26:58.960289618 +0000 UTC m=+196.243960071" observedRunningTime="2026-04-22 19:26:59.891888315 +0000 UTC m=+197.175558811" watchObservedRunningTime="2026-04-22 19:26:59.892947794 +0000 UTC m=+197.176618268" Apr 22 19:27:02.101492 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.101450 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-9bw78"] Apr 22 19:27:02.103915 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.103894 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-9bw78" Apr 22 19:27:02.106655 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.106631 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 22 19:27:02.106853 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.106695 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 22 19:27:02.106853 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.106739 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-l6tpm\"" Apr 22 19:27:02.107547 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.107530 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 22 19:27:02.117828 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.117805 2574 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-9bw78"] Apr 22 19:27:02.129346 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.129319 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-mcwhq"] Apr 22 19:27:02.131419 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.131404 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-mcwhq" Apr 22 19:27:02.135959 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.135940 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-b8znt\"" Apr 22 19:27:02.136906 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.136884 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 19:27:02.146517 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.146495 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 19:27:02.146621 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.146579 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 19:27:02.249225 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.249195 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/aae64ea5-a201-4205-b675-7f24942d6b45-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-mcwhq\" (UID: \"aae64ea5-a201-4205-b675-7f24942d6b45\") " pod="openshift-monitoring/node-exporter-mcwhq" Apr 22 19:27:02.249225 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.249231 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f8634c7a-287f-4afc-8a9d-55a3e85c0c45-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-9bw78\" (UID: \"f8634c7a-287f-4afc-8a9d-55a3e85c0c45\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9bw78" Apr 22 19:27:02.249436 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.249271 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8634c7a-287f-4afc-8a9d-55a3e85c0c45-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-9bw78\" (UID: \"f8634c7a-287f-4afc-8a9d-55a3e85c0c45\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9bw78" Apr 22 19:27:02.249436 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.249339 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/aae64ea5-a201-4205-b675-7f24942d6b45-root\") pod \"node-exporter-mcwhq\" (UID: \"aae64ea5-a201-4205-b675-7f24942d6b45\") " pod="openshift-monitoring/node-exporter-mcwhq" Apr 22 19:27:02.249436 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.249395 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aae64ea5-a201-4205-b675-7f24942d6b45-metrics-client-ca\") pod \"node-exporter-mcwhq\" (UID: \"aae64ea5-a201-4205-b675-7f24942d6b45\") " pod="openshift-monitoring/node-exporter-mcwhq" Apr 22 19:27:02.249436 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.249415 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f8634c7a-287f-4afc-8a9d-55a3e85c0c45-metrics-client-ca\") 
pod \"kube-state-metrics-69db897b98-9bw78\" (UID: \"f8634c7a-287f-4afc-8a9d-55a3e85c0c45\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9bw78"
Apr 22 19:27:02.249436 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.249433 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvmml\" (UniqueName: \"kubernetes.io/projected/f8634c7a-287f-4afc-8a9d-55a3e85c0c45-kube-api-access-pvmml\") pod \"kube-state-metrics-69db897b98-9bw78\" (UID: \"f8634c7a-287f-4afc-8a9d-55a3e85c0c45\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9bw78"
Apr 22 19:27:02.249588 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.249457 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/aae64ea5-a201-4205-b675-7f24942d6b45-node-exporter-wtmp\") pod \"node-exporter-mcwhq\" (UID: \"aae64ea5-a201-4205-b675-7f24942d6b45\") " pod="openshift-monitoring/node-exporter-mcwhq"
Apr 22 19:27:02.249588 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.249478 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/aae64ea5-a201-4205-b675-7f24942d6b45-node-exporter-tls\") pod \"node-exporter-mcwhq\" (UID: \"aae64ea5-a201-4205-b675-7f24942d6b45\") " pod="openshift-monitoring/node-exporter-mcwhq"
Apr 22 19:27:02.249588 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.249506 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f8634c7a-287f-4afc-8a9d-55a3e85c0c45-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-9bw78\" (UID: \"f8634c7a-287f-4afc-8a9d-55a3e85c0c45\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9bw78"
Apr 22 19:27:02.249588 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.249540 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77lhm\" (UniqueName: \"kubernetes.io/projected/aae64ea5-a201-4205-b675-7f24942d6b45-kube-api-access-77lhm\") pod \"node-exporter-mcwhq\" (UID: \"aae64ea5-a201-4205-b675-7f24942d6b45\") " pod="openshift-monitoring/node-exporter-mcwhq"
Apr 22 19:27:02.249588 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.249575 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/aae64ea5-a201-4205-b675-7f24942d6b45-node-exporter-textfile\") pod \"node-exporter-mcwhq\" (UID: \"aae64ea5-a201-4205-b675-7f24942d6b45\") " pod="openshift-monitoring/node-exporter-mcwhq"
Apr 22 19:27:02.249735 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.249625 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/aae64ea5-a201-4205-b675-7f24942d6b45-node-exporter-accelerators-collector-config\") pod \"node-exporter-mcwhq\" (UID: \"aae64ea5-a201-4205-b675-7f24942d6b45\") " pod="openshift-monitoring/node-exporter-mcwhq"
Apr 22 19:27:02.249735 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.249666 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f8634c7a-287f-4afc-8a9d-55a3e85c0c45-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-9bw78\" (UID: \"f8634c7a-287f-4afc-8a9d-55a3e85c0c45\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9bw78"
Apr 22 19:27:02.249735 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.249695 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/aae64ea5-a201-4205-b675-7f24942d6b45-sys\") pod \"node-exporter-mcwhq\" (UID: \"aae64ea5-a201-4205-b675-7f24942d6b45\") " pod="openshift-monitoring/node-exporter-mcwhq"
Apr 22 19:27:02.350468 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.350433 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/aae64ea5-a201-4205-b675-7f24942d6b45-node-exporter-tls\") pod \"node-exporter-mcwhq\" (UID: \"aae64ea5-a201-4205-b675-7f24942d6b45\") " pod="openshift-monitoring/node-exporter-mcwhq"
Apr 22 19:27:02.350640 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.350477 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f8634c7a-287f-4afc-8a9d-55a3e85c0c45-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-9bw78\" (UID: \"f8634c7a-287f-4afc-8a9d-55a3e85c0c45\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9bw78"
Apr 22 19:27:02.350640 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.350504 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-77lhm\" (UniqueName: \"kubernetes.io/projected/aae64ea5-a201-4205-b675-7f24942d6b45-kube-api-access-77lhm\") pod \"node-exporter-mcwhq\" (UID: \"aae64ea5-a201-4205-b675-7f24942d6b45\") " pod="openshift-monitoring/node-exporter-mcwhq"
Apr 22 19:27:02.350640 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:27:02.350579 2574 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 22 19:27:02.350825 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.350638 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/aae64ea5-a201-4205-b675-7f24942d6b45-node-exporter-textfile\") pod \"node-exporter-mcwhq\" (UID: \"aae64ea5-a201-4205-b675-7f24942d6b45\") " pod="openshift-monitoring/node-exporter-mcwhq"
Apr 22 19:27:02.350825 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:27:02.350646 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aae64ea5-a201-4205-b675-7f24942d6b45-node-exporter-tls podName:aae64ea5-a201-4205-b675-7f24942d6b45 nodeName:}" failed. No retries permitted until 2026-04-22 19:27:02.850630002 +0000 UTC m=+200.134300454 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/aae64ea5-a201-4205-b675-7f24942d6b45-node-exporter-tls") pod "node-exporter-mcwhq" (UID: "aae64ea5-a201-4205-b675-7f24942d6b45") : secret "node-exporter-tls" not found
Apr 22 19:27:02.350825 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.350748 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/aae64ea5-a201-4205-b675-7f24942d6b45-node-exporter-accelerators-collector-config\") pod \"node-exporter-mcwhq\" (UID: \"aae64ea5-a201-4205-b675-7f24942d6b45\") " pod="openshift-monitoring/node-exporter-mcwhq"
Apr 22 19:27:02.350825 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.350804 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f8634c7a-287f-4afc-8a9d-55a3e85c0c45-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-9bw78\" (UID: \"f8634c7a-287f-4afc-8a9d-55a3e85c0c45\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9bw78"
Apr 22 19:27:02.351044 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.350843 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/aae64ea5-a201-4205-b675-7f24942d6b45-sys\") pod \"node-exporter-mcwhq\" (UID: \"aae64ea5-a201-4205-b675-7f24942d6b45\") " pod="openshift-monitoring/node-exporter-mcwhq"
Apr 22 19:27:02.351044 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.350873 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/aae64ea5-a201-4205-b675-7f24942d6b45-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-mcwhq\" (UID: \"aae64ea5-a201-4205-b675-7f24942d6b45\") " pod="openshift-monitoring/node-exporter-mcwhq"
Apr 22 19:27:02.351044 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.350905 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f8634c7a-287f-4afc-8a9d-55a3e85c0c45-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-9bw78\" (UID: \"f8634c7a-287f-4afc-8a9d-55a3e85c0c45\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9bw78"
Apr 22 19:27:02.351044 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.350924 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/aae64ea5-a201-4205-b675-7f24942d6b45-sys\") pod \"node-exporter-mcwhq\" (UID: \"aae64ea5-a201-4205-b675-7f24942d6b45\") " pod="openshift-monitoring/node-exporter-mcwhq"
Apr 22 19:27:02.351044 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.350907 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f8634c7a-287f-4afc-8a9d-55a3e85c0c45-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-9bw78\" (UID: \"f8634c7a-287f-4afc-8a9d-55a3e85c0c45\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9bw78"
Apr 22 19:27:02.351044 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.350996 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8634c7a-287f-4afc-8a9d-55a3e85c0c45-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-9bw78\" (UID: \"f8634c7a-287f-4afc-8a9d-55a3e85c0c45\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9bw78"
Apr 22 19:27:02.351044 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.351030 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/aae64ea5-a201-4205-b675-7f24942d6b45-node-exporter-textfile\") pod \"node-exporter-mcwhq\" (UID: \"aae64ea5-a201-4205-b675-7f24942d6b45\") " pod="openshift-monitoring/node-exporter-mcwhq"
Apr 22 19:27:02.351044 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.351039 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/aae64ea5-a201-4205-b675-7f24942d6b45-root\") pod \"node-exporter-mcwhq\" (UID: \"aae64ea5-a201-4205-b675-7f24942d6b45\") " pod="openshift-monitoring/node-exporter-mcwhq"
Apr 22 19:27:02.351454 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.351082 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aae64ea5-a201-4205-b675-7f24942d6b45-metrics-client-ca\") pod \"node-exporter-mcwhq\" (UID: \"aae64ea5-a201-4205-b675-7f24942d6b45\") " pod="openshift-monitoring/node-exporter-mcwhq"
Apr 22 19:27:02.351454 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.351114 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f8634c7a-287f-4afc-8a9d-55a3e85c0c45-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-9bw78\" (UID: \"f8634c7a-287f-4afc-8a9d-55a3e85c0c45\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9bw78"
Apr 22 19:27:02.351454 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.351142 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pvmml\" (UniqueName: \"kubernetes.io/projected/f8634c7a-287f-4afc-8a9d-55a3e85c0c45-kube-api-access-pvmml\") pod \"kube-state-metrics-69db897b98-9bw78\" (UID: \"f8634c7a-287f-4afc-8a9d-55a3e85c0c45\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9bw78"
Apr 22 19:27:02.351454 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:27:02.351152 2574 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Apr 22 19:27:02.351454 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.351169 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/aae64ea5-a201-4205-b675-7f24942d6b45-node-exporter-wtmp\") pod \"node-exporter-mcwhq\" (UID: \"aae64ea5-a201-4205-b675-7f24942d6b45\") " pod="openshift-monitoring/node-exporter-mcwhq"
Apr 22 19:27:02.351454 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.351086 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/aae64ea5-a201-4205-b675-7f24942d6b45-root\") pod \"node-exporter-mcwhq\" (UID: \"aae64ea5-a201-4205-b675-7f24942d6b45\") " pod="openshift-monitoring/node-exporter-mcwhq"
Apr 22 19:27:02.351454 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:27:02.351201 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8634c7a-287f-4afc-8a9d-55a3e85c0c45-kube-state-metrics-tls podName:f8634c7a-287f-4afc-8a9d-55a3e85c0c45 nodeName:}" failed. No retries permitted until 2026-04-22 19:27:02.851185634 +0000 UTC m=+200.134856089 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/f8634c7a-287f-4afc-8a9d-55a3e85c0c45-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-9bw78" (UID: "f8634c7a-287f-4afc-8a9d-55a3e85c0c45") : secret "kube-state-metrics-tls" not found
Apr 22 19:27:02.351454 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.351349 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/aae64ea5-a201-4205-b675-7f24942d6b45-node-exporter-wtmp\") pod \"node-exporter-mcwhq\" (UID: \"aae64ea5-a201-4205-b675-7f24942d6b45\") " pod="openshift-monitoring/node-exporter-mcwhq"
Apr 22 19:27:02.351454 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.351368 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/aae64ea5-a201-4205-b675-7f24942d6b45-node-exporter-accelerators-collector-config\") pod \"node-exporter-mcwhq\" (UID: \"aae64ea5-a201-4205-b675-7f24942d6b45\") " pod="openshift-monitoring/node-exporter-mcwhq"
Apr 22 19:27:02.351824 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.351528 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f8634c7a-287f-4afc-8a9d-55a3e85c0c45-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-9bw78\" (UID: \"f8634c7a-287f-4afc-8a9d-55a3e85c0c45\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9bw78"
Apr 22 19:27:02.351824 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.351740 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aae64ea5-a201-4205-b675-7f24942d6b45-metrics-client-ca\") pod \"node-exporter-mcwhq\" (UID: \"aae64ea5-a201-4205-b675-7f24942d6b45\") " pod="openshift-monitoring/node-exporter-mcwhq"
Apr 22 19:27:02.351912 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.351827 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f8634c7a-287f-4afc-8a9d-55a3e85c0c45-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-9bw78\" (UID: \"f8634c7a-287f-4afc-8a9d-55a3e85c0c45\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9bw78"
Apr 22 19:27:02.353208 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.353191 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/aae64ea5-a201-4205-b675-7f24942d6b45-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-mcwhq\" (UID: \"aae64ea5-a201-4205-b675-7f24942d6b45\") " pod="openshift-monitoring/node-exporter-mcwhq"
Apr 22 19:27:02.353296 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.353223 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f8634c7a-287f-4afc-8a9d-55a3e85c0c45-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-9bw78\" (UID: \"f8634c7a-287f-4afc-8a9d-55a3e85c0c45\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9bw78"
Apr 22 19:27:02.359090 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.359069 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-77lhm\" (UniqueName: \"kubernetes.io/projected/aae64ea5-a201-4205-b675-7f24942d6b45-kube-api-access-77lhm\") pod \"node-exporter-mcwhq\" (UID: \"aae64ea5-a201-4205-b675-7f24942d6b45\") " pod="openshift-monitoring/node-exporter-mcwhq"
Apr 22 19:27:02.360222 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.360203 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvmml\" (UniqueName: \"kubernetes.io/projected/f8634c7a-287f-4afc-8a9d-55a3e85c0c45-kube-api-access-pvmml\") pod \"kube-state-metrics-69db897b98-9bw78\" (UID: \"f8634c7a-287f-4afc-8a9d-55a3e85c0c45\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9bw78"
Apr 22 19:27:02.856516 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.856484 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8634c7a-287f-4afc-8a9d-55a3e85c0c45-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-9bw78\" (UID: \"f8634c7a-287f-4afc-8a9d-55a3e85c0c45\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9bw78"
Apr 22 19:27:02.856704 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.856538 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/aae64ea5-a201-4205-b675-7f24942d6b45-node-exporter-tls\") pod \"node-exporter-mcwhq\" (UID: \"aae64ea5-a201-4205-b675-7f24942d6b45\") " pod="openshift-monitoring/node-exporter-mcwhq"
Apr 22 19:27:02.858742 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.858719 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/aae64ea5-a201-4205-b675-7f24942d6b45-node-exporter-tls\") pod \"node-exporter-mcwhq\" (UID: \"aae64ea5-a201-4205-b675-7f24942d6b45\") " pod="openshift-monitoring/node-exporter-mcwhq"
Apr 22 19:27:02.858892 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:02.858798 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8634c7a-287f-4afc-8a9d-55a3e85c0c45-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-9bw78\" (UID: \"f8634c7a-287f-4afc-8a9d-55a3e85c0c45\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9bw78"
Apr 22 19:27:03.013134 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:03.013086 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-9bw78"
Apr 22 19:27:03.040213 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:03.040181 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-mcwhq"
Apr 22 19:27:03.152172 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:03.152148 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-9bw78"]
Apr 22 19:27:03.153767 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:27:03.153737 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8634c7a_287f_4afc_8a9d_55a3e85c0c45.slice/crio-11afa0680e3aa212a5443f7f28d626497a4e6583a1c1c8600f9d60f4460df267 WatchSource:0}: Error finding container 11afa0680e3aa212a5443f7f28d626497a4e6583a1c1c8600f9d60f4460df267: Status 404 returned error can't find the container with id 11afa0680e3aa212a5443f7f28d626497a4e6583a1c1c8600f9d60f4460df267
Apr 22 19:27:03.885076 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:03.885037 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mcwhq" event={"ID":"aae64ea5-a201-4205-b675-7f24942d6b45","Type":"ContainerStarted","Data":"4a7ba65642a4e2ca8c91b848acf745836bbd1fefaa0d91a8e9e56f6a76f3b2af"}
Apr 22 19:27:03.885217 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:03.885094 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mcwhq" event={"ID":"aae64ea5-a201-4205-b675-7f24942d6b45","Type":"ContainerStarted","Data":"52495e2197bcb29f43151a67f279fd5fac0a6ecd8ef4a9acb0abc115fdd806ee"}
Apr 22 19:27:03.886492 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:03.886432 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-9bw78" event={"ID":"f8634c7a-287f-4afc-8a9d-55a3e85c0c45","Type":"ContainerStarted","Data":"11afa0680e3aa212a5443f7f28d626497a4e6583a1c1c8600f9d60f4460df267"}
Apr 22 19:27:04.890299 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:04.890262 2574 generic.go:358] "Generic (PLEG): container finished" podID="aae64ea5-a201-4205-b675-7f24942d6b45" containerID="4a7ba65642a4e2ca8c91b848acf745836bbd1fefaa0d91a8e9e56f6a76f3b2af" exitCode=0
Apr 22 19:27:04.890694 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:04.890337 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mcwhq" event={"ID":"aae64ea5-a201-4205-b675-7f24942d6b45","Type":"ContainerDied","Data":"4a7ba65642a4e2ca8c91b848acf745836bbd1fefaa0d91a8e9e56f6a76f3b2af"}
Apr 22 19:27:04.892117 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:04.892092 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-9bw78" event={"ID":"f8634c7a-287f-4afc-8a9d-55a3e85c0c45","Type":"ContainerStarted","Data":"2a94bf57555f5aa5f35cf658dcb303ab677d66b48a977d5c93344fb246c2b557"}
Apr 22 19:27:04.892228 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:04.892121 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-9bw78" event={"ID":"f8634c7a-287f-4afc-8a9d-55a3e85c0c45","Type":"ContainerStarted","Data":"02ccb82be7764bcf96048fe5fd718e66fb908eee69a4bc3f531259ba8fd1b520"}
Apr 22 19:27:04.892228 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:04.892131 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-9bw78" event={"ID":"f8634c7a-287f-4afc-8a9d-55a3e85c0c45","Type":"ContainerStarted","Data":"403c53f6ae712445213cf3081a029638af56f55f68d6ed72b8c8df987aa2c0fc"}
Apr 22 19:27:04.936180 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:04.936128 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-9bw78" podStartSLOduration=1.736455222 podStartE2EDuration="2.936109407s" podCreationTimestamp="2026-04-22 19:27:02 +0000 UTC" firstStartedPulling="2026-04-22 19:27:03.155807751 +0000 UTC m=+200.439478201" lastFinishedPulling="2026-04-22 19:27:04.355461928 +0000 UTC m=+201.639132386" observedRunningTime="2026-04-22 19:27:04.935650514 +0000 UTC m=+202.219320988" watchObservedRunningTime="2026-04-22 19:27:04.936109407 +0000 UTC m=+202.219779881"
Apr 22 19:27:05.081691 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.081659 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2"]
Apr 22 19:27:05.083999 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.083983 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2"
Apr 22 19:27:05.087994 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.087966 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 22 19:27:05.087994 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.087978 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 22 19:27:05.088158 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.088001 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 22 19:27:05.088158 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.088045 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-2pnpn\""
Apr 22 19:27:05.088158 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.088093 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-8muted3bb6hjd\""
Apr 22 19:27:05.088373 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.088356 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 22 19:27:05.088457 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.088404 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 22 19:27:05.105111 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.105083 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2"]
Apr 22 19:27:05.176572 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.176486 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/71de3a3d-9050-4664-b74b-e79fb26b7478-secret-grpc-tls\") pod \"thanos-querier-5d78bbfc6-lrjq2\" (UID: \"71de3a3d-9050-4664-b74b-e79fb26b7478\") " pod="openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2"
Apr 22 19:27:05.176572 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.176540 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/71de3a3d-9050-4664-b74b-e79fb26b7478-metrics-client-ca\") pod \"thanos-querier-5d78bbfc6-lrjq2\" (UID: \"71de3a3d-9050-4664-b74b-e79fb26b7478\") " pod="openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2"
Apr 22 19:27:05.176572 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.176562 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdkcd\" (UniqueName: \"kubernetes.io/projected/71de3a3d-9050-4664-b74b-e79fb26b7478-kube-api-access-bdkcd\") pod \"thanos-querier-5d78bbfc6-lrjq2\" (UID: \"71de3a3d-9050-4664-b74b-e79fb26b7478\") " pod="openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2"
Apr 22 19:27:05.176822 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.176614 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/71de3a3d-9050-4664-b74b-e79fb26b7478-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5d78bbfc6-lrjq2\" (UID: \"71de3a3d-9050-4664-b74b-e79fb26b7478\") " pod="openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2"
Apr 22 19:27:05.176822 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.176679 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/71de3a3d-9050-4664-b74b-e79fb26b7478-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5d78bbfc6-lrjq2\" (UID: \"71de3a3d-9050-4664-b74b-e79fb26b7478\") " pod="openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2"
Apr 22 19:27:05.176822 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.176722 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/71de3a3d-9050-4664-b74b-e79fb26b7478-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5d78bbfc6-lrjq2\" (UID: \"71de3a3d-9050-4664-b74b-e79fb26b7478\") " pod="openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2"
Apr 22 19:27:05.176822 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.176751 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/71de3a3d-9050-4664-b74b-e79fb26b7478-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5d78bbfc6-lrjq2\" (UID: \"71de3a3d-9050-4664-b74b-e79fb26b7478\") " pod="openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2"
Apr 22 19:27:05.176822 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.176769 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/71de3a3d-9050-4664-b74b-e79fb26b7478-secret-thanos-querier-tls\") pod \"thanos-querier-5d78bbfc6-lrjq2\" (UID: \"71de3a3d-9050-4664-b74b-e79fb26b7478\") " pod="openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2"
Apr 22 19:27:05.278128 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.278098 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bdkcd\" (UniqueName: \"kubernetes.io/projected/71de3a3d-9050-4664-b74b-e79fb26b7478-kube-api-access-bdkcd\") pod \"thanos-querier-5d78bbfc6-lrjq2\" (UID: \"71de3a3d-9050-4664-b74b-e79fb26b7478\") " pod="openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2"
Apr 22 19:27:05.278206 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.278142 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/71de3a3d-9050-4664-b74b-e79fb26b7478-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5d78bbfc6-lrjq2\" (UID: \"71de3a3d-9050-4664-b74b-e79fb26b7478\") " pod="openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2"
Apr 22 19:27:05.278338 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.278315 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/71de3a3d-9050-4664-b74b-e79fb26b7478-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5d78bbfc6-lrjq2\" (UID: \"71de3a3d-9050-4664-b74b-e79fb26b7478\") " pod="openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2"
Apr 22 19:27:05.278386 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.278365 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/71de3a3d-9050-4664-b74b-e79fb26b7478-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5d78bbfc6-lrjq2\" (UID: \"71de3a3d-9050-4664-b74b-e79fb26b7478\") " pod="openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2"
Apr 22 19:27:05.278445 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.278421 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/71de3a3d-9050-4664-b74b-e79fb26b7478-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5d78bbfc6-lrjq2\" (UID: \"71de3a3d-9050-4664-b74b-e79fb26b7478\") " pod="openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2"
Apr 22 19:27:05.278500 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.278474 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/71de3a3d-9050-4664-b74b-e79fb26b7478-secret-thanos-querier-tls\") pod \"thanos-querier-5d78bbfc6-lrjq2\" (UID: \"71de3a3d-9050-4664-b74b-e79fb26b7478\") " pod="openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2"
Apr 22 19:27:05.278552 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.278541 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/71de3a3d-9050-4664-b74b-e79fb26b7478-secret-grpc-tls\") pod \"thanos-querier-5d78bbfc6-lrjq2\" (UID: \"71de3a3d-9050-4664-b74b-e79fb26b7478\") " pod="openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2"
Apr 22 19:27:05.278658 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.278612 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/71de3a3d-9050-4664-b74b-e79fb26b7478-metrics-client-ca\") pod \"thanos-querier-5d78bbfc6-lrjq2\" (UID: \"71de3a3d-9050-4664-b74b-e79fb26b7478\") " pod="openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2"
Apr 22 19:27:05.279410 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.279382 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/71de3a3d-9050-4664-b74b-e79fb26b7478-metrics-client-ca\") pod \"thanos-querier-5d78bbfc6-lrjq2\" (UID: \"71de3a3d-9050-4664-b74b-e79fb26b7478\") " pod="openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2"
Apr 22 19:27:05.280810 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.280761 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/71de3a3d-9050-4664-b74b-e79fb26b7478-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5d78bbfc6-lrjq2\" (UID: \"71de3a3d-9050-4664-b74b-e79fb26b7478\") " pod="openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2"
Apr 22 19:27:05.281202 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.281181 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/71de3a3d-9050-4664-b74b-e79fb26b7478-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5d78bbfc6-lrjq2\" (UID: \"71de3a3d-9050-4664-b74b-e79fb26b7478\") " pod="openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2"
Apr 22 19:27:05.281298 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.281244 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/71de3a3d-9050-4664-b74b-e79fb26b7478-secret-thanos-querier-tls\") pod \"thanos-querier-5d78bbfc6-lrjq2\" (UID: \"71de3a3d-9050-4664-b74b-e79fb26b7478\") " pod="openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2"
Apr 22 19:27:05.281298 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.281238 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/71de3a3d-9050-4664-b74b-e79fb26b7478-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5d78bbfc6-lrjq2\" (UID: \"71de3a3d-9050-4664-b74b-e79fb26b7478\") " pod="openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2"
Apr 22 19:27:05.281378 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.281346 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/71de3a3d-9050-4664-b74b-e79fb26b7478-secret-grpc-tls\") pod \"thanos-querier-5d78bbfc6-lrjq2\" (UID: \"71de3a3d-9050-4664-b74b-e79fb26b7478\") " pod="openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2"
Apr 22 19:27:05.281520 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.281503 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/71de3a3d-9050-4664-b74b-e79fb26b7478-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5d78bbfc6-lrjq2\" (UID: \"71de3a3d-9050-4664-b74b-e79fb26b7478\") " pod="openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2"
Apr 22 19:27:05.285694 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.285677 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdkcd\" (UniqueName: \"kubernetes.io/projected/71de3a3d-9050-4664-b74b-e79fb26b7478-kube-api-access-bdkcd\") pod \"thanos-querier-5d78bbfc6-lrjq2\" (UID: \"71de3a3d-9050-4664-b74b-e79fb26b7478\") " pod="openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2"
Apr 22 19:27:05.393304 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.393265 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2"
Apr 22 19:27:05.534795 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.534756 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2"]
Apr 22 19:27:05.538024 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:27:05.537991 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71de3a3d_9050_4664_b74b_e79fb26b7478.slice/crio-dec4c50ff05b2fd53626170e98816f133a6925541137f935699a9c81a5360bda WatchSource:0}: Error finding container dec4c50ff05b2fd53626170e98816f133a6925541137f935699a9c81a5360bda: Status 404 returned error can't find the container with id dec4c50ff05b2fd53626170e98816f133a6925541137f935699a9c81a5360bda
Apr 22 19:27:05.895725 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.895688 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2" event={"ID":"71de3a3d-9050-4664-b74b-e79fb26b7478","Type":"ContainerStarted","Data":"dec4c50ff05b2fd53626170e98816f133a6925541137f935699a9c81a5360bda"}
Apr 22 19:27:05.897366 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.897340 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mcwhq" event={"ID":"aae64ea5-a201-4205-b675-7f24942d6b45","Type":"ContainerStarted","Data":"e77cc3f8648ef687e3e078efacc679e5f1df9f2fc1c7ae4d89992c89129d5b7a"}
Apr 22 19:27:05.897499 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.897371 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mcwhq" event={"ID":"aae64ea5-a201-4205-b675-7f24942d6b45","Type":"ContainerStarted","Data":"3ca930f215b20bb45deae21a83219df52d49f0da9b4a6e80cb1a2a76ecf58519"}
Apr 22 19:27:05.918898 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:05.918854 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-mcwhq" podStartSLOduration=3.164597225 podStartE2EDuration="3.91884151s" podCreationTimestamp="2026-04-22 19:27:02 +0000 UTC" firstStartedPulling="2026-04-22 19:27:03.052102821 +0000 UTC m=+200.335773271" lastFinishedPulling="2026-04-22 19:27:03.806347102 +0000 UTC m=+201.090017556" observedRunningTime="2026-04-22 19:27:05.917164479 +0000 UTC m=+203.200834952" watchObservedRunningTime="2026-04-22 19:27:05.91884151 +0000 UTC m=+203.202511982"
Apr 22 19:27:07.905401 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:07.905364 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2" event={"ID":"71de3a3d-9050-4664-b74b-e79fb26b7478","Type":"ContainerStarted","Data":"9e5cb0a6ae3698acf44b72222e4fc1f202307d84d3eedbb7ff714d43c4f28fcd"}
Apr 22 19:27:07.905401 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:07.905406 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2" event={"ID":"71de3a3d-9050-4664-b74b-e79fb26b7478","Type":"ContainerStarted","Data":"22fdd0d104a62fe745bdfbf119eee743918bed788bccf22824ccf38fd26f3705"}
Apr 22 19:27:07.905827 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:07.905416 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2" event={"ID":"71de3a3d-9050-4664-b74b-e79fb26b7478","Type":"ContainerStarted","Data":"dec8f87932792c2f7346a0ca9574d8303ac8ddfa611355f82f259adc63a92827"}
Apr 22 19:27:08.237576 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.237363 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 19:27:08.240623 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.240590 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.244360 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.244125 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 19:27:08.244360 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.244185 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 19:27:08.244360 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.244125 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 19:27:08.244582 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.244362 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 19:27:08.244649 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.244593 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-13shseuhu1e6d\"" Apr 22 19:27:08.245880 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.245597 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 19:27:08.245880 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.245702 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 19:27:08.245880 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.245597 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 19:27:08.246091 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.245923 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 19:27:08.246174 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.246137 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 19:27:08.246799 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.246399 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 19:27:08.246799 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.246513 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 19:27:08.246799 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.246677 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-t2rjk\"" Apr 22 19:27:08.247249 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.247217 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 19:27:08.248304 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.248278 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 19:27:08.255850 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.255659 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 19:27:08.406461 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.406421 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.406638 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.406476 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.406638 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.406509 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klznw\" (UniqueName: \"kubernetes.io/projected/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-kube-api-access-klznw\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.406638 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.406593 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.406810 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.406655 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-web-config\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.406810 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.406695 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.406810 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.406749 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.406810 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.406766 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.406998 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.406830 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.406998 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.406873 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.406998 ip-10-0-134-22 
kubenswrapper[2574]: I0422 19:27:08.406901 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-config\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.406998 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.406930 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-config-out\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.406998 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.406961 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.406998 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.406986 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.407300 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.407033 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") 
" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.407300 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.407060 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.407300 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.407089 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.407300 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.407118 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.508250 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.508223 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.508433 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.508411 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" 
(UniqueName: \"kubernetes.io/empty-dir/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.508488 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.508454 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.508543 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.508497 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.508709 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.508642 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.508709 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.508683 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-config\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.508864 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.508707 2574 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-config-out\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.508864 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.508738 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.508864 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.508766 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.508864 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.508805 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.508864 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.508842 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.509102 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.508875 2574 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.509102 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.508906 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.509102 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.508966 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.509102 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.509046 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.509102 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.509068 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 
19:27:08.509102 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.509079 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.509373 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.509125 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-klznw\" (UniqueName: \"kubernetes.io/projected/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-kube-api-access-klznw\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.509373 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.509159 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.509373 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.509202 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-web-config\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.509501 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.509461 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") 
" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.509750 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.509727 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.510333 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.510304 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.511924 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.511548 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.513926 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.513866 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-config-out\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.515284 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.515241 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-secret-prometheus-k8s-kube-rbac-proxy-web\") pod 
\"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.515451 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.515384 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.515511 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.515477 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-config\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.516275 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.516251 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-web-config\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.518828 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.518751 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.522427 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.520332 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") 
" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.522427 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.522231 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-klznw\" (UniqueName: \"kubernetes.io/projected/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-kube-api-access-klznw\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.522979 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.522839 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.522979 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.522941 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.524743 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.524719 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.524860 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.524809 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" 
(UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.557090 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.557025 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:08.710818 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.710797 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 19:27:08.713238 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:27:08.713206 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ea47d7f_14f1_4dfa_a03b_bfc687e938cb.slice/crio-6bdc7a5f2a1afb0ab74823d6832abcc66ba41161fd7675181d12fa6fc2941820 WatchSource:0}: Error finding container 6bdc7a5f2a1afb0ab74823d6832abcc66ba41161fd7675181d12fa6fc2941820: Status 404 returned error can't find the container with id 6bdc7a5f2a1afb0ab74823d6832abcc66ba41161fd7675181d12fa6fc2941820 Apr 22 19:27:08.908739 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.908649 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb","Type":"ContainerStarted","Data":"6bdc7a5f2a1afb0ab74823d6832abcc66ba41161fd7675181d12fa6fc2941820"} Apr 22 19:27:08.911132 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.911107 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2" event={"ID":"71de3a3d-9050-4664-b74b-e79fb26b7478","Type":"ContainerStarted","Data":"62656d50d79337b1b649d011ceb277147379238b78c30f1746fdc7e2ccbdcc31"} Apr 22 19:27:08.911244 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.911142 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2" 
event={"ID":"71de3a3d-9050-4664-b74b-e79fb26b7478","Type":"ContainerStarted","Data":"2a5c3a756155875875f2ac3b56eba9e76918f6c6c0f342345751f15f57297203"} Apr 22 19:27:08.911244 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.911157 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2" event={"ID":"71de3a3d-9050-4664-b74b-e79fb26b7478","Type":"ContainerStarted","Data":"7f52428c3f151e7b6179a651cef7f2edb466242e26addf8c42a71ed0caa8798d"} Apr 22 19:27:08.911397 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.911376 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2" Apr 22 19:27:08.933436 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:08.933393 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2" podStartSLOduration=1.019117233 podStartE2EDuration="3.933379898s" podCreationTimestamp="2026-04-22 19:27:05 +0000 UTC" firstStartedPulling="2026-04-22 19:27:05.539976562 +0000 UTC m=+202.823647012" lastFinishedPulling="2026-04-22 19:27:08.454239222 +0000 UTC m=+205.737909677" observedRunningTime="2026-04-22 19:27:08.933070059 +0000 UTC m=+206.216740543" watchObservedRunningTime="2026-04-22 19:27:08.933379898 +0000 UTC m=+206.217050774" Apr 22 19:27:09.914675 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:09.914640 2574 generic.go:358] "Generic (PLEG): container finished" podID="6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" containerID="31be76f55f39b31526822e2805c9cc6dcedb4f0e13122f0eb1b39c400781c9da" exitCode=0 Apr 22 19:27:09.915033 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:09.914705 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb","Type":"ContainerDied","Data":"31be76f55f39b31526822e2805c9cc6dcedb4f0e13122f0eb1b39c400781c9da"} Apr 22 19:27:12.926425 
ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:12.926340 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb","Type":"ContainerStarted","Data":"550a816df00b179a7b454177bcec6ad273f23337b85756b48f8600a6917c1f26"} Apr 22 19:27:12.926425 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:12.926382 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb","Type":"ContainerStarted","Data":"27bac8357b9f957d1b9f628f582a3b2348a55e0da560d4b72353160099686d2a"} Apr 22 19:27:12.926425 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:12.926395 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb","Type":"ContainerStarted","Data":"bec1a49cd600b02fca9ab88548ab95cbfe5ad7d9cf0418dfe9e0c7fd7ee7c430"} Apr 22 19:27:13.931951 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:13.931914 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb","Type":"ContainerStarted","Data":"66d037a2108a703d00c2e084f9b02245aee4d7374fd828933530e9a0e44de03f"} Apr 22 19:27:13.931951 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:13.931952 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb","Type":"ContainerStarted","Data":"eeb8f83e81d27811de9a6e34526f2831e592b4d49ac1f2405911e02e107c7dc7"} Apr 22 19:27:13.932363 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:13.931964 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb","Type":"ContainerStarted","Data":"68206fb2e5ed92db5c7ad568fc3ea0b63f100c599fecb9d48bcb4c379a1816e4"} Apr 22 19:27:13.975705 
ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:13.975655 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.969849773 podStartE2EDuration="5.975640953s" podCreationTimestamp="2026-04-22 19:27:08 +0000 UTC" firstStartedPulling="2026-04-22 19:27:08.715834769 +0000 UTC m=+205.999505220" lastFinishedPulling="2026-04-22 19:27:12.721625946 +0000 UTC m=+210.005296400" observedRunningTime="2026-04-22 19:27:13.973160915 +0000 UTC m=+211.256831409" watchObservedRunningTime="2026-04-22 19:27:13.975640953 +0000 UTC m=+211.259311427" Apr 22 19:27:14.921065 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:14.921034 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5d78bbfc6-lrjq2" Apr 22 19:27:15.269393 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:15.269358 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-658bf49c4d-qxl4w"] Apr 22 19:27:15.269759 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:27:15.269592 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-658bf49c4d-qxl4w" podUID="4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e" Apr 22 19:27:15.938277 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:15.938243 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-658bf49c4d-qxl4w" Apr 22 19:27:15.942990 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:15.942969 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-658bf49c4d-qxl4w" Apr 22 19:27:15.983337 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:15.983314 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-image-registry-private-configuration\") pod \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\" (UID: \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\") " Apr 22 19:27:15.983464 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:15.983349 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-installation-pull-secrets\") pod \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\" (UID: \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\") " Apr 22 19:27:15.983549 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:15.983532 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-registry-certificates\") pod \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\" (UID: \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\") " Apr 22 19:27:15.983609 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:15.983585 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-bound-sa-token\") pod \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\" (UID: \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\") " Apr 22 19:27:15.983664 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:15.983635 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-trusted-ca\") pod \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\" (UID: 
\"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\") " Apr 22 19:27:15.983664 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:15.983659 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-ca-trust-extracted\") pod \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\" (UID: \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\") " Apr 22 19:27:15.983759 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:15.983688 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwhj8\" (UniqueName: \"kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-kube-api-access-wwhj8\") pod \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\" (UID: \"4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e\") " Apr 22 19:27:15.983880 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:15.983849 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e" (UID: "4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:27:15.984022 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:15.983979 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e" (UID: "4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:27:15.984131 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:15.984114 2574 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-registry-certificates\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:27:15.984233 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:15.984136 2574 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-ca-trust-extracted\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:27:15.984233 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:15.984148 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e" (UID: "4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:27:15.985731 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:15.985706 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e" (UID: "4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:27:15.985848 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:15.985738 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e" (UID: "4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:27:15.985899 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:15.985840 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e" (UID: "4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:27:15.985943 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:15.985906 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-kube-api-access-wwhj8" (OuterVolumeSpecName: "kube-api-access-wwhj8") pod "4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e" (UID: "4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e"). InnerVolumeSpecName "kube-api-access-wwhj8". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:27:16.084682 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:16.084645 2574 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-image-registry-private-configuration\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:27:16.084682 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:16.084676 2574 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-installation-pull-secrets\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:27:16.084682 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:16.084687 2574 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-bound-sa-token\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:27:16.084926 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:16.084696 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-trusted-ca\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:27:16.084926 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:16.084705 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wwhj8\" (UniqueName: \"kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-kube-api-access-wwhj8\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:27:16.940502 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:16.940471 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-658bf49c4d-qxl4w" Apr 22 19:27:16.975580 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:16.975550 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-658bf49c4d-qxl4w"] Apr 22 19:27:16.979038 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:16.979015 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-658bf49c4d-qxl4w"] Apr 22 19:27:16.992308 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:16.992282 2574 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e-registry-tls\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:27:17.349532 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:17.349499 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e" path="/var/lib/kubelet/pods/4a2b8746-20a2-4a1d-bca8-4c3cf1efa47e/volumes" Apr 22 19:27:18.557302 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:18.557271 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:27:37.790904 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:37.790876 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zsm2s_e794b825-b003-4cc7-9af6-8dd829fbea84/dns-node-resolver/0.log" Apr 22 19:27:45.016383 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:45.016347 2574 generic.go:358] "Generic (PLEG): container finished" podID="0185be2e-9ff1-4f5d-a6ea-d439437f5421" containerID="ce034462fe819e5908ea779ef67b060d17a41e98714ec6771232a5867b15f634" exitCode=0 Apr 22 19:27:45.016765 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:45.016425 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8fqmb" 
event={"ID":"0185be2e-9ff1-4f5d-a6ea-d439437f5421","Type":"ContainerDied","Data":"ce034462fe819e5908ea779ef67b060d17a41e98714ec6771232a5867b15f634"} Apr 22 19:27:45.016765 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:45.016741 2574 scope.go:117] "RemoveContainer" containerID="ce034462fe819e5908ea779ef67b060d17a41e98714ec6771232a5867b15f634" Apr 22 19:27:46.020896 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:46.020863 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8fqmb" event={"ID":"0185be2e-9ff1-4f5d-a6ea-d439437f5421","Type":"ContainerStarted","Data":"51376db097b3d4bdc15c2b659da80b6e72d1d23e52ed6ecba3a3737d5eec639c"} Apr 22 19:27:55.125060 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:55.124969 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93109160-9bbe-497f-9b25-d7fa7e08508f-metrics-certs\") pod \"network-metrics-daemon-mdzdp\" (UID: \"93109160-9bbe-497f-9b25-d7fa7e08508f\") " pod="openshift-multus/network-metrics-daemon-mdzdp" Apr 22 19:27:55.127240 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:55.127219 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93109160-9bbe-497f-9b25-d7fa7e08508f-metrics-certs\") pod \"network-metrics-daemon-mdzdp\" (UID: \"93109160-9bbe-497f-9b25-d7fa7e08508f\") " pod="openshift-multus/network-metrics-daemon-mdzdp" Apr 22 19:27:55.149268 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:55.149240 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tlsqf\"" Apr 22 19:27:55.156728 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:55.156709 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdzdp" Apr 22 19:27:55.270037 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:55.270003 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mdzdp"] Apr 22 19:27:55.273359 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:27:55.273336 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93109160_9bbe_497f_9b25_d7fa7e08508f.slice/crio-17f0e50dc6ba5994d743ecc7badc26279be962dd3c8ef9899aa11924a02bb85c WatchSource:0}: Error finding container 17f0e50dc6ba5994d743ecc7badc26279be962dd3c8ef9899aa11924a02bb85c: Status 404 returned error can't find the container with id 17f0e50dc6ba5994d743ecc7badc26279be962dd3c8ef9899aa11924a02bb85c Apr 22 19:27:56.050043 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:56.049990 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mdzdp" event={"ID":"93109160-9bbe-497f-9b25-d7fa7e08508f","Type":"ContainerStarted","Data":"17f0e50dc6ba5994d743ecc7badc26279be962dd3c8ef9899aa11924a02bb85c"} Apr 22 19:27:57.054515 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:57.054483 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mdzdp" event={"ID":"93109160-9bbe-497f-9b25-d7fa7e08508f","Type":"ContainerStarted","Data":"4c5afa645d990a6f8af13b97f75b1b22b16c0ad123a52d75188d1c9398f6794c"} Apr 22 19:27:57.054515 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:57.054518 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mdzdp" event={"ID":"93109160-9bbe-497f-9b25-d7fa7e08508f","Type":"ContainerStarted","Data":"f05355d25840cdf1ee936ee131688341a8f2ad1a238b34bd870a385ee43f66f9"} Apr 22 19:27:57.072513 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:27:57.072461 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-mdzdp" podStartSLOduration=253.211389526 podStartE2EDuration="4m14.072443207s" podCreationTimestamp="2026-04-22 19:23:43 +0000 UTC" firstStartedPulling="2026-04-22 19:27:55.275064141 +0000 UTC m=+252.558734591" lastFinishedPulling="2026-04-22 19:27:56.136117806 +0000 UTC m=+253.419788272" observedRunningTime="2026-04-22 19:27:57.072106432 +0000 UTC m=+254.355776916" watchObservedRunningTime="2026-04-22 19:27:57.072443207 +0000 UTC m=+254.356113681" Apr 22 19:28:08.557974 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:08.557933 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:08.576871 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:08.576844 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:09.101527 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:09.101503 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:21.757328 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:28:21.757285 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-qsnbz" podUID="8207bbac-ad34-4f98-a8af-1a2daaa6ea59" Apr 22 19:28:21.757328 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:28:21.757317 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-lnt5s" podUID="be5b5131-1460-4746-96a8-4720ed712cf1" Apr 22 19:28:22.121361 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:22.121286 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qsnbz" Apr 22 19:28:22.121501 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:22.121286 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lnt5s" Apr 22 19:28:22.759604 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:28:22.759558 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-2cwpp" podUID="76e16566-d9dd-4f0b-9cc5-a72e27e4518a" Apr 22 19:28:23.124482 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:23.124404 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-2cwpp" Apr 22 19:28:25.083794 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:25.083746 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8207bbac-ad34-4f98-a8af-1a2daaa6ea59-cert\") pod \"ingress-canary-qsnbz\" (UID: \"8207bbac-ad34-4f98-a8af-1a2daaa6ea59\") " pod="openshift-ingress-canary/ingress-canary-qsnbz" Apr 22 19:28:25.086117 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:25.086094 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8207bbac-ad34-4f98-a8af-1a2daaa6ea59-cert\") pod \"ingress-canary-qsnbz\" (UID: \"8207bbac-ad34-4f98-a8af-1a2daaa6ea59\") " pod="openshift-ingress-canary/ingress-canary-qsnbz" Apr 22 19:28:25.125207 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:25.125182 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-76gtl\"" Apr 22 19:28:25.132981 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:25.132963 2574 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qsnbz" Apr 22 19:28:25.185358 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:25.185322 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be5b5131-1460-4746-96a8-4720ed712cf1-metrics-tls\") pod \"dns-default-lnt5s\" (UID: \"be5b5131-1460-4746-96a8-4720ed712cf1\") " pod="openshift-dns/dns-default-lnt5s" Apr 22 19:28:25.187962 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:25.187936 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be5b5131-1460-4746-96a8-4720ed712cf1-metrics-tls\") pod \"dns-default-lnt5s\" (UID: \"be5b5131-1460-4746-96a8-4720ed712cf1\") " pod="openshift-dns/dns-default-lnt5s" Apr 22 19:28:25.255661 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:25.255610 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qsnbz"] Apr 22 19:28:25.259357 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:28:25.259328 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8207bbac_ad34_4f98_a8af_1a2daaa6ea59.slice/crio-a6fad51438b339754162ff48f6bc7721eb033348973417aac768cbfd83e18c33 WatchSource:0}: Error finding container a6fad51438b339754162ff48f6bc7721eb033348973417aac768cbfd83e18c33: Status 404 returned error can't find the container with id a6fad51438b339754162ff48f6bc7721eb033348973417aac768cbfd83e18c33 Apr 22 19:28:25.424914 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:25.424832 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-jd9zz\"" Apr 22 19:28:25.432957 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:25.432937 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-lnt5s" Apr 22 19:28:25.546470 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:25.546444 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lnt5s"] Apr 22 19:28:25.548904 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:28:25.548872 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe5b5131_1460_4746_96a8_4720ed712cf1.slice/crio-f430c7b796113a3e5f985223b825ae19fa0558dada813bfed4b5d00b6d0718b8 WatchSource:0}: Error finding container f430c7b796113a3e5f985223b825ae19fa0558dada813bfed4b5d00b6d0718b8: Status 404 returned error can't find the container with id f430c7b796113a3e5f985223b825ae19fa0558dada813bfed4b5d00b6d0718b8 Apr 22 19:28:25.892123 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:25.892087 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/76e16566-d9dd-4f0b-9cc5-a72e27e4518a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-2cwpp\" (UID: \"76e16566-d9dd-4f0b-9cc5-a72e27e4518a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2cwpp" Apr 22 19:28:25.894419 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:25.894401 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/76e16566-d9dd-4f0b-9cc5-a72e27e4518a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-2cwpp\" (UID: \"76e16566-d9dd-4f0b-9cc5-a72e27e4518a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-2cwpp" Apr 22 19:28:26.127897 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:26.127852 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-l8htc\"" Apr 22 19:28:26.134096 ip-10-0-134-22 
kubenswrapper[2574]: I0422 19:28:26.134058 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lnt5s" event={"ID":"be5b5131-1460-4746-96a8-4720ed712cf1","Type":"ContainerStarted","Data":"f430c7b796113a3e5f985223b825ae19fa0558dada813bfed4b5d00b6d0718b8"} Apr 22 19:28:26.134834 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:26.134808 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-2cwpp" Apr 22 19:28:26.135630 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:26.135605 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qsnbz" event={"ID":"8207bbac-ad34-4f98-a8af-1a2daaa6ea59","Type":"ContainerStarted","Data":"a6fad51438b339754162ff48f6bc7721eb033348973417aac768cbfd83e18c33"} Apr 22 19:28:26.330622 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:26.330565 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-2cwpp"] Apr 22 19:28:26.338194 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:28:26.338034 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76e16566_d9dd_4f0b_9cc5_a72e27e4518a.slice/crio-b8569e098b762405964fb7e96a0bde18ed1572489ddf976c874b644964c00f49 WatchSource:0}: Error finding container b8569e098b762405964fb7e96a0bde18ed1572489ddf976c874b644964c00f49: Status 404 returned error can't find the container with id b8569e098b762405964fb7e96a0bde18ed1572489ddf976c874b644964c00f49 Apr 22 19:28:26.737639 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:26.737595 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 19:28:26.738704 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:26.738234 2574 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-monitoring/prometheus-k8s-0" podUID="6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" containerName="kube-rbac-proxy" containerID="cri-o://eeb8f83e81d27811de9a6e34526f2831e592b4d49ac1f2405911e02e107c7dc7" gracePeriod=600 Apr 22 19:28:26.738704 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:26.738319 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" containerName="kube-rbac-proxy-web" containerID="cri-o://68206fb2e5ed92db5c7ad568fc3ea0b63f100c599fecb9d48bcb4c379a1816e4" gracePeriod=600 Apr 22 19:28:26.738704 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:26.738375 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" containerName="thanos-sidecar" containerID="cri-o://550a816df00b179a7b454177bcec6ad273f23337b85756b48f8600a6917c1f26" gracePeriod=600 Apr 22 19:28:26.738704 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:26.738429 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" containerName="config-reloader" containerID="cri-o://27bac8357b9f957d1b9f628f582a3b2348a55e0da560d4b72353160099686d2a" gracePeriod=600 Apr 22 19:28:26.738704 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:26.738478 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" containerName="prometheus" containerID="cri-o://bec1a49cd600b02fca9ab88548ab95cbfe5ad7d9cf0418dfe9e0c7fd7ee7c430" gracePeriod=600 Apr 22 19:28:26.738704 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:26.738544 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" 
containerName="kube-rbac-proxy-thanos" containerID="cri-o://66d037a2108a703d00c2e084f9b02245aee4d7374fd828933530e9a0e44de03f" gracePeriod=600 Apr 22 19:28:27.140837 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.140800 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-2cwpp" event={"ID":"76e16566-d9dd-4f0b-9cc5-a72e27e4518a","Type":"ContainerStarted","Data":"b8569e098b762405964fb7e96a0bde18ed1572489ddf976c874b644964c00f49"} Apr 22 19:28:27.809615 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.809565 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:27.912497 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.912467 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-config\") pod \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " Apr 22 19:28:27.912642 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.912519 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-prometheus-trusted-ca-bundle\") pod \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " Apr 22 19:28:27.912642 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.912547 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-secret-prometheus-k8s-tls\") pod \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " Apr 22 19:28:27.912775 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.912636 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-secret-kube-rbac-proxy\") pod \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " Apr 22 19:28:27.912775 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.912683 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-configmap-serving-certs-ca-bundle\") pod \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " Apr 22 19:28:27.912775 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.912731 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-secret-metrics-client-certs\") pod \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " Apr 22 19:28:27.912775 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.912756 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klznw\" (UniqueName: \"kubernetes.io/projected/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-kube-api-access-klznw\") pod \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " Apr 22 19:28:27.913008 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.912804 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-secret-grpc-tls\") pod \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " Apr 22 19:28:27.913008 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.912853 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" 
(UniqueName: \"kubernetes.io/empty-dir/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-prometheus-k8s-db\") pod \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " Apr 22 19:28:27.913008 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.912879 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-config-out\") pod \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " Apr 22 19:28:27.913008 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.912912 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-thanos-prometheus-http-client-file\") pod \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " Apr 22 19:28:27.913008 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.912950 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-configmap-kubelet-serving-ca-bundle\") pod \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " Apr 22 19:28:27.913008 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.912976 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-prometheus-k8s-rulefiles-0\") pod \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " Apr 22 19:28:27.913008 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.913004 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " Apr 22 19:28:27.913327 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.913028 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" (UID: "6ea47d7f-14f1-4dfa-a03b-bfc687e938cb"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:28:27.913327 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.913042 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-tls-assets\") pod \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " Apr 22 19:28:27.913327 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.913101 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-web-config\") pod \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " Apr 22 19:28:27.913327 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.913147 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-configmap-metrics-client-ca\") pod \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " Apr 22 19:28:27.913327 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.913195 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\" (UID: \"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb\") " Apr 22 19:28:27.913555 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.913446 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-prometheus-trusted-ca-bundle\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:28:27.916126 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.916061 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" (UID: "6ea47d7f-14f1-4dfa-a03b-bfc687e938cb"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:28:27.916246 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.916172 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" (UID: "6ea47d7f-14f1-4dfa-a03b-bfc687e938cb"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:28:27.917434 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.916883 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" (UID: "6ea47d7f-14f1-4dfa-a03b-bfc687e938cb"). 
InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:28:27.917434 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.916942 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" (UID: "6ea47d7f-14f1-4dfa-a03b-bfc687e938cb"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:28:27.917434 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.916965 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-kube-api-access-klznw" (OuterVolumeSpecName: "kube-api-access-klznw") pod "6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" (UID: "6ea47d7f-14f1-4dfa-a03b-bfc687e938cb"). InnerVolumeSpecName "kube-api-access-klznw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:28:27.917434 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.917271 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-config" (OuterVolumeSpecName: "config") pod "6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" (UID: "6ea47d7f-14f1-4dfa-a03b-bfc687e938cb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:28:27.918615 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.917915 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" (UID: "6ea47d7f-14f1-4dfa-a03b-bfc687e938cb"). InnerVolumeSpecName "configmap-metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:28:27.918615 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.918381 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" (UID: "6ea47d7f-14f1-4dfa-a03b-bfc687e938cb"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:28:27.919641 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.919612 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" (UID: "6ea47d7f-14f1-4dfa-a03b-bfc687e938cb"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:28:27.919921 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.919885 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" (UID: "6ea47d7f-14f1-4dfa-a03b-bfc687e938cb"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:28:27.920709 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.920675 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" (UID: "6ea47d7f-14f1-4dfa-a03b-bfc687e938cb"). 
InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:28:27.920990 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.920909 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" (UID: "6ea47d7f-14f1-4dfa-a03b-bfc687e938cb"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:28:27.920990 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.920919 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" (UID: "6ea47d7f-14f1-4dfa-a03b-bfc687e938cb"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:28:27.921130 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.921049 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-config-out" (OuterVolumeSpecName: "config-out") pod "6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" (UID: "6ea47d7f-14f1-4dfa-a03b-bfc687e938cb"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:28:27.922378 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.922351 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" (UID: "6ea47d7f-14f1-4dfa-a03b-bfc687e938cb"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:28:27.923167 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.923127 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" (UID: "6ea47d7f-14f1-4dfa-a03b-bfc687e938cb"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:28:27.930363 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:27.930332 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-web-config" (OuterVolumeSpecName: "web-config") pod "6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" (UID: "6ea47d7f-14f1-4dfa-a03b-bfc687e938cb"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:28:28.014286 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.014250 2574 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-secret-metrics-client-certs\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:28:28.014286 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.014281 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-klznw\" (UniqueName: \"kubernetes.io/projected/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-kube-api-access-klznw\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:28:28.014286 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.014297 2574 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-secret-grpc-tls\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:28:28.014567 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.014307 
2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-prometheus-k8s-db\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:28:28.014567 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.014318 2574 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-config-out\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:28:28.014567 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.014327 2574 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-thanos-prometheus-http-client-file\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:28:28.014567 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.014337 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:28:28.014567 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.014347 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:28:28.014567 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.014358 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:28:28.014567 ip-10-0-134-22 kubenswrapper[2574]: I0422 
19:28:28.014373 2574 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-tls-assets\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:28:28.014567 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.014387 2574 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-web-config\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:28:28.014567 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.014399 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-configmap-metrics-client-ca\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:28:28.014567 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.014408 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:28:28.014567 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.014417 2574 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-config\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:28:28.014567 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.014427 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-secret-prometheus-k8s-tls\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:28:28.014567 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.014437 2574 reconciler_common.go:299] "Volume detached for volume 
\"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-secret-kube-rbac-proxy\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:28:28.014567 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.014450 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:28:28.146801 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.146742 2574 generic.go:358] "Generic (PLEG): container finished" podID="6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" containerID="66d037a2108a703d00c2e084f9b02245aee4d7374fd828933530e9a0e44de03f" exitCode=0 Apr 22 19:28:28.146801 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.146770 2574 generic.go:358] "Generic (PLEG): container finished" podID="6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" containerID="eeb8f83e81d27811de9a6e34526f2831e592b4d49ac1f2405911e02e107c7dc7" exitCode=0 Apr 22 19:28:28.146801 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.146808 2574 generic.go:358] "Generic (PLEG): container finished" podID="6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" containerID="68206fb2e5ed92db5c7ad568fc3ea0b63f100c599fecb9d48bcb4c379a1816e4" exitCode=0 Apr 22 19:28:28.147319 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.146816 2574 generic.go:358] "Generic (PLEG): container finished" podID="6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" containerID="550a816df00b179a7b454177bcec6ad273f23337b85756b48f8600a6917c1f26" exitCode=0 Apr 22 19:28:28.147319 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.146826 2574 generic.go:358] "Generic (PLEG): container finished" podID="6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" containerID="27bac8357b9f957d1b9f628f582a3b2348a55e0da560d4b72353160099686d2a" exitCode=0 Apr 22 19:28:28.147319 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.146832 2574 
generic.go:358] "Generic (PLEG): container finished" podID="6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" containerID="bec1a49cd600b02fca9ab88548ab95cbfe5ad7d9cf0418dfe9e0c7fd7ee7c430" exitCode=0 Apr 22 19:28:28.147319 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.146830 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb","Type":"ContainerDied","Data":"66d037a2108a703d00c2e084f9b02245aee4d7374fd828933530e9a0e44de03f"} Apr 22 19:28:28.147319 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.146863 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.147319 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.146870 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb","Type":"ContainerDied","Data":"eeb8f83e81d27811de9a6e34526f2831e592b4d49ac1f2405911e02e107c7dc7"} Apr 22 19:28:28.147319 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.146887 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb","Type":"ContainerDied","Data":"68206fb2e5ed92db5c7ad568fc3ea0b63f100c599fecb9d48bcb4c379a1816e4"} Apr 22 19:28:28.147319 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.146898 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb","Type":"ContainerDied","Data":"550a816df00b179a7b454177bcec6ad273f23337b85756b48f8600a6917c1f26"} Apr 22 19:28:28.147319 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.146910 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb","Type":"ContainerDied","Data":"27bac8357b9f957d1b9f628f582a3b2348a55e0da560d4b72353160099686d2a"} Apr 22 19:28:28.147319 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.146924 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb","Type":"ContainerDied","Data":"bec1a49cd600b02fca9ab88548ab95cbfe5ad7d9cf0418dfe9e0c7fd7ee7c430"} Apr 22 19:28:28.147319 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.146933 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6ea47d7f-14f1-4dfa-a03b-bfc687e938cb","Type":"ContainerDied","Data":"6bdc7a5f2a1afb0ab74823d6832abcc66ba41161fd7675181d12fa6fc2941820"} Apr 22 19:28:28.147319 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.146946 2574 scope.go:117] "RemoveContainer" containerID="66d037a2108a703d00c2e084f9b02245aee4d7374fd828933530e9a0e44de03f" Apr 22 19:28:28.148906 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.148880 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lnt5s" event={"ID":"be5b5131-1460-4746-96a8-4720ed712cf1","Type":"ContainerStarted","Data":"4588d83fbfd099dbc4e5f0db280a5b1e73d352bcacc42b8e6e4c395222dedfd9"} Apr 22 19:28:28.149022 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.148913 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lnt5s" event={"ID":"be5b5131-1460-4746-96a8-4720ed712cf1","Type":"ContainerStarted","Data":"a0cd3eae099e938a905b27d4febf38750b543c71ae54d6b39d087597371f9bd0"} Apr 22 19:28:28.149022 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.148977 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-lnt5s" Apr 22 19:28:28.150610 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.150576 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-canary/ingress-canary-qsnbz" event={"ID":"8207bbac-ad34-4f98-a8af-1a2daaa6ea59","Type":"ContainerStarted","Data":"9df9151e557424b7317f171fd74427c03e6945b823b208bee3dfadb551e1be31"} Apr 22 19:28:28.152124 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.152101 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-2cwpp" event={"ID":"76e16566-d9dd-4f0b-9cc5-a72e27e4518a","Type":"ContainerStarted","Data":"4b777c03773c2a6bfaeff6422def034d4aa743bb7706514270c841ab0f975988"} Apr 22 19:28:28.167954 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.167904 2574 scope.go:117] "RemoveContainer" containerID="eeb8f83e81d27811de9a6e34526f2831e592b4d49ac1f2405911e02e107c7dc7" Apr 22 19:28:28.171405 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.171365 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-2cwpp" podStartSLOduration=251.69087464 podStartE2EDuration="4m13.171352439s" podCreationTimestamp="2026-04-22 19:24:15 +0000 UTC" firstStartedPulling="2026-04-22 19:28:26.341431466 +0000 UTC m=+283.625101932" lastFinishedPulling="2026-04-22 19:28:27.821909273 +0000 UTC m=+285.105579731" observedRunningTime="2026-04-22 19:28:28.168958728 +0000 UTC m=+285.452629205" watchObservedRunningTime="2026-04-22 19:28:28.171352439 +0000 UTC m=+285.455022971" Apr 22 19:28:28.177072 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.177049 2574 scope.go:117] "RemoveContainer" containerID="68206fb2e5ed92db5c7ad568fc3ea0b63f100c599fecb9d48bcb4c379a1816e4" Apr 22 19:28:28.185710 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.185678 2574 scope.go:117] "RemoveContainer" containerID="550a816df00b179a7b454177bcec6ad273f23337b85756b48f8600a6917c1f26" Apr 22 19:28:28.193605 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.193563 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-dns/dns-default-lnt5s" podStartSLOduration=250.951342224 podStartE2EDuration="4m13.193551305s" podCreationTimestamp="2026-04-22 19:24:15 +0000 UTC" firstStartedPulling="2026-04-22 19:28:25.550544612 +0000 UTC m=+282.834215064" lastFinishedPulling="2026-04-22 19:28:27.79275369 +0000 UTC m=+285.076424145" observedRunningTime="2026-04-22 19:28:28.19338322 +0000 UTC m=+285.477053693" watchObservedRunningTime="2026-04-22 19:28:28.193551305 +0000 UTC m=+285.477221778" Apr 22 19:28:28.193847 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.193726 2574 scope.go:117] "RemoveContainer" containerID="27bac8357b9f957d1b9f628f582a3b2348a55e0da560d4b72353160099686d2a" Apr 22 19:28:28.200500 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.200482 2574 scope.go:117] "RemoveContainer" containerID="bec1a49cd600b02fca9ab88548ab95cbfe5ad7d9cf0418dfe9e0c7fd7ee7c430" Apr 22 19:28:28.207637 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.207612 2574 scope.go:117] "RemoveContainer" containerID="31be76f55f39b31526822e2805c9cc6dcedb4f0e13122f0eb1b39c400781c9da" Apr 22 19:28:28.213849 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.213831 2574 scope.go:117] "RemoveContainer" containerID="66d037a2108a703d00c2e084f9b02245aee4d7374fd828933530e9a0e44de03f" Apr 22 19:28:28.214133 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:28:28.214110 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66d037a2108a703d00c2e084f9b02245aee4d7374fd828933530e9a0e44de03f\": container with ID starting with 66d037a2108a703d00c2e084f9b02245aee4d7374fd828933530e9a0e44de03f not found: ID does not exist" containerID="66d037a2108a703d00c2e084f9b02245aee4d7374fd828933530e9a0e44de03f" Apr 22 19:28:28.214205 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.214141 2574 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"66d037a2108a703d00c2e084f9b02245aee4d7374fd828933530e9a0e44de03f"} err="failed to get container status \"66d037a2108a703d00c2e084f9b02245aee4d7374fd828933530e9a0e44de03f\": rpc error: code = NotFound desc = could not find container \"66d037a2108a703d00c2e084f9b02245aee4d7374fd828933530e9a0e44de03f\": container with ID starting with 66d037a2108a703d00c2e084f9b02245aee4d7374fd828933530e9a0e44de03f not found: ID does not exist" Apr 22 19:28:28.214205 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.214176 2574 scope.go:117] "RemoveContainer" containerID="eeb8f83e81d27811de9a6e34526f2831e592b4d49ac1f2405911e02e107c7dc7" Apr 22 19:28:28.214384 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:28:28.214367 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eeb8f83e81d27811de9a6e34526f2831e592b4d49ac1f2405911e02e107c7dc7\": container with ID starting with eeb8f83e81d27811de9a6e34526f2831e592b4d49ac1f2405911e02e107c7dc7 not found: ID does not exist" containerID="eeb8f83e81d27811de9a6e34526f2831e592b4d49ac1f2405911e02e107c7dc7" Apr 22 19:28:28.214422 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.214390 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eeb8f83e81d27811de9a6e34526f2831e592b4d49ac1f2405911e02e107c7dc7"} err="failed to get container status \"eeb8f83e81d27811de9a6e34526f2831e592b4d49ac1f2405911e02e107c7dc7\": rpc error: code = NotFound desc = could not find container \"eeb8f83e81d27811de9a6e34526f2831e592b4d49ac1f2405911e02e107c7dc7\": container with ID starting with eeb8f83e81d27811de9a6e34526f2831e592b4d49ac1f2405911e02e107c7dc7 not found: ID does not exist" Apr 22 19:28:28.214422 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.214408 2574 scope.go:117] "RemoveContainer" containerID="68206fb2e5ed92db5c7ad568fc3ea0b63f100c599fecb9d48bcb4c379a1816e4" Apr 22 19:28:28.214685 ip-10-0-134-22 
kubenswrapper[2574]: E0422 19:28:28.214653 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68206fb2e5ed92db5c7ad568fc3ea0b63f100c599fecb9d48bcb4c379a1816e4\": container with ID starting with 68206fb2e5ed92db5c7ad568fc3ea0b63f100c599fecb9d48bcb4c379a1816e4 not found: ID does not exist" containerID="68206fb2e5ed92db5c7ad568fc3ea0b63f100c599fecb9d48bcb4c379a1816e4" Apr 22 19:28:28.214742 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.214687 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68206fb2e5ed92db5c7ad568fc3ea0b63f100c599fecb9d48bcb4c379a1816e4"} err="failed to get container status \"68206fb2e5ed92db5c7ad568fc3ea0b63f100c599fecb9d48bcb4c379a1816e4\": rpc error: code = NotFound desc = could not find container \"68206fb2e5ed92db5c7ad568fc3ea0b63f100c599fecb9d48bcb4c379a1816e4\": container with ID starting with 68206fb2e5ed92db5c7ad568fc3ea0b63f100c599fecb9d48bcb4c379a1816e4 not found: ID does not exist" Apr 22 19:28:28.214742 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.214708 2574 scope.go:117] "RemoveContainer" containerID="550a816df00b179a7b454177bcec6ad273f23337b85756b48f8600a6917c1f26" Apr 22 19:28:28.214964 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:28:28.214936 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"550a816df00b179a7b454177bcec6ad273f23337b85756b48f8600a6917c1f26\": container with ID starting with 550a816df00b179a7b454177bcec6ad273f23337b85756b48f8600a6917c1f26 not found: ID does not exist" containerID="550a816df00b179a7b454177bcec6ad273f23337b85756b48f8600a6917c1f26" Apr 22 19:28:28.215033 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.214966 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"550a816df00b179a7b454177bcec6ad273f23337b85756b48f8600a6917c1f26"} err="failed to 
get container status \"550a816df00b179a7b454177bcec6ad273f23337b85756b48f8600a6917c1f26\": rpc error: code = NotFound desc = could not find container \"550a816df00b179a7b454177bcec6ad273f23337b85756b48f8600a6917c1f26\": container with ID starting with 550a816df00b179a7b454177bcec6ad273f23337b85756b48f8600a6917c1f26 not found: ID does not exist" Apr 22 19:28:28.215033 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.214978 2574 scope.go:117] "RemoveContainer" containerID="27bac8357b9f957d1b9f628f582a3b2348a55e0da560d4b72353160099686d2a" Apr 22 19:28:28.215184 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:28:28.215171 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27bac8357b9f957d1b9f628f582a3b2348a55e0da560d4b72353160099686d2a\": container with ID starting with 27bac8357b9f957d1b9f628f582a3b2348a55e0da560d4b72353160099686d2a not found: ID does not exist" containerID="27bac8357b9f957d1b9f628f582a3b2348a55e0da560d4b72353160099686d2a" Apr 22 19:28:28.215222 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.215186 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27bac8357b9f957d1b9f628f582a3b2348a55e0da560d4b72353160099686d2a"} err="failed to get container status \"27bac8357b9f957d1b9f628f582a3b2348a55e0da560d4b72353160099686d2a\": rpc error: code = NotFound desc = could not find container \"27bac8357b9f957d1b9f628f582a3b2348a55e0da560d4b72353160099686d2a\": container with ID starting with 27bac8357b9f957d1b9f628f582a3b2348a55e0da560d4b72353160099686d2a not found: ID does not exist" Apr 22 19:28:28.215222 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.215207 2574 scope.go:117] "RemoveContainer" containerID="bec1a49cd600b02fca9ab88548ab95cbfe5ad7d9cf0418dfe9e0c7fd7ee7c430" Apr 22 19:28:28.215409 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:28:28.215395 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"bec1a49cd600b02fca9ab88548ab95cbfe5ad7d9cf0418dfe9e0c7fd7ee7c430\": container with ID starting with bec1a49cd600b02fca9ab88548ab95cbfe5ad7d9cf0418dfe9e0c7fd7ee7c430 not found: ID does not exist" containerID="bec1a49cd600b02fca9ab88548ab95cbfe5ad7d9cf0418dfe9e0c7fd7ee7c430" Apr 22 19:28:28.215453 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.215413 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bec1a49cd600b02fca9ab88548ab95cbfe5ad7d9cf0418dfe9e0c7fd7ee7c430"} err="failed to get container status \"bec1a49cd600b02fca9ab88548ab95cbfe5ad7d9cf0418dfe9e0c7fd7ee7c430\": rpc error: code = NotFound desc = could not find container \"bec1a49cd600b02fca9ab88548ab95cbfe5ad7d9cf0418dfe9e0c7fd7ee7c430\": container with ID starting with bec1a49cd600b02fca9ab88548ab95cbfe5ad7d9cf0418dfe9e0c7fd7ee7c430 not found: ID does not exist" Apr 22 19:28:28.215453 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.215423 2574 scope.go:117] "RemoveContainer" containerID="31be76f55f39b31526822e2805c9cc6dcedb4f0e13122f0eb1b39c400781c9da" Apr 22 19:28:28.215642 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:28:28.215627 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31be76f55f39b31526822e2805c9cc6dcedb4f0e13122f0eb1b39c400781c9da\": container with ID starting with 31be76f55f39b31526822e2805c9cc6dcedb4f0e13122f0eb1b39c400781c9da not found: ID does not exist" containerID="31be76f55f39b31526822e2805c9cc6dcedb4f0e13122f0eb1b39c400781c9da" Apr 22 19:28:28.215696 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.215644 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31be76f55f39b31526822e2805c9cc6dcedb4f0e13122f0eb1b39c400781c9da"} err="failed to get container status \"31be76f55f39b31526822e2805c9cc6dcedb4f0e13122f0eb1b39c400781c9da\": rpc error: code = NotFound desc = 
could not find container \"31be76f55f39b31526822e2805c9cc6dcedb4f0e13122f0eb1b39c400781c9da\": container with ID starting with 31be76f55f39b31526822e2805c9cc6dcedb4f0e13122f0eb1b39c400781c9da not found: ID does not exist" Apr 22 19:28:28.215696 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.215655 2574 scope.go:117] "RemoveContainer" containerID="66d037a2108a703d00c2e084f9b02245aee4d7374fd828933530e9a0e44de03f" Apr 22 19:28:28.215903 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.215884 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66d037a2108a703d00c2e084f9b02245aee4d7374fd828933530e9a0e44de03f"} err="failed to get container status \"66d037a2108a703d00c2e084f9b02245aee4d7374fd828933530e9a0e44de03f\": rpc error: code = NotFound desc = could not find container \"66d037a2108a703d00c2e084f9b02245aee4d7374fd828933530e9a0e44de03f\": container with ID starting with 66d037a2108a703d00c2e084f9b02245aee4d7374fd828933530e9a0e44de03f not found: ID does not exist" Apr 22 19:28:28.215958 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.215904 2574 scope.go:117] "RemoveContainer" containerID="eeb8f83e81d27811de9a6e34526f2831e592b4d49ac1f2405911e02e107c7dc7" Apr 22 19:28:28.216138 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.216120 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eeb8f83e81d27811de9a6e34526f2831e592b4d49ac1f2405911e02e107c7dc7"} err="failed to get container status \"eeb8f83e81d27811de9a6e34526f2831e592b4d49ac1f2405911e02e107c7dc7\": rpc error: code = NotFound desc = could not find container \"eeb8f83e81d27811de9a6e34526f2831e592b4d49ac1f2405911e02e107c7dc7\": container with ID starting with eeb8f83e81d27811de9a6e34526f2831e592b4d49ac1f2405911e02e107c7dc7 not found: ID does not exist" Apr 22 19:28:28.216198 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.216138 2574 scope.go:117] "RemoveContainer" 
containerID="68206fb2e5ed92db5c7ad568fc3ea0b63f100c599fecb9d48bcb4c379a1816e4" Apr 22 19:28:28.216366 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.216347 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68206fb2e5ed92db5c7ad568fc3ea0b63f100c599fecb9d48bcb4c379a1816e4"} err="failed to get container status \"68206fb2e5ed92db5c7ad568fc3ea0b63f100c599fecb9d48bcb4c379a1816e4\": rpc error: code = NotFound desc = could not find container \"68206fb2e5ed92db5c7ad568fc3ea0b63f100c599fecb9d48bcb4c379a1816e4\": container with ID starting with 68206fb2e5ed92db5c7ad568fc3ea0b63f100c599fecb9d48bcb4c379a1816e4 not found: ID does not exist" Apr 22 19:28:28.216435 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.216367 2574 scope.go:117] "RemoveContainer" containerID="550a816df00b179a7b454177bcec6ad273f23337b85756b48f8600a6917c1f26" Apr 22 19:28:28.216581 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.216564 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"550a816df00b179a7b454177bcec6ad273f23337b85756b48f8600a6917c1f26"} err="failed to get container status \"550a816df00b179a7b454177bcec6ad273f23337b85756b48f8600a6917c1f26\": rpc error: code = NotFound desc = could not find container \"550a816df00b179a7b454177bcec6ad273f23337b85756b48f8600a6917c1f26\": container with ID starting with 550a816df00b179a7b454177bcec6ad273f23337b85756b48f8600a6917c1f26 not found: ID does not exist" Apr 22 19:28:28.216646 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.216584 2574 scope.go:117] "RemoveContainer" containerID="27bac8357b9f957d1b9f628f582a3b2348a55e0da560d4b72353160099686d2a" Apr 22 19:28:28.216806 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.216768 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27bac8357b9f957d1b9f628f582a3b2348a55e0da560d4b72353160099686d2a"} err="failed to get container status 
\"27bac8357b9f957d1b9f628f582a3b2348a55e0da560d4b72353160099686d2a\": rpc error: code = NotFound desc = could not find container \"27bac8357b9f957d1b9f628f582a3b2348a55e0da560d4b72353160099686d2a\": container with ID starting with 27bac8357b9f957d1b9f628f582a3b2348a55e0da560d4b72353160099686d2a not found: ID does not exist" Apr 22 19:28:28.216879 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.216809 2574 scope.go:117] "RemoveContainer" containerID="bec1a49cd600b02fca9ab88548ab95cbfe5ad7d9cf0418dfe9e0c7fd7ee7c430" Apr 22 19:28:28.217000 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.216984 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bec1a49cd600b02fca9ab88548ab95cbfe5ad7d9cf0418dfe9e0c7fd7ee7c430"} err="failed to get container status \"bec1a49cd600b02fca9ab88548ab95cbfe5ad7d9cf0418dfe9e0c7fd7ee7c430\": rpc error: code = NotFound desc = could not find container \"bec1a49cd600b02fca9ab88548ab95cbfe5ad7d9cf0418dfe9e0c7fd7ee7c430\": container with ID starting with bec1a49cd600b02fca9ab88548ab95cbfe5ad7d9cf0418dfe9e0c7fd7ee7c430 not found: ID does not exist" Apr 22 19:28:28.217059 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.217002 2574 scope.go:117] "RemoveContainer" containerID="31be76f55f39b31526822e2805c9cc6dcedb4f0e13122f0eb1b39c400781c9da" Apr 22 19:28:28.217201 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.217179 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31be76f55f39b31526822e2805c9cc6dcedb4f0e13122f0eb1b39c400781c9da"} err="failed to get container status \"31be76f55f39b31526822e2805c9cc6dcedb4f0e13122f0eb1b39c400781c9da\": rpc error: code = NotFound desc = could not find container \"31be76f55f39b31526822e2805c9cc6dcedb4f0e13122f0eb1b39c400781c9da\": container with ID starting with 31be76f55f39b31526822e2805c9cc6dcedb4f0e13122f0eb1b39c400781c9da not found: ID does not exist" Apr 22 19:28:28.217270 ip-10-0-134-22 
kubenswrapper[2574]: I0422 19:28:28.217202 2574 scope.go:117] "RemoveContainer" containerID="66d037a2108a703d00c2e084f9b02245aee4d7374fd828933530e9a0e44de03f" Apr 22 19:28:28.217408 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.217390 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66d037a2108a703d00c2e084f9b02245aee4d7374fd828933530e9a0e44de03f"} err="failed to get container status \"66d037a2108a703d00c2e084f9b02245aee4d7374fd828933530e9a0e44de03f\": rpc error: code = NotFound desc = could not find container \"66d037a2108a703d00c2e084f9b02245aee4d7374fd828933530e9a0e44de03f\": container with ID starting with 66d037a2108a703d00c2e084f9b02245aee4d7374fd828933530e9a0e44de03f not found: ID does not exist" Apr 22 19:28:28.217468 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.217409 2574 scope.go:117] "RemoveContainer" containerID="eeb8f83e81d27811de9a6e34526f2831e592b4d49ac1f2405911e02e107c7dc7" Apr 22 19:28:28.217682 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.217659 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eeb8f83e81d27811de9a6e34526f2831e592b4d49ac1f2405911e02e107c7dc7"} err="failed to get container status \"eeb8f83e81d27811de9a6e34526f2831e592b4d49ac1f2405911e02e107c7dc7\": rpc error: code = NotFound desc = could not find container \"eeb8f83e81d27811de9a6e34526f2831e592b4d49ac1f2405911e02e107c7dc7\": container with ID starting with eeb8f83e81d27811de9a6e34526f2831e592b4d49ac1f2405911e02e107c7dc7 not found: ID does not exist" Apr 22 19:28:28.217723 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.217685 2574 scope.go:117] "RemoveContainer" containerID="68206fb2e5ed92db5c7ad568fc3ea0b63f100c599fecb9d48bcb4c379a1816e4" Apr 22 19:28:28.217979 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.217960 2574 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"68206fb2e5ed92db5c7ad568fc3ea0b63f100c599fecb9d48bcb4c379a1816e4"} err="failed to get container status \"68206fb2e5ed92db5c7ad568fc3ea0b63f100c599fecb9d48bcb4c379a1816e4\": rpc error: code = NotFound desc = could not find container \"68206fb2e5ed92db5c7ad568fc3ea0b63f100c599fecb9d48bcb4c379a1816e4\": container with ID starting with 68206fb2e5ed92db5c7ad568fc3ea0b63f100c599fecb9d48bcb4c379a1816e4 not found: ID does not exist" Apr 22 19:28:28.218057 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.217979 2574 scope.go:117] "RemoveContainer" containerID="550a816df00b179a7b454177bcec6ad273f23337b85756b48f8600a6917c1f26" Apr 22 19:28:28.218192 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.218177 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"550a816df00b179a7b454177bcec6ad273f23337b85756b48f8600a6917c1f26"} err="failed to get container status \"550a816df00b179a7b454177bcec6ad273f23337b85756b48f8600a6917c1f26\": rpc error: code = NotFound desc = could not find container \"550a816df00b179a7b454177bcec6ad273f23337b85756b48f8600a6917c1f26\": container with ID starting with 550a816df00b179a7b454177bcec6ad273f23337b85756b48f8600a6917c1f26 not found: ID does not exist" Apr 22 19:28:28.218255 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.218193 2574 scope.go:117] "RemoveContainer" containerID="27bac8357b9f957d1b9f628f582a3b2348a55e0da560d4b72353160099686d2a" Apr 22 19:28:28.218407 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.218389 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27bac8357b9f957d1b9f628f582a3b2348a55e0da560d4b72353160099686d2a"} err="failed to get container status \"27bac8357b9f957d1b9f628f582a3b2348a55e0da560d4b72353160099686d2a\": rpc error: code = NotFound desc = could not find container \"27bac8357b9f957d1b9f628f582a3b2348a55e0da560d4b72353160099686d2a\": container with ID starting with 
27bac8357b9f957d1b9f628f582a3b2348a55e0da560d4b72353160099686d2a not found: ID does not exist" Apr 22 19:28:28.218470 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.218407 2574 scope.go:117] "RemoveContainer" containerID="bec1a49cd600b02fca9ab88548ab95cbfe5ad7d9cf0418dfe9e0c7fd7ee7c430" Apr 22 19:28:28.218606 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.218590 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bec1a49cd600b02fca9ab88548ab95cbfe5ad7d9cf0418dfe9e0c7fd7ee7c430"} err="failed to get container status \"bec1a49cd600b02fca9ab88548ab95cbfe5ad7d9cf0418dfe9e0c7fd7ee7c430\": rpc error: code = NotFound desc = could not find container \"bec1a49cd600b02fca9ab88548ab95cbfe5ad7d9cf0418dfe9e0c7fd7ee7c430\": container with ID starting with bec1a49cd600b02fca9ab88548ab95cbfe5ad7d9cf0418dfe9e0c7fd7ee7c430 not found: ID does not exist" Apr 22 19:28:28.218670 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.218608 2574 scope.go:117] "RemoveContainer" containerID="31be76f55f39b31526822e2805c9cc6dcedb4f0e13122f0eb1b39c400781c9da" Apr 22 19:28:28.218855 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.218835 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31be76f55f39b31526822e2805c9cc6dcedb4f0e13122f0eb1b39c400781c9da"} err="failed to get container status \"31be76f55f39b31526822e2805c9cc6dcedb4f0e13122f0eb1b39c400781c9da\": rpc error: code = NotFound desc = could not find container \"31be76f55f39b31526822e2805c9cc6dcedb4f0e13122f0eb1b39c400781c9da\": container with ID starting with 31be76f55f39b31526822e2805c9cc6dcedb4f0e13122f0eb1b39c400781c9da not found: ID does not exist" Apr 22 19:28:28.218915 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.218856 2574 scope.go:117] "RemoveContainer" containerID="66d037a2108a703d00c2e084f9b02245aee4d7374fd828933530e9a0e44de03f" Apr 22 19:28:28.219098 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.219079 2574 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66d037a2108a703d00c2e084f9b02245aee4d7374fd828933530e9a0e44de03f"} err="failed to get container status \"66d037a2108a703d00c2e084f9b02245aee4d7374fd828933530e9a0e44de03f\": rpc error: code = NotFound desc = could not find container \"66d037a2108a703d00c2e084f9b02245aee4d7374fd828933530e9a0e44de03f\": container with ID starting with 66d037a2108a703d00c2e084f9b02245aee4d7374fd828933530e9a0e44de03f not found: ID does not exist" Apr 22 19:28:28.219139 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.219099 2574 scope.go:117] "RemoveContainer" containerID="eeb8f83e81d27811de9a6e34526f2831e592b4d49ac1f2405911e02e107c7dc7" Apr 22 19:28:28.219316 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.219290 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eeb8f83e81d27811de9a6e34526f2831e592b4d49ac1f2405911e02e107c7dc7"} err="failed to get container status \"eeb8f83e81d27811de9a6e34526f2831e592b4d49ac1f2405911e02e107c7dc7\": rpc error: code = NotFound desc = could not find container \"eeb8f83e81d27811de9a6e34526f2831e592b4d49ac1f2405911e02e107c7dc7\": container with ID starting with eeb8f83e81d27811de9a6e34526f2831e592b4d49ac1f2405911e02e107c7dc7 not found: ID does not exist" Apr 22 19:28:28.219363 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.219319 2574 scope.go:117] "RemoveContainer" containerID="68206fb2e5ed92db5c7ad568fc3ea0b63f100c599fecb9d48bcb4c379a1816e4" Apr 22 19:28:28.219557 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.219535 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68206fb2e5ed92db5c7ad568fc3ea0b63f100c599fecb9d48bcb4c379a1816e4"} err="failed to get container status \"68206fb2e5ed92db5c7ad568fc3ea0b63f100c599fecb9d48bcb4c379a1816e4\": rpc error: code = NotFound desc = could not find container 
\"68206fb2e5ed92db5c7ad568fc3ea0b63f100c599fecb9d48bcb4c379a1816e4\": container with ID starting with 68206fb2e5ed92db5c7ad568fc3ea0b63f100c599fecb9d48bcb4c379a1816e4 not found: ID does not exist" Apr 22 19:28:28.219604 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.219560 2574 scope.go:117] "RemoveContainer" containerID="550a816df00b179a7b454177bcec6ad273f23337b85756b48f8600a6917c1f26" Apr 22 19:28:28.219826 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.219807 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"550a816df00b179a7b454177bcec6ad273f23337b85756b48f8600a6917c1f26"} err="failed to get container status \"550a816df00b179a7b454177bcec6ad273f23337b85756b48f8600a6917c1f26\": rpc error: code = NotFound desc = could not find container \"550a816df00b179a7b454177bcec6ad273f23337b85756b48f8600a6917c1f26\": container with ID starting with 550a816df00b179a7b454177bcec6ad273f23337b85756b48f8600a6917c1f26 not found: ID does not exist" Apr 22 19:28:28.219882 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.219827 2574 scope.go:117] "RemoveContainer" containerID="27bac8357b9f957d1b9f628f582a3b2348a55e0da560d4b72353160099686d2a" Apr 22 19:28:28.220062 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.220043 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27bac8357b9f957d1b9f628f582a3b2348a55e0da560d4b72353160099686d2a"} err="failed to get container status \"27bac8357b9f957d1b9f628f582a3b2348a55e0da560d4b72353160099686d2a\": rpc error: code = NotFound desc = could not find container \"27bac8357b9f957d1b9f628f582a3b2348a55e0da560d4b72353160099686d2a\": container with ID starting with 27bac8357b9f957d1b9f628f582a3b2348a55e0da560d4b72353160099686d2a not found: ID does not exist" Apr 22 19:28:28.220132 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.220063 2574 scope.go:117] "RemoveContainer" 
containerID="bec1a49cd600b02fca9ab88548ab95cbfe5ad7d9cf0418dfe9e0c7fd7ee7c430" Apr 22 19:28:28.220301 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.220284 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bec1a49cd600b02fca9ab88548ab95cbfe5ad7d9cf0418dfe9e0c7fd7ee7c430"} err="failed to get container status \"bec1a49cd600b02fca9ab88548ab95cbfe5ad7d9cf0418dfe9e0c7fd7ee7c430\": rpc error: code = NotFound desc = could not find container \"bec1a49cd600b02fca9ab88548ab95cbfe5ad7d9cf0418dfe9e0c7fd7ee7c430\": container with ID starting with bec1a49cd600b02fca9ab88548ab95cbfe5ad7d9cf0418dfe9e0c7fd7ee7c430 not found: ID does not exist" Apr 22 19:28:28.220348 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.220301 2574 scope.go:117] "RemoveContainer" containerID="31be76f55f39b31526822e2805c9cc6dcedb4f0e13122f0eb1b39c400781c9da" Apr 22 19:28:28.220494 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.220481 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31be76f55f39b31526822e2805c9cc6dcedb4f0e13122f0eb1b39c400781c9da"} err="failed to get container status \"31be76f55f39b31526822e2805c9cc6dcedb4f0e13122f0eb1b39c400781c9da\": rpc error: code = NotFound desc = could not find container \"31be76f55f39b31526822e2805c9cc6dcedb4f0e13122f0eb1b39c400781c9da\": container with ID starting with 31be76f55f39b31526822e2805c9cc6dcedb4f0e13122f0eb1b39c400781c9da not found: ID does not exist" Apr 22 19:28:28.220534 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.220494 2574 scope.go:117] "RemoveContainer" containerID="66d037a2108a703d00c2e084f9b02245aee4d7374fd828933530e9a0e44de03f" Apr 22 19:28:28.220678 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.220664 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66d037a2108a703d00c2e084f9b02245aee4d7374fd828933530e9a0e44de03f"} err="failed to get container status 
\"66d037a2108a703d00c2e084f9b02245aee4d7374fd828933530e9a0e44de03f\": rpc error: code = NotFound desc = could not find container \"66d037a2108a703d00c2e084f9b02245aee4d7374fd828933530e9a0e44de03f\": container with ID starting with 66d037a2108a703d00c2e084f9b02245aee4d7374fd828933530e9a0e44de03f not found: ID does not exist" Apr 22 19:28:28.220725 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.220677 2574 scope.go:117] "RemoveContainer" containerID="eeb8f83e81d27811de9a6e34526f2831e592b4d49ac1f2405911e02e107c7dc7" Apr 22 19:28:28.220879 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.220864 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eeb8f83e81d27811de9a6e34526f2831e592b4d49ac1f2405911e02e107c7dc7"} err="failed to get container status \"eeb8f83e81d27811de9a6e34526f2831e592b4d49ac1f2405911e02e107c7dc7\": rpc error: code = NotFound desc = could not find container \"eeb8f83e81d27811de9a6e34526f2831e592b4d49ac1f2405911e02e107c7dc7\": container with ID starting with eeb8f83e81d27811de9a6e34526f2831e592b4d49ac1f2405911e02e107c7dc7 not found: ID does not exist" Apr 22 19:28:28.220879 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.220878 2574 scope.go:117] "RemoveContainer" containerID="68206fb2e5ed92db5c7ad568fc3ea0b63f100c599fecb9d48bcb4c379a1816e4" Apr 22 19:28:28.221070 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.221047 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68206fb2e5ed92db5c7ad568fc3ea0b63f100c599fecb9d48bcb4c379a1816e4"} err="failed to get container status \"68206fb2e5ed92db5c7ad568fc3ea0b63f100c599fecb9d48bcb4c379a1816e4\": rpc error: code = NotFound desc = could not find container \"68206fb2e5ed92db5c7ad568fc3ea0b63f100c599fecb9d48bcb4c379a1816e4\": container with ID starting with 68206fb2e5ed92db5c7ad568fc3ea0b63f100c599fecb9d48bcb4c379a1816e4 not found: ID does not exist" Apr 22 19:28:28.221116 ip-10-0-134-22 
kubenswrapper[2574]: I0422 19:28:28.221071 2574 scope.go:117] "RemoveContainer" containerID="550a816df00b179a7b454177bcec6ad273f23337b85756b48f8600a6917c1f26" Apr 22 19:28:28.221244 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.221231 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"550a816df00b179a7b454177bcec6ad273f23337b85756b48f8600a6917c1f26"} err="failed to get container status \"550a816df00b179a7b454177bcec6ad273f23337b85756b48f8600a6917c1f26\": rpc error: code = NotFound desc = could not find container \"550a816df00b179a7b454177bcec6ad273f23337b85756b48f8600a6917c1f26\": container with ID starting with 550a816df00b179a7b454177bcec6ad273f23337b85756b48f8600a6917c1f26 not found: ID does not exist" Apr 22 19:28:28.221284 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.221245 2574 scope.go:117] "RemoveContainer" containerID="27bac8357b9f957d1b9f628f582a3b2348a55e0da560d4b72353160099686d2a" Apr 22 19:28:28.221433 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.221419 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27bac8357b9f957d1b9f628f582a3b2348a55e0da560d4b72353160099686d2a"} err="failed to get container status \"27bac8357b9f957d1b9f628f582a3b2348a55e0da560d4b72353160099686d2a\": rpc error: code = NotFound desc = could not find container \"27bac8357b9f957d1b9f628f582a3b2348a55e0da560d4b72353160099686d2a\": container with ID starting with 27bac8357b9f957d1b9f628f582a3b2348a55e0da560d4b72353160099686d2a not found: ID does not exist" Apr 22 19:28:28.221478 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.221433 2574 scope.go:117] "RemoveContainer" containerID="bec1a49cd600b02fca9ab88548ab95cbfe5ad7d9cf0418dfe9e0c7fd7ee7c430" Apr 22 19:28:28.221601 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.221586 2574 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bec1a49cd600b02fca9ab88548ab95cbfe5ad7d9cf0418dfe9e0c7fd7ee7c430"} err="failed to get container status \"bec1a49cd600b02fca9ab88548ab95cbfe5ad7d9cf0418dfe9e0c7fd7ee7c430\": rpc error: code = NotFound desc = could not find container \"bec1a49cd600b02fca9ab88548ab95cbfe5ad7d9cf0418dfe9e0c7fd7ee7c430\": container with ID starting with bec1a49cd600b02fca9ab88548ab95cbfe5ad7d9cf0418dfe9e0c7fd7ee7c430 not found: ID does not exist" Apr 22 19:28:28.221650 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.221601 2574 scope.go:117] "RemoveContainer" containerID="31be76f55f39b31526822e2805c9cc6dcedb4f0e13122f0eb1b39c400781c9da" Apr 22 19:28:28.221766 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.221749 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31be76f55f39b31526822e2805c9cc6dcedb4f0e13122f0eb1b39c400781c9da"} err="failed to get container status \"31be76f55f39b31526822e2805c9cc6dcedb4f0e13122f0eb1b39c400781c9da\": rpc error: code = NotFound desc = could not find container \"31be76f55f39b31526822e2805c9cc6dcedb4f0e13122f0eb1b39c400781c9da\": container with ID starting with 31be76f55f39b31526822e2805c9cc6dcedb4f0e13122f0eb1b39c400781c9da not found: ID does not exist" Apr 22 19:28:28.221823 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.221766 2574 scope.go:117] "RemoveContainer" containerID="66d037a2108a703d00c2e084f9b02245aee4d7374fd828933530e9a0e44de03f" Apr 22 19:28:28.221979 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.221960 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66d037a2108a703d00c2e084f9b02245aee4d7374fd828933530e9a0e44de03f"} err="failed to get container status \"66d037a2108a703d00c2e084f9b02245aee4d7374fd828933530e9a0e44de03f\": rpc error: code = NotFound desc = could not find container \"66d037a2108a703d00c2e084f9b02245aee4d7374fd828933530e9a0e44de03f\": container with ID starting with 
66d037a2108a703d00c2e084f9b02245aee4d7374fd828933530e9a0e44de03f not found: ID does not exist" Apr 22 19:28:28.222017 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.221979 2574 scope.go:117] "RemoveContainer" containerID="eeb8f83e81d27811de9a6e34526f2831e592b4d49ac1f2405911e02e107c7dc7" Apr 22 19:28:28.222178 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.222161 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eeb8f83e81d27811de9a6e34526f2831e592b4d49ac1f2405911e02e107c7dc7"} err="failed to get container status \"eeb8f83e81d27811de9a6e34526f2831e592b4d49ac1f2405911e02e107c7dc7\": rpc error: code = NotFound desc = could not find container \"eeb8f83e81d27811de9a6e34526f2831e592b4d49ac1f2405911e02e107c7dc7\": container with ID starting with eeb8f83e81d27811de9a6e34526f2831e592b4d49ac1f2405911e02e107c7dc7 not found: ID does not exist" Apr 22 19:28:28.222219 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.222177 2574 scope.go:117] "RemoveContainer" containerID="68206fb2e5ed92db5c7ad568fc3ea0b63f100c599fecb9d48bcb4c379a1816e4" Apr 22 19:28:28.222383 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.222361 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68206fb2e5ed92db5c7ad568fc3ea0b63f100c599fecb9d48bcb4c379a1816e4"} err="failed to get container status \"68206fb2e5ed92db5c7ad568fc3ea0b63f100c599fecb9d48bcb4c379a1816e4\": rpc error: code = NotFound desc = could not find container \"68206fb2e5ed92db5c7ad568fc3ea0b63f100c599fecb9d48bcb4c379a1816e4\": container with ID starting with 68206fb2e5ed92db5c7ad568fc3ea0b63f100c599fecb9d48bcb4c379a1816e4 not found: ID does not exist" Apr 22 19:28:28.222457 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.222387 2574 scope.go:117] "RemoveContainer" containerID="550a816df00b179a7b454177bcec6ad273f23337b85756b48f8600a6917c1f26" Apr 22 19:28:28.222608 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.222590 2574 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"550a816df00b179a7b454177bcec6ad273f23337b85756b48f8600a6917c1f26"} err="failed to get container status \"550a816df00b179a7b454177bcec6ad273f23337b85756b48f8600a6917c1f26\": rpc error: code = NotFound desc = could not find container \"550a816df00b179a7b454177bcec6ad273f23337b85756b48f8600a6917c1f26\": container with ID starting with 550a816df00b179a7b454177bcec6ad273f23337b85756b48f8600a6917c1f26 not found: ID does not exist" Apr 22 19:28:28.222651 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.222609 2574 scope.go:117] "RemoveContainer" containerID="27bac8357b9f957d1b9f628f582a3b2348a55e0da560d4b72353160099686d2a" Apr 22 19:28:28.222884 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.222867 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27bac8357b9f957d1b9f628f582a3b2348a55e0da560d4b72353160099686d2a"} err="failed to get container status \"27bac8357b9f957d1b9f628f582a3b2348a55e0da560d4b72353160099686d2a\": rpc error: code = NotFound desc = could not find container \"27bac8357b9f957d1b9f628f582a3b2348a55e0da560d4b72353160099686d2a\": container with ID starting with 27bac8357b9f957d1b9f628f582a3b2348a55e0da560d4b72353160099686d2a not found: ID does not exist" Apr 22 19:28:28.222943 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.222885 2574 scope.go:117] "RemoveContainer" containerID="bec1a49cd600b02fca9ab88548ab95cbfe5ad7d9cf0418dfe9e0c7fd7ee7c430" Apr 22 19:28:28.223091 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.223070 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bec1a49cd600b02fca9ab88548ab95cbfe5ad7d9cf0418dfe9e0c7fd7ee7c430"} err="failed to get container status \"bec1a49cd600b02fca9ab88548ab95cbfe5ad7d9cf0418dfe9e0c7fd7ee7c430\": rpc error: code = NotFound desc = could not find container 
\"bec1a49cd600b02fca9ab88548ab95cbfe5ad7d9cf0418dfe9e0c7fd7ee7c430\": container with ID starting with bec1a49cd600b02fca9ab88548ab95cbfe5ad7d9cf0418dfe9e0c7fd7ee7c430 not found: ID does not exist" Apr 22 19:28:28.223152 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.223094 2574 scope.go:117] "RemoveContainer" containerID="31be76f55f39b31526822e2805c9cc6dcedb4f0e13122f0eb1b39c400781c9da" Apr 22 19:28:28.223288 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.223272 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31be76f55f39b31526822e2805c9cc6dcedb4f0e13122f0eb1b39c400781c9da"} err="failed to get container status \"31be76f55f39b31526822e2805c9cc6dcedb4f0e13122f0eb1b39c400781c9da\": rpc error: code = NotFound desc = could not find container \"31be76f55f39b31526822e2805c9cc6dcedb4f0e13122f0eb1b39c400781c9da\": container with ID starting with 31be76f55f39b31526822e2805c9cc6dcedb4f0e13122f0eb1b39c400781c9da not found: ID does not exist" Apr 22 19:28:28.225424 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.225388 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-qsnbz" podStartSLOduration=250.693766269 podStartE2EDuration="4m13.225377415s" podCreationTimestamp="2026-04-22 19:24:15 +0000 UTC" firstStartedPulling="2026-04-22 19:28:25.261068008 +0000 UTC m=+282.544738459" lastFinishedPulling="2026-04-22 19:28:27.792679154 +0000 UTC m=+285.076349605" observedRunningTime="2026-04-22 19:28:28.222538487 +0000 UTC m=+285.506208959" watchObservedRunningTime="2026-04-22 19:28:28.225377415 +0000 UTC m=+285.509047888" Apr 22 19:28:28.274642 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.274609 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 19:28:28.277336 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.277312 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 19:28:28.314727 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.314701 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 19:28:28.314993 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.314980 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" containerName="config-reloader" Apr 22 19:28:28.315035 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.314994 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" containerName="config-reloader" Apr 22 19:28:28.315035 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.315007 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" containerName="thanos-sidecar" Apr 22 19:28:28.315035 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.315013 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" containerName="thanos-sidecar" Apr 22 19:28:28.315035 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.315020 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" containerName="kube-rbac-proxy-thanos" Apr 22 19:28:28.315035 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.315026 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" containerName="kube-rbac-proxy-thanos" Apr 22 19:28:28.315035 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.315034 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" containerName="kube-rbac-proxy" Apr 22 19:28:28.315210 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.315039 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" 
containerName="kube-rbac-proxy" Apr 22 19:28:28.315210 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.315046 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" containerName="prometheus" Apr 22 19:28:28.315210 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.315051 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" containerName="prometheus" Apr 22 19:28:28.315210 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.315064 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" containerName="init-config-reloader" Apr 22 19:28:28.315210 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.315069 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" containerName="init-config-reloader" Apr 22 19:28:28.315210 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.315075 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" containerName="kube-rbac-proxy-web" Apr 22 19:28:28.315210 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.315081 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" containerName="kube-rbac-proxy-web" Apr 22 19:28:28.315210 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.315128 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" containerName="thanos-sidecar" Apr 22 19:28:28.315210 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.315138 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" containerName="kube-rbac-proxy-web" Apr 22 19:28:28.315210 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.315145 2574 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" containerName="prometheus" Apr 22 19:28:28.315210 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.315152 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" containerName="config-reloader" Apr 22 19:28:28.315210 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.315158 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" containerName="kube-rbac-proxy" Apr 22 19:28:28.315210 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.315166 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" containerName="kube-rbac-proxy-thanos" Apr 22 19:28:28.320364 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.318833 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.325229 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.325209 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 19:28:28.325638 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.325621 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 19:28:28.325761 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.325739 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 19:28:28.325986 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.325937 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 19:28:28.326562 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.326432 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 19:28:28.327351 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.327336 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-t2rjk\"" Apr 22 19:28:28.328241 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.328226 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 19:28:28.328297 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.328268 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 19:28:28.328350 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.328226 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 19:28:28.328735 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.328721 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 19:28:28.329155 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.329135 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-13shseuhu1e6d\"" Apr 22 19:28:28.329254 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.329214 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 19:28:28.329324 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.329277 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 19:28:28.329622 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.329606 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 19:28:28.336287 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.336115 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 19:28:28.338247 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.338225 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 19:28:28.417223 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.417196 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a9e780ce-144b-4e25-9172-673e7dc43f69-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.417223 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.417225 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a9e780ce-144b-4e25-9172-673e7dc43f69-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.417397 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.417256 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a9e780ce-144b-4e25-9172-673e7dc43f69-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.417397 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.417287 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/a9e780ce-144b-4e25-9172-673e7dc43f69-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.417397 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.417303 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9e780ce-144b-4e25-9172-673e7dc43f69-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.417397 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.417321 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jlrb\" (UniqueName: \"kubernetes.io/projected/a9e780ce-144b-4e25-9172-673e7dc43f69-kube-api-access-6jlrb\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.417397 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.417340 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a9e780ce-144b-4e25-9172-673e7dc43f69-config\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.417397 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.417360 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a9e780ce-144b-4e25-9172-673e7dc43f69-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.417397 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.417378 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9e780ce-144b-4e25-9172-673e7dc43f69-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.417397 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.417394 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a9e780ce-144b-4e25-9172-673e7dc43f69-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.417673 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.417413 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a9e780ce-144b-4e25-9172-673e7dc43f69-config-out\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.417673 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.417461 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a9e780ce-144b-4e25-9172-673e7dc43f69-web-config\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.417673 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.417477 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a9e780ce-144b-4e25-9172-673e7dc43f69-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: 
\"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.417673 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.417502 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a9e780ce-144b-4e25-9172-673e7dc43f69-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.417673 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.417520 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a9e780ce-144b-4e25-9172-673e7dc43f69-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.417673 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.417535 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a9e780ce-144b-4e25-9172-673e7dc43f69-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.417673 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.417548 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9e780ce-144b-4e25-9172-673e7dc43f69-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.417673 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.417571 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a9e780ce-144b-4e25-9172-673e7dc43f69-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.518346 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.518313 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a9e780ce-144b-4e25-9172-673e7dc43f69-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.518346 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.518349 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a9e780ce-144b-4e25-9172-673e7dc43f69-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.518540 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.518481 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a9e780ce-144b-4e25-9172-673e7dc43f69-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.518540 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.518522 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a9e780ce-144b-4e25-9172-673e7dc43f69-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.518667 ip-10-0-134-22 kubenswrapper[2574]: 
I0422 19:28:28.518645 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9e780ce-144b-4e25-9172-673e7dc43f69-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.518704 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.518691 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6jlrb\" (UniqueName: \"kubernetes.io/projected/a9e780ce-144b-4e25-9172-673e7dc43f69-kube-api-access-6jlrb\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.518753 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.518727 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a9e780ce-144b-4e25-9172-673e7dc43f69-config\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.518840 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.518824 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a9e780ce-144b-4e25-9172-673e7dc43f69-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.518897 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.518864 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9e780ce-144b-4e25-9172-673e7dc43f69-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.518897 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.518888 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a9e780ce-144b-4e25-9172-673e7dc43f69-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.518994 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.518896 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a9e780ce-144b-4e25-9172-673e7dc43f69-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.518994 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.518926 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a9e780ce-144b-4e25-9172-673e7dc43f69-config-out\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.518994 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.518958 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a9e780ce-144b-4e25-9172-673e7dc43f69-web-config\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.518994 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.518988 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a9e780ce-144b-4e25-9172-673e7dc43f69-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: 
\"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.519188 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.519028 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a9e780ce-144b-4e25-9172-673e7dc43f69-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.519188 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.519060 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a9e780ce-144b-4e25-9172-673e7dc43f69-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.519188 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.519084 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a9e780ce-144b-4e25-9172-673e7dc43f69-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.519188 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.519107 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9e780ce-144b-4e25-9172-673e7dc43f69-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.519188 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.519142 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/a9e780ce-144b-4e25-9172-673e7dc43f69-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.519974 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.519808 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a9e780ce-144b-4e25-9172-673e7dc43f69-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.521996 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.521657 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9e780ce-144b-4e25-9172-673e7dc43f69-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.522459 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.522432 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9e780ce-144b-4e25-9172-673e7dc43f69-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.522725 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.522678 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9e780ce-144b-4e25-9172-673e7dc43f69-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.522857 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.522807 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a9e780ce-144b-4e25-9172-673e7dc43f69-config\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.522952 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.522916 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a9e780ce-144b-4e25-9172-673e7dc43f69-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.523127 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.523104 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a9e780ce-144b-4e25-9172-673e7dc43f69-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.523453 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.523412 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a9e780ce-144b-4e25-9172-673e7dc43f69-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.523860 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.523815 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a9e780ce-144b-4e25-9172-673e7dc43f69-config-out\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.524119 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.524095 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" 
(UniqueName: \"kubernetes.io/secret/a9e780ce-144b-4e25-9172-673e7dc43f69-web-config\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.524119 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.524109 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a9e780ce-144b-4e25-9172-673e7dc43f69-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.524507 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.524486 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a9e780ce-144b-4e25-9172-673e7dc43f69-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.524608 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.524513 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a9e780ce-144b-4e25-9172-673e7dc43f69-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.524608 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.524545 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a9e780ce-144b-4e25-9172-673e7dc43f69-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.524608 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.524547 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a9e780ce-144b-4e25-9172-673e7dc43f69-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.524805 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.524699 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a9e780ce-144b-4e25-9172-673e7dc43f69-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.529704 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.529678 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jlrb\" (UniqueName: \"kubernetes.io/projected/a9e780ce-144b-4e25-9172-673e7dc43f69-kube-api-access-6jlrb\") pod \"prometheus-k8s-0\" (UID: \"a9e780ce-144b-4e25-9172-673e7dc43f69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:28:28.629470 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.629387 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:28:28.757097 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:28.757058 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 19:28:28.763651 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:28:28.763597 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9e780ce_144b_4e25_9172_673e7dc43f69.slice/crio-b3d6fe85a84f9f5812e24dcce3e2f53a734f2fe34a3550121b69c9ce24aff487 WatchSource:0}: Error finding container b3d6fe85a84f9f5812e24dcce3e2f53a734f2fe34a3550121b69c9ce24aff487: Status 404 returned error can't find the container with id b3d6fe85a84f9f5812e24dcce3e2f53a734f2fe34a3550121b69c9ce24aff487
Apr 22 19:28:29.157721 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:29.157688 2574 generic.go:358] "Generic (PLEG): container finished" podID="a9e780ce-144b-4e25-9172-673e7dc43f69" containerID="bd82ca35d39c71a532cb4e262faefad929b2c387e8de33186ae2726b9ddb431b" exitCode=0
Apr 22 19:28:29.158173 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:29.157741 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a9e780ce-144b-4e25-9172-673e7dc43f69","Type":"ContainerDied","Data":"bd82ca35d39c71a532cb4e262faefad929b2c387e8de33186ae2726b9ddb431b"}
Apr 22 19:28:29.158173 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:29.157775 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a9e780ce-144b-4e25-9172-673e7dc43f69","Type":"ContainerStarted","Data":"b3d6fe85a84f9f5812e24dcce3e2f53a734f2fe34a3550121b69c9ce24aff487"}
Apr 22 19:28:29.354163 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:29.354132 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea47d7f-14f1-4dfa-a03b-bfc687e938cb" path="/var/lib/kubelet/pods/6ea47d7f-14f1-4dfa-a03b-bfc687e938cb/volumes"
Apr 22 19:28:30.165210 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:30.165174 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a9e780ce-144b-4e25-9172-673e7dc43f69","Type":"ContainerStarted","Data":"668ceb8c4f4560367127acf2f3d8425110893f3724d9f67a13b8ec238665be01"}
Apr 22 19:28:30.165210 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:30.165210 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a9e780ce-144b-4e25-9172-673e7dc43f69","Type":"ContainerStarted","Data":"57372b1967d3efb26ef4c5191b1fbf008ba57ba500365cbaa8d76134012a97fb"}
Apr 22 19:28:30.165649 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:30.165225 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a9e780ce-144b-4e25-9172-673e7dc43f69","Type":"ContainerStarted","Data":"c1ef1e3adf42034ce90bd3b4f72cddaa3b4adb397bfead221458486fe14fa861"}
Apr 22 19:28:30.165649 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:30.165237 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a9e780ce-144b-4e25-9172-673e7dc43f69","Type":"ContainerStarted","Data":"93e43b51492e1312c78d02c71e854f4e34e62b2146361357407fa9829fe011b9"}
Apr 22 19:28:30.165649 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:30.165248 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a9e780ce-144b-4e25-9172-673e7dc43f69","Type":"ContainerStarted","Data":"e2e7388310fa3a714ec9780466c7c8aff458534aa7668ed4b9ea0b9831ddbd53"}
Apr 22 19:28:30.165649 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:30.165259 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a9e780ce-144b-4e25-9172-673e7dc43f69","Type":"ContainerStarted","Data":"7698e441272f5ac7b0d285a314254ed3de4380c9c2e21a47cb625d943fb2b279"}
Apr 22 19:28:30.226449 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:30.226398 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.226384287 podStartE2EDuration="2.226384287s" podCreationTimestamp="2026-04-22 19:28:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:28:30.224292256 +0000 UTC m=+287.507962729" watchObservedRunningTime="2026-04-22 19:28:30.226384287 +0000 UTC m=+287.510054760"
Apr 22 19:28:33.630316 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:33.630283 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:28:38.161649 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:38.161616 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-lnt5s"
Apr 22 19:28:43.236167 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:28:43.236144 2574 kubelet.go:1628] "Image garbage collection succeeded"
Apr 22 19:29:28.630251 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:29:28.630216 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:29:28.645671 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:29:28.645646 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:29:29.358749 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:29:29.358721 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:32:24.087971 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:32:24.087934 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-v4gf2"]
Apr 22 19:32:24.091304 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:32:24.091265 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-v4gf2"
Apr 22 19:32:24.093992 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:32:24.093970 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 22 19:32:24.093992 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:32:24.093977 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-skvkc\""
Apr 22 19:32:24.095391 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:32:24.095374 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 22 19:32:24.100578 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:32:24.100558 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-v4gf2"]
Apr 22 19:32:24.211518 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:32:24.211488 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/be61d6d4-a0f1-44a9-b05f-e300c2d3b544-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-v4gf2\" (UID: \"be61d6d4-a0f1-44a9-b05f-e300c2d3b544\") " pod="cert-manager/cert-manager-webhook-587ccfb98-v4gf2"
Apr 22 19:32:24.211518 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:32:24.211520 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db29l\" (UniqueName: \"kubernetes.io/projected/be61d6d4-a0f1-44a9-b05f-e300c2d3b544-kube-api-access-db29l\") pod \"cert-manager-webhook-587ccfb98-v4gf2\" (UID: \"be61d6d4-a0f1-44a9-b05f-e300c2d3b544\") " pod="cert-manager/cert-manager-webhook-587ccfb98-v4gf2"
Apr 22 19:32:24.312834 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:32:24.312776 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/be61d6d4-a0f1-44a9-b05f-e300c2d3b544-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-v4gf2\" (UID: \"be61d6d4-a0f1-44a9-b05f-e300c2d3b544\") " pod="cert-manager/cert-manager-webhook-587ccfb98-v4gf2"
Apr 22 19:32:24.312834 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:32:24.312833 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-db29l\" (UniqueName: \"kubernetes.io/projected/be61d6d4-a0f1-44a9-b05f-e300c2d3b544-kube-api-access-db29l\") pod \"cert-manager-webhook-587ccfb98-v4gf2\" (UID: \"be61d6d4-a0f1-44a9-b05f-e300c2d3b544\") " pod="cert-manager/cert-manager-webhook-587ccfb98-v4gf2"
Apr 22 19:32:24.322338 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:32:24.322303 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/be61d6d4-a0f1-44a9-b05f-e300c2d3b544-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-v4gf2\" (UID: \"be61d6d4-a0f1-44a9-b05f-e300c2d3b544\") " pod="cert-manager/cert-manager-webhook-587ccfb98-v4gf2"
Apr 22 19:32:24.322454 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:32:24.322417 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-db29l\" (UniqueName: \"kubernetes.io/projected/be61d6d4-a0f1-44a9-b05f-e300c2d3b544-kube-api-access-db29l\") pod \"cert-manager-webhook-587ccfb98-v4gf2\" (UID: \"be61d6d4-a0f1-44a9-b05f-e300c2d3b544\") " pod="cert-manager/cert-manager-webhook-587ccfb98-v4gf2"
Apr 22 19:32:24.412573 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:32:24.412501 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-v4gf2"
Apr 22 19:32:24.531302 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:32:24.531271 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-v4gf2"]
Apr 22 19:32:24.534716 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:32:24.534675 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe61d6d4_a0f1_44a9_b05f_e300c2d3b544.slice/crio-63e8705efef1d0679f31d098e99332fb9b3455410d731315f61d635f99a19456 WatchSource:0}: Error finding container 63e8705efef1d0679f31d098e99332fb9b3455410d731315f61d635f99a19456: Status 404 returned error can't find the container with id 63e8705efef1d0679f31d098e99332fb9b3455410d731315f61d635f99a19456
Apr 22 19:32:24.536922 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:32:24.536904 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:32:24.846121 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:32:24.846081 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-v4gf2" event={"ID":"be61d6d4-a0f1-44a9-b05f-e300c2d3b544","Type":"ContainerStarted","Data":"63e8705efef1d0679f31d098e99332fb9b3455410d731315f61d635f99a19456"}
Apr 22 19:32:27.856717 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:32:27.856685 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-v4gf2" event={"ID":"be61d6d4-a0f1-44a9-b05f-e300c2d3b544","Type":"ContainerStarted","Data":"40416d1de0dd644bc127b01444f460f8849936fa1c1871c5550e3bebaf5db33d"}
Apr 22 19:32:27.857208 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:32:27.856801 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-v4gf2"
Apr 22 19:32:27.873221 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:32:27.873175 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-v4gf2" podStartSLOduration=0.735696717 podStartE2EDuration="3.873160975s" podCreationTimestamp="2026-04-22 19:32:24 +0000 UTC" firstStartedPulling="2026-04-22 19:32:24.537065669 +0000 UTC m=+521.820736126" lastFinishedPulling="2026-04-22 19:32:27.674529933 +0000 UTC m=+524.958200384" observedRunningTime="2026-04-22 19:32:27.872204395 +0000 UTC m=+525.155874882" watchObservedRunningTime="2026-04-22 19:32:27.873160975 +0000 UTC m=+525.156831445"
Apr 22 19:32:29.255717 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:32:29.255680 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-56h6w"]
Apr 22 19:32:29.259132 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:32:29.259109 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-56h6w"
Apr 22 19:32:29.261495 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:32:29.261469 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-gfvkn\""
Apr 22 19:32:29.266762 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:32:29.266738 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-56h6w"]
Apr 22 19:32:29.357201 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:32:29.357162 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7chx\" (UniqueName: \"kubernetes.io/projected/a250c5fd-33a2-4806-8411-ffd9fc8b0ac8-kube-api-access-g7chx\") pod \"cert-manager-cainjector-68b757865b-56h6w\" (UID: \"a250c5fd-33a2-4806-8411-ffd9fc8b0ac8\") " pod="cert-manager/cert-manager-cainjector-68b757865b-56h6w"
Apr 22 19:32:29.357367 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:32:29.357216 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a250c5fd-33a2-4806-8411-ffd9fc8b0ac8-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-56h6w\" (UID: \"a250c5fd-33a2-4806-8411-ffd9fc8b0ac8\") " pod="cert-manager/cert-manager-cainjector-68b757865b-56h6w"
Apr 22 19:32:29.457909 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:32:29.457874 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g7chx\" (UniqueName: \"kubernetes.io/projected/a250c5fd-33a2-4806-8411-ffd9fc8b0ac8-kube-api-access-g7chx\") pod \"cert-manager-cainjector-68b757865b-56h6w\" (UID: \"a250c5fd-33a2-4806-8411-ffd9fc8b0ac8\") " pod="cert-manager/cert-manager-cainjector-68b757865b-56h6w"
Apr 22 19:32:29.458100 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:32:29.457924 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a250c5fd-33a2-4806-8411-ffd9fc8b0ac8-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-56h6w\" (UID: \"a250c5fd-33a2-4806-8411-ffd9fc8b0ac8\") " pod="cert-manager/cert-manager-cainjector-68b757865b-56h6w"
Apr 22 19:32:29.466307 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:32:29.466271 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a250c5fd-33a2-4806-8411-ffd9fc8b0ac8-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-56h6w\" (UID: \"a250c5fd-33a2-4806-8411-ffd9fc8b0ac8\") " pod="cert-manager/cert-manager-cainjector-68b757865b-56h6w"
Apr 22 19:32:29.466440 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:32:29.466423 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7chx\" (UniqueName: \"kubernetes.io/projected/a250c5fd-33a2-4806-8411-ffd9fc8b0ac8-kube-api-access-g7chx\") pod \"cert-manager-cainjector-68b757865b-56h6w\" (UID: \"a250c5fd-33a2-4806-8411-ffd9fc8b0ac8\") " pod="cert-manager/cert-manager-cainjector-68b757865b-56h6w"
Apr 22 19:32:29.568724 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:32:29.568638 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-56h6w"
Apr 22 19:32:29.683632 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:32:29.683579 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-56h6w"]
Apr 22 19:32:29.686094 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:32:29.686064 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda250c5fd_33a2_4806_8411_ffd9fc8b0ac8.slice/crio-c69455d2dc59a351627f0ea450663f81b94492cfc825bde47ebe1b7371fb92a1 WatchSource:0}: Error finding container c69455d2dc59a351627f0ea450663f81b94492cfc825bde47ebe1b7371fb92a1: Status 404 returned error can't find the container with id c69455d2dc59a351627f0ea450663f81b94492cfc825bde47ebe1b7371fb92a1
Apr 22 19:32:29.864088 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:32:29.863999 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-56h6w" event={"ID":"a250c5fd-33a2-4806-8411-ffd9fc8b0ac8","Type":"ContainerStarted","Data":"c5ef6f73c1e62098d457779ac07ffe4ddaf5c47e5b75a3fdb1ec4092d06e1ce3"}
Apr 22 19:32:29.864088 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:32:29.864037 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-56h6w" event={"ID":"a250c5fd-33a2-4806-8411-ffd9fc8b0ac8","Type":"ContainerStarted","Data":"c69455d2dc59a351627f0ea450663f81b94492cfc825bde47ebe1b7371fb92a1"}
Apr 22 19:32:29.881179 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:32:29.881100 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-56h6w" podStartSLOduration=0.881082454 podStartE2EDuration="881.082454ms" podCreationTimestamp="2026-04-22 19:32:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:32:29.879670598 +0000 UTC m=+527.163341072" watchObservedRunningTime="2026-04-22 19:32:29.881082454 +0000 UTC m=+527.164752928"
Apr 22 19:32:33.862044 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:32:33.862016 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-v4gf2"
Apr 22 19:33:06.091033 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:33:06.090996 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-98c76994c-dhrwk"]
Apr 22 19:33:06.096638 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:33:06.096617 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-98c76994c-dhrwk"
Apr 22 19:33:06.103350 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:33:06.103327 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 22 19:33:06.104525 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:33:06.104503 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 22 19:33:06.104654 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:33:06.104634 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 22 19:33:06.104760 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:33:06.104661 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-ghr98\""
Apr 22 19:33:06.104856 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:33:06.104605 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 22 19:33:06.104856 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:33:06.104697 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 22 19:33:06.112249 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:33:06.112224 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-98c76994c-dhrwk"]
Apr 22 19:33:06.169831 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:33:06.169798 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/20ffec41-b93d-40f8-8aba-ee5a501bedc9-manager-config\") pod \"lws-controller-manager-98c76994c-dhrwk\" (UID: \"20ffec41-b93d-40f8-8aba-ee5a501bedc9\") " pod="openshift-lws-operator/lws-controller-manager-98c76994c-dhrwk"
Apr 22 19:33:06.169831 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:33:06.169833 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20ffec41-b93d-40f8-8aba-ee5a501bedc9-cert\") pod \"lws-controller-manager-98c76994c-dhrwk\" (UID: \"20ffec41-b93d-40f8-8aba-ee5a501bedc9\") " pod="openshift-lws-operator/lws-controller-manager-98c76994c-dhrwk"
Apr 22 19:33:06.170055 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:33:06.169943 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsnj2\" (UniqueName: \"kubernetes.io/projected/20ffec41-b93d-40f8-8aba-ee5a501bedc9-kube-api-access-wsnj2\") pod \"lws-controller-manager-98c76994c-dhrwk\" (UID: \"20ffec41-b93d-40f8-8aba-ee5a501bedc9\") " pod="openshift-lws-operator/lws-controller-manager-98c76994c-dhrwk"
Apr 22 19:33:06.170099 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:33:06.170061 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/20ffec41-b93d-40f8-8aba-ee5a501bedc9-metrics-cert\") pod \"lws-controller-manager-98c76994c-dhrwk\" (UID: \"20ffec41-b93d-40f8-8aba-ee5a501bedc9\") " pod="openshift-lws-operator/lws-controller-manager-98c76994c-dhrwk"
Apr 22 19:33:06.271060 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:33:06.271020 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/20ffec41-b93d-40f8-8aba-ee5a501bedc9-metrics-cert\") pod \"lws-controller-manager-98c76994c-dhrwk\" (UID: \"20ffec41-b93d-40f8-8aba-ee5a501bedc9\") " pod="openshift-lws-operator/lws-controller-manager-98c76994c-dhrwk"
Apr 22 19:33:06.271060 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:33:06.271065 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/20ffec41-b93d-40f8-8aba-ee5a501bedc9-manager-config\") pod \"lws-controller-manager-98c76994c-dhrwk\" (UID: \"20ffec41-b93d-40f8-8aba-ee5a501bedc9\") " pod="openshift-lws-operator/lws-controller-manager-98c76994c-dhrwk"
Apr 22 19:33:06.271266 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:33:06.271087 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20ffec41-b93d-40f8-8aba-ee5a501bedc9-cert\") pod \"lws-controller-manager-98c76994c-dhrwk\" (UID: \"20ffec41-b93d-40f8-8aba-ee5a501bedc9\") " pod="openshift-lws-operator/lws-controller-manager-98c76994c-dhrwk"
Apr 22 19:33:06.271266 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:33:06.271130 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wsnj2\" (UniqueName: \"kubernetes.io/projected/20ffec41-b93d-40f8-8aba-ee5a501bedc9-kube-api-access-wsnj2\") pod \"lws-controller-manager-98c76994c-dhrwk\" (UID: \"20ffec41-b93d-40f8-8aba-ee5a501bedc9\") " pod="openshift-lws-operator/lws-controller-manager-98c76994c-dhrwk"
Apr 22 19:33:06.271775 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:33:06.271748 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/20ffec41-b93d-40f8-8aba-ee5a501bedc9-manager-config\") pod \"lws-controller-manager-98c76994c-dhrwk\" (UID: \"20ffec41-b93d-40f8-8aba-ee5a501bedc9\") " pod="openshift-lws-operator/lws-controller-manager-98c76994c-dhrwk"
Apr 22 19:33:06.273531 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:33:06.273506 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20ffec41-b93d-40f8-8aba-ee5a501bedc9-cert\") pod \"lws-controller-manager-98c76994c-dhrwk\" (UID: \"20ffec41-b93d-40f8-8aba-ee5a501bedc9\") " pod="openshift-lws-operator/lws-controller-manager-98c76994c-dhrwk"
Apr 22 19:33:06.273634 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:33:06.273584 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/20ffec41-b93d-40f8-8aba-ee5a501bedc9-metrics-cert\") pod \"lws-controller-manager-98c76994c-dhrwk\" (UID: \"20ffec41-b93d-40f8-8aba-ee5a501bedc9\") " pod="openshift-lws-operator/lws-controller-manager-98c76994c-dhrwk"
Apr 22 19:33:06.292553 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:33:06.292520 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsnj2\" (UniqueName: \"kubernetes.io/projected/20ffec41-b93d-40f8-8aba-ee5a501bedc9-kube-api-access-wsnj2\") pod \"lws-controller-manager-98c76994c-dhrwk\" (UID: \"20ffec41-b93d-40f8-8aba-ee5a501bedc9\") " pod="openshift-lws-operator/lws-controller-manager-98c76994c-dhrwk"
Apr 22 19:33:06.405930 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:33:06.405836 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-98c76994c-dhrwk"
Apr 22 19:33:06.555429 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:33:06.555399 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-98c76994c-dhrwk"]
Apr 22 19:33:06.557530 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:33:06.557496 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20ffec41_b93d_40f8_8aba_ee5a501bedc9.slice/crio-e94576b1fa2f06d82a0f2775006706bdcb8c6b0f9c6521b8891f463032ec3640 WatchSource:0}: Error finding container e94576b1fa2f06d82a0f2775006706bdcb8c6b0f9c6521b8891f463032ec3640: Status 404 returned error can't find the container with id e94576b1fa2f06d82a0f2775006706bdcb8c6b0f9c6521b8891f463032ec3640
Apr 22 19:33:06.979752 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:33:06.979717 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-98c76994c-dhrwk" event={"ID":"20ffec41-b93d-40f8-8aba-ee5a501bedc9","Type":"ContainerStarted","Data":"e94576b1fa2f06d82a0f2775006706bdcb8c6b0f9c6521b8891f463032ec3640"}
Apr 22 19:33:09.991425 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:33:09.991389 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-98c76994c-dhrwk" event={"ID":"20ffec41-b93d-40f8-8aba-ee5a501bedc9","Type":"ContainerStarted","Data":"f411c8b95e2b5c425856f2f4393b68372219f995f511e87dfaeceeaf36bf9f31"}
Apr 22 19:33:09.991827 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:33:09.991514 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-98c76994c-dhrwk"
Apr 22 19:33:10.018818 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:33:10.018746 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-98c76994c-dhrwk" podStartSLOduration=1.273393661 podStartE2EDuration="4.018726857s" podCreationTimestamp="2026-04-22 19:33:06 +0000 UTC" firstStartedPulling="2026-04-22 19:33:06.559464825 +0000 UTC m=+563.843135276" lastFinishedPulling="2026-04-22 19:33:09.304798004 +0000 UTC m=+566.588468472" observedRunningTime="2026-04-22 19:33:10.016414392 +0000 UTC m=+567.300084866" watchObservedRunningTime="2026-04-22 19:33:10.018726857 +0000 UTC m=+567.302397331"
Apr 22 19:33:20.998645 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:33:20.998613 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-98c76994c-dhrwk"
Apr 22 19:34:06.390645 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:06.390556 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-b7dtm"]
Apr 22 19:34:06.393895 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:06.393878 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-b7dtm"
Apr 22 19:34:06.396752 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:06.396730 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 22 19:34:06.397008 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:06.396985 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 22 19:34:06.397124 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:06.397055 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-94gpl\""
Apr 22 19:34:06.406284 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:06.406263 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-b7dtm"]
Apr 22 19:34:06.470958 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:06.470927 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4ad04b3e-845a-4da0-a28f-4ca4e954f9ac-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-b7dtm\" (UID: \"4ad04b3e-845a-4da0-a28f-4ca4e954f9ac\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-b7dtm"
Apr 22 19:34:06.471117 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:06.471010 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtv9r\" (UniqueName: \"kubernetes.io/projected/4ad04b3e-845a-4da0-a28f-4ca4e954f9ac-kube-api-access-xtv9r\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-b7dtm\" (UID: \"4ad04b3e-845a-4da0-a28f-4ca4e954f9ac\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-b7dtm"
Apr 22 19:34:06.571362 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:06.571327 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xtv9r\" (UniqueName: \"kubernetes.io/projected/4ad04b3e-845a-4da0-a28f-4ca4e954f9ac-kube-api-access-xtv9r\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-b7dtm\" (UID: \"4ad04b3e-845a-4da0-a28f-4ca4e954f9ac\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-b7dtm"
Apr 22 19:34:06.571528 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:06.571385 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4ad04b3e-845a-4da0-a28f-4ca4e954f9ac-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-b7dtm\" (UID: \"4ad04b3e-845a-4da0-a28f-4ca4e954f9ac\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-b7dtm"
Apr 22 19:34:06.571702 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:06.571686 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4ad04b3e-845a-4da0-a28f-4ca4e954f9ac-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-b7dtm\" (UID: \"4ad04b3e-845a-4da0-a28f-4ca4e954f9ac\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-b7dtm"
Apr 22 19:34:06.590129 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:06.590097 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtv9r\" (UniqueName: \"kubernetes.io/projected/4ad04b3e-845a-4da0-a28f-4ca4e954f9ac-kube-api-access-xtv9r\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-b7dtm\" (UID: \"4ad04b3e-845a-4da0-a28f-4ca4e954f9ac\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-b7dtm"
Apr 22 19:34:06.704432 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:06.704342 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-b7dtm"
Apr 22 19:34:06.831724 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:06.831702 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-b7dtm"]
Apr 22 19:34:06.834564 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:34:06.834535 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ad04b3e_845a_4da0_a28f_4ca4e954f9ac.slice/crio-bc3466f7e6697b7d2a27ebb0a8adfd1aa3d20b0576b1fcb6a4aafa8b694db0b1 WatchSource:0}: Error finding container bc3466f7e6697b7d2a27ebb0a8adfd1aa3d20b0576b1fcb6a4aafa8b694db0b1: Status 404 returned error can't find the container with id bc3466f7e6697b7d2a27ebb0a8adfd1aa3d20b0576b1fcb6a4aafa8b694db0b1
Apr 22 19:34:07.168666 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:07.168627 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-b7dtm" event={"ID":"4ad04b3e-845a-4da0-a28f-4ca4e954f9ac","Type":"ContainerStarted","Data":"bc3466f7e6697b7d2a27ebb0a8adfd1aa3d20b0576b1fcb6a4aafa8b694db0b1"}
Apr 22 19:34:11.252512 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:11.252475 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-m2zrb"]
Apr 22 19:34:11.255591 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:11.255574 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-m2zrb"
Apr 22 19:34:11.258159 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:11.258132 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\""
Apr 22 19:34:11.258275 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:11.258135 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-2ncpz\""
Apr 22 19:34:11.265508 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:11.265483 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-m2zrb"]
Apr 22 19:34:11.316154 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:11.316125 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vf6d\" (UniqueName: \"kubernetes.io/projected/d0950e55-6d4a-44e3-9bcb-cf034b753b22-kube-api-access-7vf6d\") pod \"dns-operator-controller-manager-844548ff4c-m2zrb\" (UID: \"d0950e55-6d4a-44e3-9bcb-cf034b753b22\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-m2zrb"
Apr 22 19:34:11.417330 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:11.417300 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7vf6d\" (UniqueName: \"kubernetes.io/projected/d0950e55-6d4a-44e3-9bcb-cf034b753b22-kube-api-access-7vf6d\") pod \"dns-operator-controller-manager-844548ff4c-m2zrb\" (UID: \"d0950e55-6d4a-44e3-9bcb-cf034b753b22\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-m2zrb"
Apr 22 19:34:11.430256 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:11.430228 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vf6d\" (UniqueName: \"kubernetes.io/projected/d0950e55-6d4a-44e3-9bcb-cf034b753b22-kube-api-access-7vf6d\") pod \"dns-operator-controller-manager-844548ff4c-m2zrb\" (UID: \"d0950e55-6d4a-44e3-9bcb-cf034b753b22\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-m2zrb"
Apr 22 19:34:11.566703 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:11.566602 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-m2zrb"
Apr 22 19:34:11.695417 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:11.695374 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-m2zrb"]
Apr 22 19:34:11.697884 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:34:11.697855 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0950e55_6d4a_44e3_9bcb_cf034b753b22.slice/crio-b88b25f03c00d2bd69e45f3771cea1dbabd390aa2cced03d7aaefea0094465f1 WatchSource:0}: Error finding container b88b25f03c00d2bd69e45f3771cea1dbabd390aa2cced03d7aaefea0094465f1: Status 404 returned error can't find the container with id b88b25f03c00d2bd69e45f3771cea1dbabd390aa2cced03d7aaefea0094465f1
Apr 22 19:34:12.187885 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:12.187849 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-b7dtm" event={"ID":"4ad04b3e-845a-4da0-a28f-4ca4e954f9ac","Type":"ContainerStarted","Data":"4605e219bad1a014ccfc02447f2ba43bb5264b3d863710e5e12b9eb649cfe9e5"}
Apr 22 19:34:12.188073 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:12.187946 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-b7dtm"
Apr 22 19:34:12.188930 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:12.188904 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-m2zrb" event={"ID":"d0950e55-6d4a-44e3-9bcb-cf034b753b22","Type":"ContainerStarted","Data":"b88b25f03c00d2bd69e45f3771cea1dbabd390aa2cced03d7aaefea0094465f1"}
Apr 22 19:34:12.211014 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:12.210969 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-b7dtm" podStartSLOduration=1.724780012 podStartE2EDuration="6.210953311s" podCreationTimestamp="2026-04-22 19:34:06 +0000 UTC" firstStartedPulling="2026-04-22 19:34:06.836936825 +0000 UTC m=+624.120607280" lastFinishedPulling="2026-04-22 19:34:11.323110115 +0000 UTC m=+628.606780579" observedRunningTime="2026-04-22 19:34:12.209070856 +0000 UTC m=+629.492741340" watchObservedRunningTime="2026-04-22 19:34:12.210953311 +0000 UTC m=+629.494623789"
Apr 22 19:34:15.206554 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:15.206516 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-m2zrb" event={"ID":"d0950e55-6d4a-44e3-9bcb-cf034b753b22","Type":"ContainerStarted","Data":"f17e10a2e1abfcbca1037650ad367c06ed393da4db3a8abe9b67aee74f9c613f"}
Apr 22 19:34:15.206955 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:15.206582 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-m2zrb"
Apr 22 19:34:15.230581 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:15.230530 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-m2zrb" podStartSLOduration=1.635359431 podStartE2EDuration="4.230517393s" podCreationTimestamp="2026-04-22 19:34:11 +0000 UTC" firstStartedPulling="2026-04-22 19:34:11.699868549 +0000 UTC m=+628.983539002" lastFinishedPulling="2026-04-22 19:34:14.2950265 +0000 UTC m=+631.578696964" observedRunningTime="2026-04-22 19:34:15.23016723 +0000 UTC m=+632.513837707"
watchObservedRunningTime="2026-04-22 19:34:15.230517393 +0000 UTC m=+632.514187865" Apr 22 19:34:23.200441 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:23.200408 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-b7dtm" Apr 22 19:34:26.213047 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:26.213013 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-m2zrb" Apr 22 19:34:56.208379 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:56.208343 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-jsk7d"] Apr 22 19:34:56.218092 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:56.218065 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-jsk7d" Apr 22 19:34:56.221085 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:56.221050 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-gl48n\"" Apr 22 19:34:56.221369 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:56.221349 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 22 19:34:56.224644 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:56.224616 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-jsk7d"] Apr 22 19:34:56.308821 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:56.308773 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-jsk7d"] Apr 22 19:34:56.314471 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:56.314447 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: 
\"kubernetes.io/configmap/44db1d11-6091-4b23-8270-734ee564e579-config-file\") pod \"limitador-limitador-64c8f475fb-jsk7d\" (UID: \"44db1d11-6091-4b23-8270-734ee564e579\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-jsk7d" Apr 22 19:34:56.314610 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:56.314495 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h26h5\" (UniqueName: \"kubernetes.io/projected/44db1d11-6091-4b23-8270-734ee564e579-kube-api-access-h26h5\") pod \"limitador-limitador-64c8f475fb-jsk7d\" (UID: \"44db1d11-6091-4b23-8270-734ee564e579\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-jsk7d" Apr 22 19:34:56.415143 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:56.415107 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h26h5\" (UniqueName: \"kubernetes.io/projected/44db1d11-6091-4b23-8270-734ee564e579-kube-api-access-h26h5\") pod \"limitador-limitador-64c8f475fb-jsk7d\" (UID: \"44db1d11-6091-4b23-8270-734ee564e579\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-jsk7d" Apr 22 19:34:56.415333 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:56.415214 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/44db1d11-6091-4b23-8270-734ee564e579-config-file\") pod \"limitador-limitador-64c8f475fb-jsk7d\" (UID: \"44db1d11-6091-4b23-8270-734ee564e579\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-jsk7d" Apr 22 19:34:56.415863 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:56.415845 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/44db1d11-6091-4b23-8270-734ee564e579-config-file\") pod \"limitador-limitador-64c8f475fb-jsk7d\" (UID: \"44db1d11-6091-4b23-8270-734ee564e579\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-jsk7d" Apr 22 19:34:56.423419 
ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:56.423391 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h26h5\" (UniqueName: \"kubernetes.io/projected/44db1d11-6091-4b23-8270-734ee564e579-kube-api-access-h26h5\") pod \"limitador-limitador-64c8f475fb-jsk7d\" (UID: \"44db1d11-6091-4b23-8270-734ee564e579\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-jsk7d" Apr 22 19:34:56.529710 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:56.529677 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-jsk7d" Apr 22 19:34:56.653846 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:56.653815 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-jsk7d"] Apr 22 19:34:56.656146 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:34:56.656117 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44db1d11_6091_4b23_8270_734ee564e579.slice/crio-9ef47cf3541cd2a2d10110c2bbbc3006feddb4ed717632c139a5882d853a296b WatchSource:0}: Error finding container 9ef47cf3541cd2a2d10110c2bbbc3006feddb4ed717632c139a5882d853a296b: Status 404 returned error can't find the container with id 9ef47cf3541cd2a2d10110c2bbbc3006feddb4ed717632c139a5882d853a296b Apr 22 19:34:57.344016 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:34:57.343980 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-jsk7d" event={"ID":"44db1d11-6091-4b23-8270-734ee564e579","Type":"ContainerStarted","Data":"9ef47cf3541cd2a2d10110c2bbbc3006feddb4ed717632c139a5882d853a296b"} Apr 22 19:35:01.361927 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:01.361890 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-jsk7d" 
event={"ID":"44db1d11-6091-4b23-8270-734ee564e579","Type":"ContainerStarted","Data":"1ec75025d09b0a1ec328e0aba9e5f10226864a69269cbd0956e182ceac7a86ac"} Apr 22 19:35:01.362409 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:01.362033 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-jsk7d" Apr 22 19:35:01.381817 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:01.381751 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-64c8f475fb-jsk7d" podStartSLOduration=1.558227633 podStartE2EDuration="5.38173708s" podCreationTimestamp="2026-04-22 19:34:56 +0000 UTC" firstStartedPulling="2026-04-22 19:34:56.658478919 +0000 UTC m=+673.942149370" lastFinishedPulling="2026-04-22 19:35:00.481988355 +0000 UTC m=+677.765658817" observedRunningTime="2026-04-22 19:35:01.380454937 +0000 UTC m=+678.664125413" watchObservedRunningTime="2026-04-22 19:35:01.38173708 +0000 UTC m=+678.665407553" Apr 22 19:35:12.104679 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:12.104644 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-jsk7d"] Apr 22 19:35:12.105100 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:12.104959 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-64c8f475fb-jsk7d" podUID="44db1d11-6091-4b23-8270-734ee564e579" containerName="limitador" containerID="cri-o://1ec75025d09b0a1ec328e0aba9e5f10226864a69269cbd0956e182ceac7a86ac" gracePeriod=30 Apr 22 19:35:12.105732 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:12.105643 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-jsk7d" Apr 22 19:35:12.645961 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:12.645938 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-jsk7d" Apr 22 19:35:12.762445 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:12.762414 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h26h5\" (UniqueName: \"kubernetes.io/projected/44db1d11-6091-4b23-8270-734ee564e579-kube-api-access-h26h5\") pod \"44db1d11-6091-4b23-8270-734ee564e579\" (UID: \"44db1d11-6091-4b23-8270-734ee564e579\") " Apr 22 19:35:12.762608 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:12.762462 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/44db1d11-6091-4b23-8270-734ee564e579-config-file\") pod \"44db1d11-6091-4b23-8270-734ee564e579\" (UID: \"44db1d11-6091-4b23-8270-734ee564e579\") " Apr 22 19:35:12.762871 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:12.762845 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44db1d11-6091-4b23-8270-734ee564e579-config-file" (OuterVolumeSpecName: "config-file") pod "44db1d11-6091-4b23-8270-734ee564e579" (UID: "44db1d11-6091-4b23-8270-734ee564e579"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:35:12.764587 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:12.764561 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44db1d11-6091-4b23-8270-734ee564e579-kube-api-access-h26h5" (OuterVolumeSpecName: "kube-api-access-h26h5") pod "44db1d11-6091-4b23-8270-734ee564e579" (UID: "44db1d11-6091-4b23-8270-734ee564e579"). InnerVolumeSpecName "kube-api-access-h26h5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:35:12.863151 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:12.863102 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h26h5\" (UniqueName: \"kubernetes.io/projected/44db1d11-6091-4b23-8270-734ee564e579-kube-api-access-h26h5\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:35:12.863151 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:12.863142 2574 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/44db1d11-6091-4b23-8270-734ee564e579-config-file\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:35:13.408409 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:13.408363 2574 generic.go:358] "Generic (PLEG): container finished" podID="44db1d11-6091-4b23-8270-734ee564e579" containerID="1ec75025d09b0a1ec328e0aba9e5f10226864a69269cbd0956e182ceac7a86ac" exitCode=0 Apr 22 19:35:13.408881 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:13.408426 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-jsk7d" event={"ID":"44db1d11-6091-4b23-8270-734ee564e579","Type":"ContainerDied","Data":"1ec75025d09b0a1ec328e0aba9e5f10226864a69269cbd0956e182ceac7a86ac"} Apr 22 19:35:13.408881 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:13.408456 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-jsk7d" Apr 22 19:35:13.408881 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:13.408466 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-jsk7d" event={"ID":"44db1d11-6091-4b23-8270-734ee564e579","Type":"ContainerDied","Data":"9ef47cf3541cd2a2d10110c2bbbc3006feddb4ed717632c139a5882d853a296b"} Apr 22 19:35:13.408881 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:13.408482 2574 scope.go:117] "RemoveContainer" containerID="1ec75025d09b0a1ec328e0aba9e5f10226864a69269cbd0956e182ceac7a86ac" Apr 22 19:35:13.416319 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:13.416305 2574 scope.go:117] "RemoveContainer" containerID="1ec75025d09b0a1ec328e0aba9e5f10226864a69269cbd0956e182ceac7a86ac" Apr 22 19:35:13.416558 ip-10-0-134-22 kubenswrapper[2574]: E0422 19:35:13.416539 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ec75025d09b0a1ec328e0aba9e5f10226864a69269cbd0956e182ceac7a86ac\": container with ID starting with 1ec75025d09b0a1ec328e0aba9e5f10226864a69269cbd0956e182ceac7a86ac not found: ID does not exist" containerID="1ec75025d09b0a1ec328e0aba9e5f10226864a69269cbd0956e182ceac7a86ac" Apr 22 19:35:13.416601 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:13.416565 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ec75025d09b0a1ec328e0aba9e5f10226864a69269cbd0956e182ceac7a86ac"} err="failed to get container status \"1ec75025d09b0a1ec328e0aba9e5f10226864a69269cbd0956e182ceac7a86ac\": rpc error: code = NotFound desc = could not find container \"1ec75025d09b0a1ec328e0aba9e5f10226864a69269cbd0956e182ceac7a86ac\": container with ID starting with 1ec75025d09b0a1ec328e0aba9e5f10226864a69269cbd0956e182ceac7a86ac not found: ID does not exist" Apr 22 19:35:13.425230 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:13.425211 
2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-jsk7d"] Apr 22 19:35:13.428490 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:13.428467 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-jsk7d"] Apr 22 19:35:15.350299 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:15.350270 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44db1d11-6091-4b23-8270-734ee564e579" path="/var/lib/kubelet/pods/44db1d11-6091-4b23-8270-734ee564e579/volumes" Apr 22 19:35:31.293503 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:31.293467 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-ndcbr"] Apr 22 19:35:31.293997 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:31.293843 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="44db1d11-6091-4b23-8270-734ee564e579" containerName="limitador" Apr 22 19:35:31.293997 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:31.293856 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="44db1d11-6091-4b23-8270-734ee564e579" containerName="limitador" Apr 22 19:35:31.293997 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:31.293919 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="44db1d11-6091-4b23-8270-734ee564e579" containerName="limitador" Apr 22 19:35:31.296729 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:31.296712 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ndcbr" Apr 22 19:35:31.299611 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:31.299587 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 22 19:35:31.299611 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:31.299602 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 22 19:35:31.300123 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:31.299655 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 22 19:35:31.300123 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:31.299666 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 22 19:35:31.300123 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:31.299679 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-mf4cd\"" Apr 22 19:35:31.300123 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:31.299713 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 22 19:35:31.300569 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:31.300551 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 22 19:35:31.308414 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:31.308393 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-ndcbr"] Apr 22 19:35:31.415204 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:31.415165 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: 
\"kubernetes.io/empty-dir/f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-ndcbr\" (UID: \"f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ndcbr" Apr 22 19:35:31.415204 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:31.415202 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-ndcbr\" (UID: \"f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ndcbr" Apr 22 19:35:31.415436 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:31.415322 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc69z\" (UniqueName: \"kubernetes.io/projected/f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e-kube-api-access-wc69z\") pod \"istiod-openshift-gateway-55ff986f96-ndcbr\" (UID: \"f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ndcbr" Apr 22 19:35:31.415436 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:31.415359 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-ndcbr\" (UID: \"f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ndcbr" Apr 22 19:35:31.415436 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:31.415385 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-ndcbr\" (UID: 
\"f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ndcbr" Apr 22 19:35:31.415436 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:31.415419 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-ndcbr\" (UID: \"f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ndcbr" Apr 22 19:35:31.415583 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:31.415447 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-ndcbr\" (UID: \"f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ndcbr" Apr 22 19:35:31.516216 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:31.516177 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wc69z\" (UniqueName: \"kubernetes.io/projected/f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e-kube-api-access-wc69z\") pod \"istiod-openshift-gateway-55ff986f96-ndcbr\" (UID: \"f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ndcbr" Apr 22 19:35:31.516216 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:31.516223 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-ndcbr\" (UID: \"f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ndcbr" Apr 22 19:35:31.516531 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:31.516245 
2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-ndcbr\" (UID: \"f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ndcbr" Apr 22 19:35:31.516531 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:31.516275 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-ndcbr\" (UID: \"f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ndcbr" Apr 22 19:35:31.516531 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:31.516306 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-ndcbr\" (UID: \"f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ndcbr" Apr 22 19:35:31.516531 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:31.516361 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-ndcbr\" (UID: \"f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ndcbr" Apr 22 19:35:31.516531 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:31.516392 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e-istio-token\") pod 
\"istiod-openshift-gateway-55ff986f96-ndcbr\" (UID: \"f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ndcbr" Apr 22 19:35:31.517061 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:31.517037 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-ndcbr\" (UID: \"f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ndcbr" Apr 22 19:35:31.519003 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:31.518971 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-ndcbr\" (UID: \"f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ndcbr" Apr 22 19:35:31.519240 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:31.519116 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-ndcbr\" (UID: \"f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ndcbr" Apr 22 19:35:31.519338 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:31.519289 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-ndcbr\" (UID: \"f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ndcbr" Apr 22 19:35:31.519478 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:31.519453 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-ndcbr\" (UID: \"f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ndcbr" Apr 22 19:35:31.525423 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:31.525395 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-ndcbr\" (UID: \"f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ndcbr" Apr 22 19:35:31.525977 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:31.525952 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc69z\" (UniqueName: \"kubernetes.io/projected/f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e-kube-api-access-wc69z\") pod \"istiod-openshift-gateway-55ff986f96-ndcbr\" (UID: \"f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ndcbr" Apr 22 19:35:31.610986 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:31.606594 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ndcbr"
Apr 22 19:35:31.748344 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:31.748316 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-ndcbr"]
Apr 22 19:35:31.750796 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:35:31.750750 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9b091e1_c5c0_4ad1_8ee8_8cb03bd1318e.slice/crio-78d258f31ce2e67d93af4f71e0d1c50604ded1693b247157d9a60409c338d71e WatchSource:0}: Error finding container 78d258f31ce2e67d93af4f71e0d1c50604ded1693b247157d9a60409c338d71e: Status 404 returned error can't find the container with id 78d258f31ce2e67d93af4f71e0d1c50604ded1693b247157d9a60409c338d71e
Apr 22 19:35:32.479334 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:32.479297 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ndcbr" event={"ID":"f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e","Type":"ContainerStarted","Data":"78d258f31ce2e67d93af4f71e0d1c50604ded1693b247157d9a60409c338d71e"}
Apr 22 19:35:34.425124 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:34.425088 2574 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 22 19:35:34.425375 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:34.425157 2574 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 22 19:35:35.491520 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:35.491476 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ndcbr" event={"ID":"f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e","Type":"ContainerStarted","Data":"529d3d1d69fb28d5c8e808e81130d99ca3d611cdd6598a64aaf72508bed8f106"}
Apr 22 19:35:35.492012 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:35.491628 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ndcbr"
Apr 22 19:35:35.493320 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:35.493300 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ndcbr"
Apr 22 19:35:35.517736 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:35:35.517674 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ndcbr" podStartSLOduration=1.845679667 podStartE2EDuration="4.517654219s" podCreationTimestamp="2026-04-22 19:35:31 +0000 UTC" firstStartedPulling="2026-04-22 19:35:31.752817848 +0000 UTC m=+709.036488299" lastFinishedPulling="2026-04-22 19:35:34.424792397 +0000 UTC m=+711.708462851" observedRunningTime="2026-04-22 19:35:35.516110975 +0000 UTC m=+712.799781448" watchObservedRunningTime="2026-04-22 19:35:35.517654219 +0000 UTC m=+712.801324694"
Apr 22 19:52:43.154194 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:52:43.154148 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-sk59w/must-gather-cl96l"]
Apr 22 19:52:43.156727 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:52:43.156705 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sk59w/must-gather-cl96l"
Apr 22 19:52:43.159559 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:52:43.159527 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-sk59w\"/\"kube-root-ca.crt\""
Apr 22 19:52:43.159559 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:52:43.159548 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-sk59w\"/\"default-dockercfg-79pxb\""
Apr 22 19:52:43.160693 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:52:43.160678 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-sk59w\"/\"openshift-service-ca.crt\""
Apr 22 19:52:43.165341 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:52:43.165316 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-sk59w/must-gather-cl96l"]
Apr 22 19:52:43.287259 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:52:43.287215 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llzkl\" (UniqueName: \"kubernetes.io/projected/bc905e5a-ff9b-407f-a50c-415eb5b9699a-kube-api-access-llzkl\") pod \"must-gather-cl96l\" (UID: \"bc905e5a-ff9b-407f-a50c-415eb5b9699a\") " pod="openshift-must-gather-sk59w/must-gather-cl96l"
Apr 22 19:52:43.287413 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:52:43.287301 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bc905e5a-ff9b-407f-a50c-415eb5b9699a-must-gather-output\") pod \"must-gather-cl96l\" (UID: \"bc905e5a-ff9b-407f-a50c-415eb5b9699a\") " pod="openshift-must-gather-sk59w/must-gather-cl96l"
Apr 22 19:52:43.388365 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:52:43.388321 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bc905e5a-ff9b-407f-a50c-415eb5b9699a-must-gather-output\") pod \"must-gather-cl96l\" (UID: \"bc905e5a-ff9b-407f-a50c-415eb5b9699a\") " pod="openshift-must-gather-sk59w/must-gather-cl96l"
Apr 22 19:52:43.388550 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:52:43.388502 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-llzkl\" (UniqueName: \"kubernetes.io/projected/bc905e5a-ff9b-407f-a50c-415eb5b9699a-kube-api-access-llzkl\") pod \"must-gather-cl96l\" (UID: \"bc905e5a-ff9b-407f-a50c-415eb5b9699a\") " pod="openshift-must-gather-sk59w/must-gather-cl96l"
Apr 22 19:52:43.388718 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:52:43.388699 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bc905e5a-ff9b-407f-a50c-415eb5b9699a-must-gather-output\") pod \"must-gather-cl96l\" (UID: \"bc905e5a-ff9b-407f-a50c-415eb5b9699a\") " pod="openshift-must-gather-sk59w/must-gather-cl96l"
Apr 22 19:52:43.399046 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:52:43.399019 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-sk59w\"/\"kube-root-ca.crt\""
Apr 22 19:52:43.409641 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:52:43.409569 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-sk59w\"/\"openshift-service-ca.crt\""
Apr 22 19:52:43.419645 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:52:43.419618 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-llzkl\" (UniqueName: \"kubernetes.io/projected/bc905e5a-ff9b-407f-a50c-415eb5b9699a-kube-api-access-llzkl\") pod \"must-gather-cl96l\" (UID: \"bc905e5a-ff9b-407f-a50c-415eb5b9699a\") " pod="openshift-must-gather-sk59w/must-gather-cl96l"
Apr 22 19:52:43.469308 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:52:43.469273 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-sk59w\"/\"default-dockercfg-79pxb\""
Apr 22 19:52:43.477507 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:52:43.477484 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sk59w/must-gather-cl96l"
Apr 22 19:52:43.600566 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:52:43.600537 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-sk59w/must-gather-cl96l"]
Apr 22 19:52:43.603254 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:52:43.603226 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc905e5a_ff9b_407f_a50c_415eb5b9699a.slice/crio-3e86f0e18fc03a80787e01db1888177480530a0ad7de552326416e164c25da08 WatchSource:0}: Error finding container 3e86f0e18fc03a80787e01db1888177480530a0ad7de552326416e164c25da08: Status 404 returned error can't find the container with id 3e86f0e18fc03a80787e01db1888177480530a0ad7de552326416e164c25da08
Apr 22 19:52:43.604709 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:52:43.604693 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:52:43.956739 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:52:43.956701 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sk59w/must-gather-cl96l" event={"ID":"bc905e5a-ff9b-407f-a50c-415eb5b9699a","Type":"ContainerStarted","Data":"3e86f0e18fc03a80787e01db1888177480530a0ad7de552326416e164c25da08"}
Apr 22 19:52:47.976508 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:52:47.976446 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sk59w/must-gather-cl96l" event={"ID":"bc905e5a-ff9b-407f-a50c-415eb5b9699a","Type":"ContainerStarted","Data":"34c08d63ce29dd16461bf5f2444f69fc07ac141a792000cd6cc87aad9bca944f"}
Apr 22 19:52:48.981845 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:52:48.981807 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sk59w/must-gather-cl96l" event={"ID":"bc905e5a-ff9b-407f-a50c-415eb5b9699a","Type":"ContainerStarted","Data":"9d11dc02abea5904b21a89904c369179d88fa6961a95a3441cf8468499df02e8"}
Apr 22 19:52:48.999285 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:52:48.999221 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-sk59w/must-gather-cl96l" podStartSLOduration=1.784237243 podStartE2EDuration="5.999201725s" podCreationTimestamp="2026-04-22 19:52:43 +0000 UTC" firstStartedPulling="2026-04-22 19:52:43.604840799 +0000 UTC m=+1740.888511250" lastFinishedPulling="2026-04-22 19:52:47.81980528 +0000 UTC m=+1745.103475732" observedRunningTime="2026-04-22 19:52:48.997807778 +0000 UTC m=+1746.281478267" watchObservedRunningTime="2026-04-22 19:52:48.999201725 +0000 UTC m=+1746.282872199"
Apr 22 19:53:11.139332 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:11.139303 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-ndcbr_f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e/discovery/0.log"
Apr 22 19:53:11.945391 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:11.945358 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-ndcbr_f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e/discovery/0.log"
Apr 22 19:53:12.743295 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:12.743270 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-m2zrb_d0950e55-6d4a-44e3-9bcb-cf034b753b22/manager/0.log"
Apr 22 19:53:12.793704 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:12.793678 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-b7dtm_4ad04b3e-845a-4da0-a28f-4ca4e954f9ac/manager/0.log"
Apr 22 19:53:14.072918 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:14.072885 2574 generic.go:358] "Generic (PLEG): container finished" podID="bc905e5a-ff9b-407f-a50c-415eb5b9699a" containerID="34c08d63ce29dd16461bf5f2444f69fc07ac141a792000cd6cc87aad9bca944f" exitCode=0
Apr 22 19:53:14.073340 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:14.072958 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sk59w/must-gather-cl96l" event={"ID":"bc905e5a-ff9b-407f-a50c-415eb5b9699a","Type":"ContainerDied","Data":"34c08d63ce29dd16461bf5f2444f69fc07ac141a792000cd6cc87aad9bca944f"}
Apr 22 19:53:14.073340 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:14.073273 2574 scope.go:117] "RemoveContainer" containerID="34c08d63ce29dd16461bf5f2444f69fc07ac141a792000cd6cc87aad9bca944f"
Apr 22 19:53:14.451100 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:14.451013 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-sk59w_must-gather-cl96l_bc905e5a-ff9b-407f-a50c-415eb5b9699a/gather/0.log"
Apr 22 19:53:18.031018 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:18.030983 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-hcbdv_029be2e7-8cf1-404e-bf0d-59ccb446ec17/global-pull-secret-syncer/0.log"
Apr 22 19:53:18.104168 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:18.104139 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-ns824_e586a32a-1d89-4ae4-a0b6-0667215a50e4/konnectivity-agent/0.log"
Apr 22 19:53:18.172585 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:18.172557 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-22.ec2.internal_1b90ee820fd4186f1e6cd40d24ef3276/haproxy/0.log"
Apr 22 19:53:19.940438 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:19.940403 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-sk59w/must-gather-cl96l"]
Apr 22 19:53:19.940846 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:19.940632 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-sk59w/must-gather-cl96l" podUID="bc905e5a-ff9b-407f-a50c-415eb5b9699a" containerName="copy" containerID="cri-o://9d11dc02abea5904b21a89904c369179d88fa6961a95a3441cf8468499df02e8" gracePeriod=2
Apr 22 19:53:19.945833 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:19.945806 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-sk59w/must-gather-cl96l"]
Apr 22 19:53:20.095573 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:20.095543 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-sk59w_must-gather-cl96l_bc905e5a-ff9b-407f-a50c-415eb5b9699a/copy/0.log"
Apr 22 19:53:20.095950 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:20.095920 2574 generic.go:358] "Generic (PLEG): container finished" podID="bc905e5a-ff9b-407f-a50c-415eb5b9699a" containerID="9d11dc02abea5904b21a89904c369179d88fa6961a95a3441cf8468499df02e8" exitCode=143
Apr 22 19:53:20.173635 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:20.173612 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-sk59w_must-gather-cl96l_bc905e5a-ff9b-407f-a50c-415eb5b9699a/copy/0.log"
Apr 22 19:53:20.173962 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:20.173945 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sk59w/must-gather-cl96l"
Apr 22 19:53:20.176213 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:20.176191 2574 status_manager.go:895] "Failed to get status for pod" podUID="bc905e5a-ff9b-407f-a50c-415eb5b9699a" pod="openshift-must-gather-sk59w/must-gather-cl96l" err="pods \"must-gather-cl96l\" is forbidden: User \"system:node:ip-10-0-134-22.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-sk59w\": no relationship found between node 'ip-10-0-134-22.ec2.internal' and this object"
Apr 22 19:53:20.215210 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:20.215148 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bc905e5a-ff9b-407f-a50c-415eb5b9699a-must-gather-output\") pod \"bc905e5a-ff9b-407f-a50c-415eb5b9699a\" (UID: \"bc905e5a-ff9b-407f-a50c-415eb5b9699a\") "
Apr 22 19:53:20.215210 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:20.215201 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llzkl\" (UniqueName: \"kubernetes.io/projected/bc905e5a-ff9b-407f-a50c-415eb5b9699a-kube-api-access-llzkl\") pod \"bc905e5a-ff9b-407f-a50c-415eb5b9699a\" (UID: \"bc905e5a-ff9b-407f-a50c-415eb5b9699a\") "
Apr 22 19:53:20.217369 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:20.217336 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc905e5a-ff9b-407f-a50c-415eb5b9699a-kube-api-access-llzkl" (OuterVolumeSpecName: "kube-api-access-llzkl") pod "bc905e5a-ff9b-407f-a50c-415eb5b9699a" (UID: "bc905e5a-ff9b-407f-a50c-415eb5b9699a"). InnerVolumeSpecName "kube-api-access-llzkl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:53:20.220946 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:20.220924 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc905e5a-ff9b-407f-a50c-415eb5b9699a-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "bc905e5a-ff9b-407f-a50c-415eb5b9699a" (UID: "bc905e5a-ff9b-407f-a50c-415eb5b9699a"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:53:20.316079 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:20.316041 2574 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bc905e5a-ff9b-407f-a50c-415eb5b9699a-must-gather-output\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\""
Apr 22 19:53:20.316079 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:20.316074 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-llzkl\" (UniqueName: \"kubernetes.io/projected/bc905e5a-ff9b-407f-a50c-415eb5b9699a-kube-api-access-llzkl\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\""
Apr 22 19:53:21.100610 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:21.100584 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-sk59w_must-gather-cl96l_bc905e5a-ff9b-407f-a50c-415eb5b9699a/copy/0.log"
Apr 22 19:53:21.101022 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:21.100996 2574 scope.go:117] "RemoveContainer" containerID="9d11dc02abea5904b21a89904c369179d88fa6961a95a3441cf8468499df02e8"
Apr 22 19:53:21.101073 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:21.100997 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sk59w/must-gather-cl96l"
Apr 22 19:53:21.103351 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:21.103321 2574 status_manager.go:895] "Failed to get status for pod" podUID="bc905e5a-ff9b-407f-a50c-415eb5b9699a" pod="openshift-must-gather-sk59w/must-gather-cl96l" err="pods \"must-gather-cl96l\" is forbidden: User \"system:node:ip-10-0-134-22.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-sk59w\": no relationship found between node 'ip-10-0-134-22.ec2.internal' and this object"
Apr 22 19:53:21.109484 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:21.109202 2574 scope.go:117] "RemoveContainer" containerID="34c08d63ce29dd16461bf5f2444f69fc07ac141a792000cd6cc87aad9bca944f"
Apr 22 19:53:21.111998 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:21.111972 2574 status_manager.go:895] "Failed to get status for pod" podUID="bc905e5a-ff9b-407f-a50c-415eb5b9699a" pod="openshift-must-gather-sk59w/must-gather-cl96l" err="pods \"must-gather-cl96l\" is forbidden: User \"system:node:ip-10-0-134-22.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-sk59w\": no relationship found between node 'ip-10-0-134-22.ec2.internal' and this object"
Apr 22 19:53:21.349498 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:21.349462 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc905e5a-ff9b-407f-a50c-415eb5b9699a" path="/var/lib/kubelet/pods/bc905e5a-ff9b-407f-a50c-415eb5b9699a/volumes"
Apr 22 19:53:22.114096 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:22.114064 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-m2zrb_d0950e55-6d4a-44e3-9bcb-cf034b753b22/manager/0.log"
Apr 22 19:53:22.186076 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:22.186032 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-b7dtm_4ad04b3e-845a-4da0-a28f-4ca4e954f9ac/manager/0.log"
Apr 22 19:53:23.387062 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:23.387031 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-9bw78_f8634c7a-287f-4afc-8a9d-55a3e85c0c45/kube-state-metrics/0.log"
Apr 22 19:53:23.410496 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:23.410470 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-9bw78_f8634c7a-287f-4afc-8a9d-55a3e85c0c45/kube-rbac-proxy-main/0.log"
Apr 22 19:53:23.429830 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:23.429775 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-9bw78_f8634c7a-287f-4afc-8a9d-55a3e85c0c45/kube-rbac-proxy-self/0.log"
Apr 22 19:53:23.590124 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:23.590099 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-mcwhq_aae64ea5-a201-4205-b675-7f24942d6b45/node-exporter/0.log"
Apr 22 19:53:23.607946 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:23.607919 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-mcwhq_aae64ea5-a201-4205-b675-7f24942d6b45/kube-rbac-proxy/0.log"
Apr 22 19:53:23.627192 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:23.627164 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-mcwhq_aae64ea5-a201-4205-b675-7f24942d6b45/init-textfile/0.log"
Apr 22 19:53:23.789294 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:23.789269 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a9e780ce-144b-4e25-9172-673e7dc43f69/prometheus/0.log"
Apr 22 19:53:23.804731 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:23.804700 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a9e780ce-144b-4e25-9172-673e7dc43f69/config-reloader/0.log"
Apr 22 19:53:23.824852 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:23.824828 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a9e780ce-144b-4e25-9172-673e7dc43f69/thanos-sidecar/0.log"
Apr 22 19:53:23.843420 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:23.843356 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a9e780ce-144b-4e25-9172-673e7dc43f69/kube-rbac-proxy-web/0.log"
Apr 22 19:53:23.861354 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:23.861326 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a9e780ce-144b-4e25-9172-673e7dc43f69/kube-rbac-proxy/0.log"
Apr 22 19:53:23.882273 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:23.882246 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a9e780ce-144b-4e25-9172-673e7dc43f69/kube-rbac-proxy-thanos/0.log"
Apr 22 19:53:23.904686 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:23.904654 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a9e780ce-144b-4e25-9172-673e7dc43f69/init-config-reloader/0.log"
Apr 22 19:53:23.933697 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:23.933667 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-ppqd5_172d410c-869f-4053-99a6-d41927b1b8e3/prometheus-operator/0.log"
Apr 22 19:53:23.955232 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:23.955204 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-ppqd5_172d410c-869f-4053-99a6-d41927b1b8e3/kube-rbac-proxy/0.log"
Apr 22 19:53:23.983030 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:23.982999 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-77g4t_3b1b56ad-1a25-404b-bb00-1eaca8e5e50b/prometheus-operator-admission-webhook/0.log"
Apr 22 19:53:24.078844 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:24.078814 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d78bbfc6-lrjq2_71de3a3d-9050-4664-b74b-e79fb26b7478/thanos-query/0.log"
Apr 22 19:53:24.109232 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:24.109125 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d78bbfc6-lrjq2_71de3a3d-9050-4664-b74b-e79fb26b7478/kube-rbac-proxy-web/0.log"
Apr 22 19:53:24.128407 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:24.128376 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d78bbfc6-lrjq2_71de3a3d-9050-4664-b74b-e79fb26b7478/kube-rbac-proxy/0.log"
Apr 22 19:53:24.149006 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:24.148973 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d78bbfc6-lrjq2_71de3a3d-9050-4664-b74b-e79fb26b7478/prom-label-proxy/0.log"
Apr 22 19:53:24.171582 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:24.171549 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d78bbfc6-lrjq2_71de3a3d-9050-4664-b74b-e79fb26b7478/kube-rbac-proxy-rules/0.log"
Apr 22 19:53:24.202463 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:24.202441 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d78bbfc6-lrjq2_71de3a3d-9050-4664-b74b-e79fb26b7478/kube-rbac-proxy-metrics/0.log"
Apr 22 19:53:25.539389 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:25.539360 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-2cwpp_76e16566-d9dd-4f0b-9cc5-a72e27e4518a/networking-console-plugin/0.log"
Apr 22 19:53:26.850875 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:26.850841 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-x642z/perf-node-gather-daemonset-btv6j"]
Apr 22 19:53:26.851256 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:26.851180 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc905e5a-ff9b-407f-a50c-415eb5b9699a" containerName="gather"
Apr 22 19:53:26.851256 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:26.851190 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc905e5a-ff9b-407f-a50c-415eb5b9699a" containerName="gather"
Apr 22 19:53:26.851256 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:26.851208 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc905e5a-ff9b-407f-a50c-415eb5b9699a" containerName="copy"
Apr 22 19:53:26.851256 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:26.851213 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc905e5a-ff9b-407f-a50c-415eb5b9699a" containerName="copy"
Apr 22 19:53:26.851385 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:26.851281 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc905e5a-ff9b-407f-a50c-415eb5b9699a" containerName="gather"
Apr 22 19:53:26.851385 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:26.851290 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc905e5a-ff9b-407f-a50c-415eb5b9699a" containerName="copy"
Apr 22 19:53:26.857860 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:26.857837 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x642z/perf-node-gather-daemonset-btv6j"
Apr 22 19:53:26.860506 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:26.860452 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-x642z\"/\"default-dockercfg-6r5dj\""
Apr 22 19:53:26.860711 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:26.860682 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-x642z\"/\"openshift-service-ca.crt\""
Apr 22 19:53:26.861607 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:26.861580 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-x642z\"/\"kube-root-ca.crt\""
Apr 22 19:53:26.863691 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:26.863666 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x642z/perf-node-gather-daemonset-btv6j"]
Apr 22 19:53:26.977830 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:26.977769 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4ab0a0ed-09ab-4a18-b647-5b6f9eed59ff-sys\") pod \"perf-node-gather-daemonset-btv6j\" (UID: \"4ab0a0ed-09ab-4a18-b647-5b6f9eed59ff\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-btv6j"
Apr 22 19:53:26.977830 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:26.977832 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4ab0a0ed-09ab-4a18-b647-5b6f9eed59ff-proc\") pod \"perf-node-gather-daemonset-btv6j\" (UID: \"4ab0a0ed-09ab-4a18-b647-5b6f9eed59ff\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-btv6j"
Apr 22 19:53:26.978081 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:26.977958 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4ab0a0ed-09ab-4a18-b647-5b6f9eed59ff-lib-modules\") pod \"perf-node-gather-daemonset-btv6j\" (UID: \"4ab0a0ed-09ab-4a18-b647-5b6f9eed59ff\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-btv6j"
Apr 22 19:53:26.978081 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:26.977989 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4ab0a0ed-09ab-4a18-b647-5b6f9eed59ff-podres\") pod \"perf-node-gather-daemonset-btv6j\" (UID: \"4ab0a0ed-09ab-4a18-b647-5b6f9eed59ff\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-btv6j"
Apr 22 19:53:26.978081 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:26.978015 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9sfn\" (UniqueName: \"kubernetes.io/projected/4ab0a0ed-09ab-4a18-b647-5b6f9eed59ff-kube-api-access-v9sfn\") pod \"perf-node-gather-daemonset-btv6j\" (UID: \"4ab0a0ed-09ab-4a18-b647-5b6f9eed59ff\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-btv6j"
Apr 22 19:53:27.078772 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:27.078730 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4ab0a0ed-09ab-4a18-b647-5b6f9eed59ff-lib-modules\") pod \"perf-node-gather-daemonset-btv6j\" (UID: \"4ab0a0ed-09ab-4a18-b647-5b6f9eed59ff\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-btv6j"
Apr 22 19:53:27.078772 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:27.078773 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4ab0a0ed-09ab-4a18-b647-5b6f9eed59ff-podres\") pod \"perf-node-gather-daemonset-btv6j\" (UID: \"4ab0a0ed-09ab-4a18-b647-5b6f9eed59ff\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-btv6j"
Apr 22 19:53:27.079002 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:27.078817 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v9sfn\" (UniqueName: \"kubernetes.io/projected/4ab0a0ed-09ab-4a18-b647-5b6f9eed59ff-kube-api-access-v9sfn\") pod \"perf-node-gather-daemonset-btv6j\" (UID: \"4ab0a0ed-09ab-4a18-b647-5b6f9eed59ff\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-btv6j"
Apr 22 19:53:27.079002 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:27.078844 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4ab0a0ed-09ab-4a18-b647-5b6f9eed59ff-sys\") pod \"perf-node-gather-daemonset-btv6j\" (UID: \"4ab0a0ed-09ab-4a18-b647-5b6f9eed59ff\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-btv6j"
Apr 22 19:53:27.079002 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:27.078861 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4ab0a0ed-09ab-4a18-b647-5b6f9eed59ff-proc\") pod \"perf-node-gather-daemonset-btv6j\" (UID: \"4ab0a0ed-09ab-4a18-b647-5b6f9eed59ff\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-btv6j"
Apr 22 19:53:27.079002 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:27.078939 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4ab0a0ed-09ab-4a18-b647-5b6f9eed59ff-lib-modules\") pod \"perf-node-gather-daemonset-btv6j\" (UID: \"4ab0a0ed-09ab-4a18-b647-5b6f9eed59ff\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-btv6j"
Apr 22 19:53:27.079002 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:27.078947 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4ab0a0ed-09ab-4a18-b647-5b6f9eed59ff-sys\") pod \"perf-node-gather-daemonset-btv6j\" (UID: \"4ab0a0ed-09ab-4a18-b647-5b6f9eed59ff\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-btv6j"
Apr 22 19:53:27.079002 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:27.078987 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4ab0a0ed-09ab-4a18-b647-5b6f9eed59ff-podres\") pod \"perf-node-gather-daemonset-btv6j\" (UID: \"4ab0a0ed-09ab-4a18-b647-5b6f9eed59ff\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-btv6j"
Apr 22 19:53:27.079188 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:27.079003 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4ab0a0ed-09ab-4a18-b647-5b6f9eed59ff-proc\") pod \"perf-node-gather-daemonset-btv6j\" (UID: \"4ab0a0ed-09ab-4a18-b647-5b6f9eed59ff\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-btv6j"
Apr 22 19:53:27.086954 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:27.086929 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9sfn\" (UniqueName: \"kubernetes.io/projected/4ab0a0ed-09ab-4a18-b647-5b6f9eed59ff-kube-api-access-v9sfn\") pod \"perf-node-gather-daemonset-btv6j\" (UID: \"4ab0a0ed-09ab-4a18-b647-5b6f9eed59ff\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-btv6j"
Apr 22 19:53:27.169309 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:27.169211 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x642z/perf-node-gather-daemonset-btv6j"
Apr 22 19:53:27.290771 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:27.290738 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x642z/perf-node-gather-daemonset-btv6j"]
Apr 22 19:53:27.293299 ip-10-0-134-22 kubenswrapper[2574]: W0422 19:53:27.293266 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4ab0a0ed_09ab_4a18_b647_5b6f9eed59ff.slice/crio-3e832793aa1251463218680e5910aa3fac7ff441efa10129b1143fcd8b6099b2 WatchSource:0}: Error finding container 3e832793aa1251463218680e5910aa3fac7ff441efa10129b1143fcd8b6099b2: Status 404 returned error can't find the container with id 3e832793aa1251463218680e5910aa3fac7ff441efa10129b1143fcd8b6099b2
Apr 22 19:53:27.761165 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:27.761135 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lnt5s_be5b5131-1460-4746-96a8-4720ed712cf1/dns/0.log"
Apr 22 19:53:27.783192 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:27.783156 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lnt5s_be5b5131-1460-4746-96a8-4720ed712cf1/kube-rbac-proxy/0.log"
Apr 22 19:53:27.932386 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:27.932359 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zsm2s_e794b825-b003-4cc7-9af6-8dd829fbea84/dns-node-resolver/0.log"
Apr 22 19:53:28.128481 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:28.128387 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x642z/perf-node-gather-daemonset-btv6j" event={"ID":"4ab0a0ed-09ab-4a18-b647-5b6f9eed59ff","Type":"ContainerStarted","Data":"bde8a3656262fe74bd74c5efd272d846c66f660bfab0e36f890ee2d605f2c592"}
Apr 22 19:53:28.128481 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:28.128421 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x642z/perf-node-gather-daemonset-btv6j" event={"ID":"4ab0a0ed-09ab-4a18-b647-5b6f9eed59ff","Type":"ContainerStarted","Data":"3e832793aa1251463218680e5910aa3fac7ff441efa10129b1143fcd8b6099b2"}
Apr 22 19:53:28.128703 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:28.128487 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-x642z/perf-node-gather-daemonset-btv6j"
Apr 22 19:53:28.145993 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:28.145939 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-x642z/perf-node-gather-daemonset-btv6j" podStartSLOduration=2.145923372 podStartE2EDuration="2.145923372s" podCreationTimestamp="2026-04-22 19:53:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:53:28.145092662 +0000 UTC m=+1785.428763136" watchObservedRunningTime="2026-04-22 19:53:28.145923372 +0000 UTC m=+1785.429593844"
Apr 22 19:53:28.401268 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:28.401162 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-95pbb_fb2a25eb-d7fc-4fe6-a965-be2f517c03ab/node-ca/0.log"
Apr 22 19:53:29.222165 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:29.222134 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-ndcbr_f9b091e1-c5c0-4ad1-8ee8-8cb03bd1318e/discovery/0.log"
Apr 22 19:53:29.705189 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:29.705159 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-qsnbz_8207bbac-ad34-4f98-a8af-1a2daaa6ea59/serve-healthcheck-canary/0.log"
Apr 22 19:53:30.182436 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:30.182405 2574 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-insights_insights-runtime-extractor-8w2jt_d5544954-5800-4b59-b1a1-0e15d44a7aeb/kube-rbac-proxy/0.log" Apr 22 19:53:30.200427 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:30.200403 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8w2jt_d5544954-5800-4b59-b1a1-0e15d44a7aeb/exporter/0.log" Apr 22 19:53:30.219149 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:30.219120 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8w2jt_d5544954-5800-4b59-b1a1-0e15d44a7aeb/extractor/0.log" Apr 22 19:53:32.793551 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:32.793519 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-98c76994c-dhrwk_20ffec41-b93d-40f8-8aba-ee5a501bedc9/manager/0.log" Apr 22 19:53:34.142306 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:34.142279 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-x642z/perf-node-gather-daemonset-btv6j" Apr 22 19:53:38.370071 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:38.370043 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-bwzd9_73b8fc2e-2383-453a-813a-8253bb341485/migrator/0.log" Apr 22 19:53:38.387858 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:38.387819 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-bwzd9_73b8fc2e-2383-453a-813a-8253bb341485/graceful-termination/0.log" Apr 22 19:53:39.880940 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:39.880913 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z6hh4_20682e94-9131-4a8d-a325-b9f45d2fd64f/kube-multus-additional-cni-plugins/0.log" Apr 22 19:53:39.902656 ip-10-0-134-22 kubenswrapper[2574]: I0422 
19:53:39.902634 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z6hh4_20682e94-9131-4a8d-a325-b9f45d2fd64f/egress-router-binary-copy/0.log" Apr 22 19:53:39.921317 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:39.921295 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z6hh4_20682e94-9131-4a8d-a325-b9f45d2fd64f/cni-plugins/0.log" Apr 22 19:53:39.942002 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:39.941975 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z6hh4_20682e94-9131-4a8d-a325-b9f45d2fd64f/bond-cni-plugin/0.log" Apr 22 19:53:39.962343 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:39.962320 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z6hh4_20682e94-9131-4a8d-a325-b9f45d2fd64f/routeoverride-cni/0.log" Apr 22 19:53:39.982445 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:39.982425 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z6hh4_20682e94-9131-4a8d-a325-b9f45d2fd64f/whereabouts-cni-bincopy/0.log" Apr 22 19:53:40.001207 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:40.001187 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z6hh4_20682e94-9131-4a8d-a325-b9f45d2fd64f/whereabouts-cni/0.log" Apr 22 19:53:40.030573 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:40.030539 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gkjzf_68be7f81-7c86-4929-92a2-0347981c9140/kube-multus/0.log" Apr 22 19:53:40.148123 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:40.148102 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-mdzdp_93109160-9bbe-497f-9b25-d7fa7e08508f/network-metrics-daemon/0.log" Apr 22 
19:53:40.166814 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:40.166771 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-mdzdp_93109160-9bbe-497f-9b25-d7fa7e08508f/kube-rbac-proxy/0.log" Apr 22 19:53:41.613284 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:41.613255 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqkn7_06227928-4a6e-4e0e-b991-1f9a395b21c4/ovn-controller/0.log" Apr 22 19:53:41.637685 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:41.637656 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqkn7_06227928-4a6e-4e0e-b991-1f9a395b21c4/ovn-acl-logging/0.log" Apr 22 19:53:41.654164 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:41.654140 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqkn7_06227928-4a6e-4e0e-b991-1f9a395b21c4/kube-rbac-proxy-node/0.log" Apr 22 19:53:41.673651 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:41.673621 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqkn7_06227928-4a6e-4e0e-b991-1f9a395b21c4/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 19:53:41.689312 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:41.689287 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqkn7_06227928-4a6e-4e0e-b991-1f9a395b21c4/northd/0.log" Apr 22 19:53:41.708492 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:41.708471 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqkn7_06227928-4a6e-4e0e-b991-1f9a395b21c4/nbdb/0.log" Apr 22 19:53:41.727715 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:41.727694 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqkn7_06227928-4a6e-4e0e-b991-1f9a395b21c4/sbdb/0.log" Apr 22 19:53:41.818074 
ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:41.818045 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqkn7_06227928-4a6e-4e0e-b991-1f9a395b21c4/ovnkube-controller/0.log" Apr 22 19:53:42.834374 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:42.834347 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-nms75_cb58056c-3091-4679-8665-a73d6668e604/check-endpoints/0.log" Apr 22 19:53:42.855033 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:42.855005 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-jzltp_9bbee64a-2154-4ea2-9299-c15d3614e769/network-check-target-container/0.log" Apr 22 19:53:43.809397 ip-10-0-134-22 kubenswrapper[2574]: I0422 19:53:43.809365 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-284wh_b9527658-a6b8-4270-ae5d-f451e61ca79f/iptables-alerter/0.log"