Apr 20 15:02:22.512646 ip-10-0-133-198 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 15:02:22.935347 ip-10-0-133-198 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 15:02:22.935347 ip-10-0-133-198 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 15:02:22.935347 ip-10-0-133-198 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 15:02:22.935347 ip-10-0-133-198 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 15:02:22.935347 ip-10-0-133-198 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 15:02:22.937503 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.937411 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 15:02:22.941302 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941282 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 15:02:22.941302 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941302 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 15:02:22.941375 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941306 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 15:02:22.941375 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941310 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 15:02:22.941375 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941313 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 15:02:22.941375 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941316 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 15:02:22.941375 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941319 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 15:02:22.941375 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941322 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 15:02:22.941375 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941326 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 15:02:22.941375 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941328 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 15:02:22.941375 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941333 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 15:02:22.941375 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941335 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 15:02:22.941375 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941339 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 15:02:22.941375 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941341 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 15:02:22.941375 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941344 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 15:02:22.941375 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941347 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 15:02:22.941375 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941350 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 15:02:22.941375 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941352 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 15:02:22.941375 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941355 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 15:02:22.941375 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941358 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 15:02:22.941375 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941360 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 15:02:22.941375 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941363 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 15:02:22.941888 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941366 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 15:02:22.941888 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941368 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 15:02:22.941888 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941371 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 15:02:22.941888 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941394 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 15:02:22.941888 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941399 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 15:02:22.941888 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941402 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 15:02:22.941888 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941405 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 15:02:22.941888 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941408 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 15:02:22.941888 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941411 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 15:02:22.941888 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941414 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 15:02:22.941888 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941417 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 15:02:22.941888 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941420 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 15:02:22.941888 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941423 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 15:02:22.941888 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941426 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 15:02:22.941888 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941428 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 15:02:22.941888 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941431 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 15:02:22.941888 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941434 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 15:02:22.941888 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941438 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 15:02:22.941888 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941440 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 15:02:22.941888 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941443 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 15:02:22.942403 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941447 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 15:02:22.942403 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941451 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 15:02:22.942403 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941454 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 15:02:22.942403 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941457 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 15:02:22.942403 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941460 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 15:02:22.942403 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941463 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 15:02:22.942403 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941465 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 15:02:22.942403 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941468 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 15:02:22.942403 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941471 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 15:02:22.942403 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941474 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 15:02:22.942403 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941477 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 15:02:22.942403 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941479 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 15:02:22.942403 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941482 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 15:02:22.942403 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941485 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 15:02:22.942403 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941488 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 15:02:22.942403 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941490 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 15:02:22.942403 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941493 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 15:02:22.942403 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941496 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 15:02:22.942403 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941498 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 15:02:22.942870 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941501 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 15:02:22.942870 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941503 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 15:02:22.942870 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941506 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 15:02:22.942870 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941509 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 15:02:22.942870 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941512 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 15:02:22.942870 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941515 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 15:02:22.942870 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941517 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 15:02:22.942870 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941520 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 15:02:22.942870 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941522 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 15:02:22.942870 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941526 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 15:02:22.942870 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941545 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 15:02:22.942870 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941550 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 15:02:22.942870 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941553 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 15:02:22.942870 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941556 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 15:02:22.942870 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941559 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 15:02:22.942870 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941561 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 15:02:22.942870 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941565 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 15:02:22.942870 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941568 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 15:02:22.942870 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941570 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 15:02:22.943352 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941573 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 15:02:22.943352 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941576 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 15:02:22.943352 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941579 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 15:02:22.943352 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941581 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 15:02:22.943352 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941584 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 15:02:22.943352 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.941587 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 15:02:22.943352 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942601 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 15:02:22.943352 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942608 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 15:02:22.943352 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942612 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 15:02:22.943352 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942615 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 15:02:22.943352 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942619 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 15:02:22.943352 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942622 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 15:02:22.943352 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942625 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 15:02:22.943352 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942628 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 15:02:22.943352 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942631 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 15:02:22.943352 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942634 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 15:02:22.943352 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942637 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 15:02:22.943352 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942640 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 15:02:22.943352 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942643 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 15:02:22.943352 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942645 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 15:02:22.943849 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942648 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 15:02:22.943849 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942650 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 15:02:22.943849 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942653 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 15:02:22.943849 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942656 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 15:02:22.943849 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942659 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 15:02:22.943849 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942662 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 15:02:22.943849 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942664 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 15:02:22.943849 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942667 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 15:02:22.943849 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942669 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 15:02:22.943849 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942672 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 15:02:22.943849 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942675 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 15:02:22.943849 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942677 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 15:02:22.943849 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942680 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 15:02:22.943849 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942683 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 15:02:22.943849 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942685 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 15:02:22.943849 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942688 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 15:02:22.943849 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942690 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 15:02:22.943849 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942693 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 15:02:22.943849 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942696 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 15:02:22.943849 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942698 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 15:02:22.944380 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942700 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 15:02:22.944380 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942703 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 15:02:22.944380 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942706 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 15:02:22.944380 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942708 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 15:02:22.944380 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942711 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 15:02:22.944380 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942713 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 15:02:22.944380 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942716 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 15:02:22.944380 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942719 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 15:02:22.944380 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942724 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 15:02:22.944380 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942727 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 15:02:22.944380 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942730 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 15:02:22.944380 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942732 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 15:02:22.944380 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942735 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 15:02:22.944380 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942738 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 15:02:22.944380 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942740 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 15:02:22.944380 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942744 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 15:02:22.944380 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942746 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 15:02:22.944380 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942749 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 15:02:22.944380 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942752 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 15:02:22.944380 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942754 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 15:02:22.944876 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942757 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 15:02:22.944876 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942759 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 15:02:22.944876 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942762 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 15:02:22.944876 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942764 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 15:02:22.944876 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942767 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 15:02:22.944876 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942769 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 15:02:22.944876 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942773 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 15:02:22.944876 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942776 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 15:02:22.944876 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942779 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 15:02:22.944876 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942782 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 15:02:22.944876 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942785 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 15:02:22.944876 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942787 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 15:02:22.944876 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942789 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 15:02:22.944876 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942792 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 15:02:22.944876 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942794 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 15:02:22.944876 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942797 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 15:02:22.944876 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942801 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 15:02:22.944876 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942805 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 15:02:22.944876 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942809 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 15:02:22.945365 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942812 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 15:02:22.945365 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942815 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 15:02:22.945365 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942818 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 15:02:22.945365 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942820 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 15:02:22.945365 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942823 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 15:02:22.945365 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942826 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 15:02:22.945365 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942829 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 15:02:22.945365 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942832 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 15:02:22.945365 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942834 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 15:02:22.945365 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942837 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 15:02:22.945365 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942840 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 15:02:22.945365 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942842 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 15:02:22.945365 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.942845 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 15:02:22.945365 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.942916 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 15:02:22.945365 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.942924 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 15:02:22.945365 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.942931 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 15:02:22.945365 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.942935 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 15:02:22.945365 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.942940 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 15:02:22.945365 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.942951 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 15:02:22.945365 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.942957 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 15:02:22.945365 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.942962 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 15:02:22.945891 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.942965 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 15:02:22.945891 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.942968 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 15:02:22.945891 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.942972 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 15:02:22.945891 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.942975 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 15:02:22.945891 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.942979 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 15:02:22.945891 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.942982 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 20 15:02:22.945891 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.942985 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 15:02:22.945891 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.942988 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 20 15:02:22.945891 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.942991 2575 flags.go:64] FLAG: --cloud-config=""
Apr 20 15:02:22.945891 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.942994 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 15:02:22.945891 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.942997 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 15:02:22.945891 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943003 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 20 15:02:22.945891 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943006 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 15:02:22.945891 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943009 2575 flags.go:64] FLAG: --config-dir=""
Apr 20 15:02:22.945891 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943012 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 15:02:22.945891 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943017 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 15:02:22.945891 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943021 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 15:02:22.945891 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943024 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 15:02:22.945891 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943027 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 15:02:22.945891 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943031 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 15:02:22.945891 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943034 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 15:02:22.945891 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943037 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 15:02:22.945891 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943041 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 15:02:22.945891 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943044 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 15:02:22.945891 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943047 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 15:02:22.946535 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943051 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 15:02:22.946535 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943054 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 15:02:22.946535 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943057 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 15:02:22.946535 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943060 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 15:02:22.946535 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943063 2575 flags.go:64] FLAG: --enable-server="true"
Apr 20 15:02:22.946535 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943066 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 15:02:22.946535 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943071 2575 flags.go:64] FLAG: --event-burst="100"
Apr 20 15:02:22.946535 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943074 2575 flags.go:64] FLAG: --event-qps="50"
Apr 20 15:02:22.946535 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943077 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 15:02:22.946535 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943080 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 15:02:22.946535 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943083 2575 flags.go:64] FLAG: --eviction-hard=""
Apr 20 15:02:22.946535 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943087 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 15:02:22.946535 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943090 2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 15:02:22.946535 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943093 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 15:02:22.946535 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943096 2575 flags.go:64] FLAG: --eviction-soft=""
Apr 20 15:02:22.946535 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943099 2575 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 15:02:22.946535 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943102 2575 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 15:02:22.946535 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943105 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 15:02:22.946535 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943108 2575 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 15:02:22.946535 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943112 2575 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr
20 15:02:22.946535 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943115 2575 flags.go:64] FLAG: --fail-swap-on="true" Apr 20 15:02:22.946535 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943118 2575 flags.go:64] FLAG: --feature-gates="" Apr 20 15:02:22.946535 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943122 2575 flags.go:64] FLAG: --file-check-frequency="20s" Apr 20 15:02:22.946535 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943126 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 20 15:02:22.946535 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943129 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 20 15:02:22.947153 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943132 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 20 15:02:22.947153 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943135 2575 flags.go:64] FLAG: --healthz-port="10248" Apr 20 15:02:22.947153 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943138 2575 flags.go:64] FLAG: --help="false" Apr 20 15:02:22.947153 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943142 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-133-198.ec2.internal" Apr 20 15:02:22.947153 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943145 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 15:02:22.947153 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943148 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 15:02:22.947153 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943150 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 15:02:22.947153 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943154 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 15:02:22.947153 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943157 2575 flags.go:64] 
FLAG: --image-gc-high-threshold="85" Apr 20 15:02:22.947153 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943160 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 15:02:22.947153 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943163 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 15:02:22.947153 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943165 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 15:02:22.947153 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943168 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 15:02:22.947153 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943171 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 15:02:22.947153 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943175 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 15:02:22.947153 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943177 2575 flags.go:64] FLAG: --kube-reserved="" Apr 20 15:02:22.947153 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943180 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 15:02:22.947153 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943183 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 15:02:22.947153 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943186 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 15:02:22.947153 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943189 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 15:02:22.947153 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943192 2575 flags.go:64] FLAG: --lock-file="" Apr 20 15:02:22.947153 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943195 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 15:02:22.947153 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943198 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 15:02:22.947153 ip-10-0-133-198 
kubenswrapper[2575]: I0420 15:02:22.943201 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 15:02:22.947790 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943206 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 15:02:22.947790 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943209 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 15:02:22.947790 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943212 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 15:02:22.947790 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943215 2575 flags.go:64] FLAG: --logging-format="text" Apr 20 15:02:22.947790 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943219 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 15:02:22.947790 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943222 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 15:02:22.947790 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943225 2575 flags.go:64] FLAG: --manifest-url="" Apr 20 15:02:22.947790 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943228 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 20 15:02:22.947790 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943233 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 15:02:22.947790 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943236 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 15:02:22.947790 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943240 2575 flags.go:64] FLAG: --max-pods="110" Apr 20 15:02:22.947790 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943243 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 15:02:22.947790 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943246 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 15:02:22.947790 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943249 2575 flags.go:64] FLAG: 
--memory-manager-policy="None" Apr 20 15:02:22.947790 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943253 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 15:02:22.947790 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943256 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 15:02:22.947790 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943259 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 15:02:22.947790 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943261 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 15:02:22.947790 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943283 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 15:02:22.947790 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943286 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 15:02:22.947790 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943289 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 15:02:22.947790 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943293 2575 flags.go:64] FLAG: --pod-cidr="" Apr 20 15:02:22.947790 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943296 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 15:02:22.948379 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943302 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 15:02:22.948379 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943310 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 15:02:22.948379 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943314 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 20 15:02:22.948379 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943316 2575 flags.go:64] FLAG: --port="10250" Apr 20 15:02:22.948379 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943320 2575 
flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 15:02:22.948379 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943322 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0fe2b4f004bdafeb1" Apr 20 15:02:22.948379 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943326 2575 flags.go:64] FLAG: --qos-reserved="" Apr 20 15:02:22.948379 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943329 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 20 15:02:22.948379 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943332 2575 flags.go:64] FLAG: --register-node="true" Apr 20 15:02:22.948379 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943335 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 20 15:02:22.948379 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943338 2575 flags.go:64] FLAG: --register-with-taints="" Apr 20 15:02:22.948379 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943341 2575 flags.go:64] FLAG: --registry-burst="10" Apr 20 15:02:22.948379 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943345 2575 flags.go:64] FLAG: --registry-qps="5" Apr 20 15:02:22.948379 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943348 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 20 15:02:22.948379 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943350 2575 flags.go:64] FLAG: --reserved-memory="" Apr 20 15:02:22.948379 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943354 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 15:02:22.948379 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943358 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 15:02:22.948379 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943361 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 15:02:22.948379 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943364 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 15:02:22.948379 ip-10-0-133-198 kubenswrapper[2575]: I0420 
15:02:22.943367 2575 flags.go:64] FLAG: --runonce="false" Apr 20 15:02:22.948379 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943370 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 15:02:22.948379 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943373 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 15:02:22.948379 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943376 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 20 15:02:22.948379 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943379 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 15:02:22.948379 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943381 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 15:02:22.948379 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943385 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 15:02:22.949037 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943387 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 15:02:22.949037 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943391 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 15:02:22.949037 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943393 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 15:02:22.949037 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943396 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 15:02:22.949037 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943399 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 15:02:22.949037 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943402 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 15:02:22.949037 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943405 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 15:02:22.949037 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943408 2575 flags.go:64] FLAG: 
--system-cgroups="" Apr 20 15:02:22.949037 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943411 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 15:02:22.949037 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943417 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 15:02:22.949037 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943419 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 20 15:02:22.949037 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943423 2575 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 15:02:22.949037 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943427 2575 flags.go:64] FLAG: --tls-min-version="" Apr 20 15:02:22.949037 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943430 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 15:02:22.949037 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943433 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 15:02:22.949037 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943436 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 15:02:22.949037 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943438 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 15:02:22.949037 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943441 2575 flags.go:64] FLAG: --v="2" Apr 20 15:02:22.949037 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943453 2575 flags.go:64] FLAG: --version="false" Apr 20 15:02:22.949037 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943457 2575 flags.go:64] FLAG: --vmodule="" Apr 20 15:02:22.949037 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943461 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 15:02:22.949037 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.943464 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 15:02:22.949037 ip-10-0-133-198 kubenswrapper[2575]: W0420 
15:02:22.943555 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 15:02:22.949037 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943559 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 15:02:22.949662 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943562 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 15:02:22.949662 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943565 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 15:02:22.949662 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943568 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 20 15:02:22.949662 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943570 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 15:02:22.949662 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943573 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 15:02:22.949662 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943576 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 15:02:22.949662 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943579 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 15:02:22.949662 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943582 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 15:02:22.949662 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943585 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 15:02:22.949662 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943588 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 15:02:22.949662 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943590 2575 feature_gate.go:328] unrecognized feature gate: 
AdminNetworkPolicy Apr 20 15:02:22.949662 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943593 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 15:02:22.949662 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943596 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 15:02:22.949662 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943599 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 15:02:22.949662 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943601 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 15:02:22.949662 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943604 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 15:02:22.949662 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943607 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 15:02:22.949662 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943610 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 15:02:22.949662 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943612 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 15:02:22.950326 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943615 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 15:02:22.950326 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943618 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 15:02:22.950326 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943620 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 15:02:22.950326 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943623 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 15:02:22.950326 ip-10-0-133-198 
kubenswrapper[2575]: W0420 15:02:22.943625 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 15:02:22.950326 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943628 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 15:02:22.950326 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943630 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 15:02:22.950326 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943635 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 15:02:22.950326 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943637 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 15:02:22.950326 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943640 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 15:02:22.950326 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943642 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 15:02:22.950326 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943645 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 15:02:22.950326 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943647 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 15:02:22.950326 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943650 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 15:02:22.950326 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943654 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 15:02:22.950326 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943658    2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 15:02:22.950326 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943661    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 15:02:22.950326 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943663    2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 15:02:22.950326 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943665    2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 15:02:22.951098 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943668    2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 15:02:22.951098 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943671    2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 15:02:22.951098 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943675    2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 15:02:22.951098 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943678    2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 15:02:22.951098 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943681    2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 15:02:22.951098 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943683    2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 15:02:22.951098 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943686    2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 15:02:22.951098 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943689    2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 15:02:22.951098 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943691    2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 15:02:22.951098 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943694    2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 15:02:22.951098 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943697    2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 15:02:22.951098 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943700    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 15:02:22.951098 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943703    2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 15:02:22.951098 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943705    2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 15:02:22.951098 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943708    2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 15:02:22.951098 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943710    2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 15:02:22.951098 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943713    2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 15:02:22.951098 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943715    2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 15:02:22.951098 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943718    2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 15:02:22.951098 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943720    2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 15:02:22.951832 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943724    2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 15:02:22.951832 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943727    2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 15:02:22.951832 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943729    2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 15:02:22.951832 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943732    2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 15:02:22.951832 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943734    2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 15:02:22.951832 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943737    2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 15:02:22.951832 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943739    2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 15:02:22.951832 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943742    2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 15:02:22.951832 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943745    2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 15:02:22.951832 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943747    2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 15:02:22.951832 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943750    2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 15:02:22.951832 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943752    2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 15:02:22.951832 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943755    2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 15:02:22.951832 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943757    2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 15:02:22.951832 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943760    2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 15:02:22.951832 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943762    2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 15:02:22.951832 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943765    2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 15:02:22.951832 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943767    2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 15:02:22.951832 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943770    2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 15:02:22.951832 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943772    2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 15:02:22.952381 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943775    2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 15:02:22.952381 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943778    2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 15:02:22.952381 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943781    2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 15:02:22.952381 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943784    2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 15:02:22.952381 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943786    2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 15:02:22.952381 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.943789    2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 15:02:22.952381 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.944366    2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 15:02:22.952568 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.952536    2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 15:02:22.952568 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.952554    2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 15:02:22.952624 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952606    2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 15:02:22.952624 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952613    2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 15:02:22.952624 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952618    2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 15:02:22.952624 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952623    2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 15:02:22.952624 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952627    2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 15:02:22.952755 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952630    2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 15:02:22.952755 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952634    2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 15:02:22.952755 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952637    2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 15:02:22.952755 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952641    2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 15:02:22.952755 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952643    2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 15:02:22.952755 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952646    2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 15:02:22.952755 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952649    2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 15:02:22.952755 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952651    2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 15:02:22.952755 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952654    2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 15:02:22.952755 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952656    2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 15:02:22.952755 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952660    2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 15:02:22.952755 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952662    2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 15:02:22.952755 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952665    2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 15:02:22.952755 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952667    2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 15:02:22.952755 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952670    2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 15:02:22.952755 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952672    2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 15:02:22.952755 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952675    2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 15:02:22.952755 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952677    2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 15:02:22.952755 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952680    2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 15:02:22.952755 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952683    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 15:02:22.953287 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952686    2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 15:02:22.953287 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952689    2575 
feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 15:02:22.953287 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952691 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 15:02:22.953287 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952694 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 15:02:22.953287 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952696 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 15:02:22.953287 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952699 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 15:02:22.953287 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952701 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 15:02:22.953287 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952704 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 15:02:22.953287 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952707 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 15:02:22.953287 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952710 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 15:02:22.953287 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952712 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 15:02:22.953287 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952715 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 15:02:22.953287 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952718 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 15:02:22.953287 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952721 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 15:02:22.953287 ip-10-0-133-198 
kubenswrapper[2575]: W0420 15:02:22.952724 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 15:02:22.953287 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952727 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 15:02:22.953287 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952729 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 15:02:22.953287 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952732 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 15:02:22.953287 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952734 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 15:02:22.953287 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952737 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 15:02:22.953797 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952739 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 15:02:22.953797 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952742 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 15:02:22.953797 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952745 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 15:02:22.953797 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952747 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 15:02:22.953797 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952750 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 15:02:22.953797 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952753 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 15:02:22.953797 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952756 2575 feature_gate.go:328] unrecognized feature gate: 
InsightsOnDemandDataGather Apr 20 15:02:22.953797 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952758 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 15:02:22.953797 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952761 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 15:02:22.953797 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952764 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 15:02:22.953797 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952767 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 15:02:22.953797 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952769 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 15:02:22.953797 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952772 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 15:02:22.953797 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952775 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 15:02:22.953797 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952778 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 15:02:22.953797 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952781 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 15:02:22.953797 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952784 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 15:02:22.953797 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952787 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 15:02:22.953797 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952789 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 15:02:22.953797 ip-10-0-133-198 kubenswrapper[2575]: W0420 
15:02:22.952792 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 15:02:22.954401 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952794 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 15:02:22.954401 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952797 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 15:02:22.954401 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952799 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 15:02:22.954401 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952802 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 15:02:22.954401 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952805 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 15:02:22.954401 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952807 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 15:02:22.954401 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952810 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 15:02:22.954401 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952813 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 15:02:22.954401 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952816 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 15:02:22.954401 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952818 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 15:02:22.954401 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952821 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 15:02:22.954401 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952832 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy 
Apr 20 15:02:22.954401 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952835 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 15:02:22.954401 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952838 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 15:02:22.954401 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952840 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 15:02:22.954401 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952843 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 15:02:22.954401 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952845 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 15:02:22.954401 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952848 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 15:02:22.954401 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952850 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 15:02:22.954401 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952853 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 15:02:22.954894 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952855 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 15:02:22.954894 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.952861 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 
15:02:22.954894 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952964 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 15:02:22.954894 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952970 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 15:02:22.954894 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952973 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 15:02:22.954894 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952976 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 15:02:22.954894 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952979 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 15:02:22.954894 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952982 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 15:02:22.954894 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952984 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 15:02:22.954894 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952987 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 15:02:22.954894 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952990 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 15:02:22.954894 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952993 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 15:02:22.954894 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952995 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 15:02:22.954894 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.952998 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 15:02:22.954894 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953000 2575 feature_gate.go:328] unrecognized feature gate: 
MachineAPIMigration Apr 20 15:02:22.954894 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953003 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 15:02:22.955317 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953006 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 15:02:22.955317 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953008 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 15:02:22.955317 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953011 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 15:02:22.955317 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953014 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 15:02:22.955317 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953016 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 15:02:22.955317 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953019 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 15:02:22.955317 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953022 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 15:02:22.955317 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953024 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 15:02:22.955317 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953028 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 15:02:22.955317 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953031 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 15:02:22.955317 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953034 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 15:02:22.955317 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953036 
2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 15:02:22.955317 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953039 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 20 15:02:22.955317 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953042 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 15:02:22.955317 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953046 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 15:02:22.955317 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953050 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 15:02:22.955317 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953053 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 15:02:22.955317 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953055 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 15:02:22.955317 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953058 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 15:02:22.955778 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953061 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 15:02:22.955778 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953064 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 15:02:22.955778 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953066 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 15:02:22.955778 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953068 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 15:02:22.955778 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953071 2575 feature_gate.go:328] 
unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 15:02:22.955778 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953073 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 15:02:22.955778 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953076 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 15:02:22.955778 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953078 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 15:02:22.955778 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953081 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 15:02:22.955778 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953083 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 15:02:22.955778 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953085 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 15:02:22.955778 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953088 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 15:02:22.955778 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953090 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 15:02:22.955778 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953094 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 15:02:22.955778 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953096 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 15:02:22.955778 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953099 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 15:02:22.955778 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953101 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 15:02:22.955778 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953104 
2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 15:02:22.955778 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953106 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 15:02:22.955778 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953109 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 15:02:22.956284 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953112 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 15:02:22.956284 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953115 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 15:02:22.956284 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953118 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 15:02:22.956284 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953121 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 15:02:22.956284 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953123 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 15:02:22.956284 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953126 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 15:02:22.956284 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953128 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 15:02:22.956284 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953131 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 15:02:22.956284 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953133 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 15:02:22.956284 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953136 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 15:02:22.956284 
ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953139 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 15:02:22.956284 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953143 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 15:02:22.956284 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953145 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 15:02:22.956284 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953148 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 15:02:22.956284 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953150 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 15:02:22.956284 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953153 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 15:02:22.956284 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953155 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 15:02:22.956284 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953158 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 15:02:22.956284 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953160 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 15:02:22.956752 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953163 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 15:02:22.956752 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953166 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 15:02:22.956752 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953168 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 15:02:22.956752 ip-10-0-133-198 
kubenswrapper[2575]: W0420 15:02:22.953171 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 15:02:22.956752 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953173 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 15:02:22.956752 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953176 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 15:02:22.956752 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953179 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 15:02:22.956752 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953182 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 15:02:22.956752 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953185 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 15:02:22.956752 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953187 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 15:02:22.956752 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953190 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 15:02:22.956752 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953192 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 15:02:22.956752 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953195 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 15:02:22.956752 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:22.953198 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 15:02:22.956752 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.953203 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false 
NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 15:02:22.956752 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.953907 2575 server.go:962] "Client rotation is on, will bootstrap in background" Apr 20 15:02:22.958423 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.958406 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 20 15:02:22.959258 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.959246 2575 server.go:1019] "Starting client certificate rotation" Apr 20 15:02:22.959324 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.959307 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 15:02:22.959366 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.959349 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 15:02:22.985017 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.984992 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 15:02:22.988321 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.988300 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 15:02:22.998127 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:22.998059 2575 log.go:25] "Validated CRI v1 runtime API" Apr 20 15:02:23.006165 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.006139 2575 log.go:25] "Validated CRI v1 image API" Apr 20 15:02:23.008585 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.008567 2575 server.go:1452] "Using cgroup driver setting 
received from the CRI runtime" cgroupDriver="systemd" Apr 20 15:02:23.012071 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.012047 2575 fs.go:135] Filesystem UUIDs: map[22ce1a56-7553-49b9-89d2-24baa3071aea:/dev/nvme0n1p3 43ed9ccc-6264-413f-8985-8204637bebe7:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2] Apr 20 15:02:23.012171 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.012069 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 20 15:02:23.018964 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.018830 2575 manager.go:217] Machine: {Timestamp:2026-04-20 15:02:23.016906167 +0000 UTC m=+0.389492862 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3103058 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2996ab61272f0e9f97920a1dc0768b SystemUUID:ec2996ab-6127-2f0e-9f97-920a1dc0768b BootID:2338fca5-733d-4cd4-b50f-ed7a7825c7ff Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs 
Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:ab:e2:c8:7f:9b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:ab:e2:c8:7f:9b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:72:e8:93:28:4d:b3 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 20 15:02:23.018964 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.018938 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 20 15:02:23.019119 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.019105 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 20 15:02:23.020225 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.020200 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 20 15:02:23.020493 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.020228 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-198.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 20 15:02:23.020537 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.020509 2575 topology_manager.go:138] "Creating topology manager with none policy" Apr 20 15:02:23.020537 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.020520 2575 container_manager_linux.go:306] "Creating device plugin manager" Apr 20 15:02:23.020537 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.020533 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 15:02:23.020624 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.020476 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 15:02:23.021242 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.021230 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 15:02:23.022726 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.022716 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 20 15:02:23.022838 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.022829 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 20 15:02:23.025077 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.025066 2575 kubelet.go:491] "Attempting to sync node with API server" Apr 20 15:02:23.025112 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.025088 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 20 15:02:23.025112 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.025100 2575 file.go:69] "Watching path" 
path="/etc/kubernetes/manifests" Apr 20 15:02:23.025112 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.025108 2575 kubelet.go:397] "Adding apiserver pod source" Apr 20 15:02:23.025205 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.025117 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 20 15:02:23.026159 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.026145 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 15:02:23.026203 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.026170 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 15:02:23.028947 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.028929 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 15:02:23.030562 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.030549 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 15:02:23.032417 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.032403 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 15:02:23.032472 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.032420 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 15:02:23.032472 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.032426 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 15:02:23.032472 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.032431 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 15:02:23.032472 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.032437 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 
15:02:23.032472 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.032443 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 15:02:23.032472 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.032449 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 20 15:02:23.032472 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.032456 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 15:02:23.032472 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.032463 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 15:02:23.032472 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.032469 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 15:02:23.032472 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.032477 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 15:02:23.032719 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.032486 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 15:02:23.034128 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.034118 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 15:02:23.034164 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.034129 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 15:02:23.037998 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.037984 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 15:02:23.038077 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.038021 2575 server.go:1295] "Started kubelet" Apr 20 15:02:23.038165 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.038123 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 15:02:23.038220 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.038173 2575 ratelimit.go:55] "Setting rate limiting for endpoint" 
service="podresources" qps=100 burstTokens=10 Apr 20 15:02:23.038264 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.038251 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 15:02:23.038739 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:23.038716 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-198.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 20 15:02:23.038833 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.038738 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-133-198.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 20 15:02:23.038897 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:23.038837 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 20 15:02:23.038979 ip-10-0-133-198 systemd[1]: Started Kubernetes Kubelet. 
Apr 20 15:02:23.039520 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.039403 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 20 15:02:23.039577 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.039541 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 15:02:23.045856 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.045833 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 20 15:02:23.046434 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:23.045618 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-198.ec2.internal.18a818d1f528fb40 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-198.ec2.internal,UID:ip-10-0-133-198.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-133-198.ec2.internal,},FirstTimestamp:2026-04-20 15:02:23.03799584 +0000 UTC m=+0.410582535,LastTimestamp:2026-04-20 15:02:23.03799584 +0000 UTC m=+0.410582535,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-198.ec2.internal,}" Apr 20 15:02:23.046434 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.046432 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 15:02:23.047064 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:23.047046 2575 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 20 15:02:23.047257 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:23.047238 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-198.ec2.internal\" not found" Apr 20 15:02:23.047383 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.047304 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 15:02:23.047383 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.047304 2575 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 20 15:02:23.047383 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.047323 2575 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 15:02:23.047534 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.047408 2575 reconstruct.go:97] "Volume reconstruction finished" Apr 20 15:02:23.047534 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.047416 2575 reconciler.go:26] "Reconciler: start to sync state" Apr 20 15:02:23.047796 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.047776 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-sknpk" Apr 20 15:02:23.047883 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.047799 2575 factory.go:55] Registering systemd factory Apr 20 15:02:23.047883 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.047821 2575 factory.go:223] Registration of the systemd container factory successfully Apr 20 15:02:23.048037 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.048026 2575 factory.go:153] Registering CRI-O factory Apr 20 15:02:23.048077 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.048042 2575 factory.go:223] Registration of the crio container factory successfully Apr 20 15:02:23.048108 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.048084 2575 factory.go:221] 
Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 20 15:02:23.048108 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.048100 2575 factory.go:103] Registering Raw factory Apr 20 15:02:23.048179 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.048109 2575 manager.go:1196] Started watching for new ooms in manager Apr 20 15:02:23.048714 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.048701 2575 manager.go:319] Starting recovery of all containers Apr 20 15:02:23.049894 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:23.049863 2575 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-133-198.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 20 15:02:23.050052 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:23.049596 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 20 15:02:23.055917 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.055893 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-sknpk" Apr 20 15:02:23.061375 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.061340 2575 manager.go:324] Recovery completed Apr 20 15:02:23.065723 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.065707 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 15:02:23.069106 ip-10-0-133-198 kubenswrapper[2575]: 
I0420 15:02:23.069086 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-198.ec2.internal" event="NodeHasSufficientMemory" Apr 20 15:02:23.069189 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.069117 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-198.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 15:02:23.069189 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.069127 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-198.ec2.internal" event="NodeHasSufficientPID" Apr 20 15:02:23.069730 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.069715 2575 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 20 15:02:23.069730 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.069729 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 20 15:02:23.069810 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.069745 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 20 15:02:23.071691 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:23.071613 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-198.ec2.internal.18a818d1f703a6ba default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-198.ec2.internal,UID:ip-10-0-133-198.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-133-198.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-133-198.ec2.internal,},FirstTimestamp:2026-04-20 15:02:23.069103802 +0000 UTC m=+0.441690497,LastTimestamp:2026-04-20 15:02:23.069103802 +0000 UTC m=+0.441690497,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-198.ec2.internal,}" Apr 20 15:02:23.072310 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.072296 2575 policy_none.go:49] "None policy: Start" Apr 20 15:02:23.072389 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.072315 2575 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 20 15:02:23.072389 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.072327 2575 state_mem.go:35] "Initializing new in-memory state store" Apr 20 15:02:23.114697 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.109791 2575 manager.go:341] "Starting Device Plugin manager" Apr 20 15:02:23.114697 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:23.109825 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 20 15:02:23.114697 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.109835 2575 server.go:85] "Starting device plugin registration server" Apr 20 15:02:23.114697 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.110095 2575 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 20 15:02:23.114697 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.110109 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 20 15:02:23.114697 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.110180 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 20 15:02:23.114697 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.110294 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 20 15:02:23.114697 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.110303 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 20 15:02:23.114697 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:23.110792 2575 eviction_manager.go:267] "eviction manager: 
failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 20 15:02:23.114697 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:23.110825 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-198.ec2.internal\" not found" Apr 20 15:02:23.175923 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.175874 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 20 15:02:23.177259 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.177242 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 20 15:02:23.177360 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.177286 2575 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 20 15:02:23.177360 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.177306 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 20 15:02:23.177360 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.177314 2575 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 15:02:23.177360 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:23.177353 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 15:02:23.181519 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.181495 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 15:02:23.210589 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.210560 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 15:02:23.211548 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.211534 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-198.ec2.internal" event="NodeHasSufficientMemory" Apr 20 15:02:23.211613 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.211565 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-198.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 15:02:23.211613 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.211575 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-198.ec2.internal" event="NodeHasSufficientPID" Apr 20 15:02:23.211613 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.211599 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-198.ec2.internal" Apr 20 15:02:23.221151 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.221132 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-198.ec2.internal" Apr 20 15:02:23.221200 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:23.221157 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-198.ec2.internal\": node \"ip-10-0-133-198.ec2.internal\" not found" Apr 20 
15:02:23.236548 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:23.236523 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-198.ec2.internal\" not found" Apr 20 15:02:23.278259 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.278177 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-198.ec2.internal"] Apr 20 15:02:23.278259 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.278260 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 15:02:23.280127 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.280109 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-198.ec2.internal" event="NodeHasSufficientMemory" Apr 20 15:02:23.280219 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.280145 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-198.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 15:02:23.280219 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.280160 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-198.ec2.internal" event="NodeHasSufficientPID" Apr 20 15:02:23.281455 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.281440 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 15:02:23.281619 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.281605 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal" Apr 20 15:02:23.281668 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.281636 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 15:02:23.284424 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.284402 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-198.ec2.internal" event="NodeHasSufficientMemory" Apr 20 15:02:23.284424 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.284426 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-198.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 15:02:23.284551 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.284436 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-198.ec2.internal" event="NodeHasSufficientPID" Apr 20 15:02:23.284551 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.284410 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-198.ec2.internal" event="NodeHasSufficientMemory" Apr 20 15:02:23.284551 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.284497 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-198.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 15:02:23.284551 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.284509 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-198.ec2.internal" event="NodeHasSufficientPID" Apr 20 15:02:23.285548 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.285529 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-198.ec2.internal" Apr 20 15:02:23.285650 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.285561 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 15:02:23.288389 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.288370 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-198.ec2.internal" event="NodeHasSufficientMemory" Apr 20 15:02:23.288460 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.288401 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-198.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 15:02:23.288460 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.288414 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-198.ec2.internal" event="NodeHasSufficientPID" Apr 20 15:02:23.311042 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:23.311017 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-198.ec2.internal\" not found" node="ip-10-0-133-198.ec2.internal" Apr 20 15:02:23.314708 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:23.314692 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-198.ec2.internal\" not found" node="ip-10-0-133-198.ec2.internal" Apr 20 15:02:23.337670 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:23.337643 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-198.ec2.internal\" not found" Apr 20 15:02:23.347737 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.347710 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9fc3cee085866514e6ec8498fa3dbfcb-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal\" (UID: \"9fc3cee085866514e6ec8498fa3dbfcb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal" Apr 20 15:02:23.347801 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.347745 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9fc3cee085866514e6ec8498fa3dbfcb-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal\" (UID: \"9fc3cee085866514e6ec8498fa3dbfcb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal" Apr 20 15:02:23.347801 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.347766 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4cf8424ac7d796a78f96e62791daed1d-config\") pod \"kube-apiserver-proxy-ip-10-0-133-198.ec2.internal\" (UID: \"4cf8424ac7d796a78f96e62791daed1d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-198.ec2.internal" Apr 20 15:02:23.437786 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:23.437755 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-198.ec2.internal\" not found" Apr 20 15:02:23.448101 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.448065 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4cf8424ac7d796a78f96e62791daed1d-config\") pod \"kube-apiserver-proxy-ip-10-0-133-198.ec2.internal\" (UID: \"4cf8424ac7d796a78f96e62791daed1d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-198.ec2.internal" Apr 20 15:02:23.448101 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.448100 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/9fc3cee085866514e6ec8498fa3dbfcb-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal\" (UID: \"9fc3cee085866514e6ec8498fa3dbfcb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal" Apr 20 15:02:23.448308 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.448119 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9fc3cee085866514e6ec8498fa3dbfcb-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal\" (UID: \"9fc3cee085866514e6ec8498fa3dbfcb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal" Apr 20 15:02:23.448308 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.448158 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4cf8424ac7d796a78f96e62791daed1d-config\") pod \"kube-apiserver-proxy-ip-10-0-133-198.ec2.internal\" (UID: \"4cf8424ac7d796a78f96e62791daed1d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-198.ec2.internal" Apr 20 15:02:23.448308 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.448165 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9fc3cee085866514e6ec8498fa3dbfcb-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal\" (UID: \"9fc3cee085866514e6ec8498fa3dbfcb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal" Apr 20 15:02:23.448308 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.448164 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9fc3cee085866514e6ec8498fa3dbfcb-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal\" (UID: \"9fc3cee085866514e6ec8498fa3dbfcb\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal" Apr 20 15:02:23.538596 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:23.538513 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-198.ec2.internal\" not found" Apr 20 15:02:23.614012 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.613979 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal" Apr 20 15:02:23.617843 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.617824 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-198.ec2.internal" Apr 20 15:02:23.639221 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:23.639188 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-198.ec2.internal\" not found" Apr 20 15:02:23.739702 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:23.739659 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-198.ec2.internal\" not found" Apr 20 15:02:23.840292 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:23.840198 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-198.ec2.internal\" not found" Apr 20 15:02:23.940754 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:23.940716 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-198.ec2.internal\" not found" Apr 20 15:02:23.959189 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.959166 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 20 15:02:23.959390 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:23.959365 2575 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 15:02:24.040851 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:24.040821 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-198.ec2.internal\" not found" Apr 20 15:02:24.046875 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:24.046854 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 20 15:02:24.059672 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:24.059090 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 14:57:23 +0000 UTC" deadline="2028-01-01 15:23:06.295077048 +0000 UTC" Apr 20 15:02:24.059817 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:24.059675 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14904h20m42.235410115s" Apr 20 15:02:24.064089 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:24.064068 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 15:02:24.085093 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:24.085063 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-57nz6" Apr 20 15:02:24.093193 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:24.093126 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-57nz6" Apr 20 15:02:24.113378 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:24.113350 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" 
reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 15:02:24.141011 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:24.140981 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-198.ec2.internal\" not found" Apr 20 15:02:24.141455 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:24.141429 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cf8424ac7d796a78f96e62791daed1d.slice/crio-e4d41e8a217d1841293a062647d67006c5da93578eae8dd877298bb01b6a59ac WatchSource:0}: Error finding container e4d41e8a217d1841293a062647d67006c5da93578eae8dd877298bb01b6a59ac: Status 404 returned error can't find the container with id e4d41e8a217d1841293a062647d67006c5da93578eae8dd877298bb01b6a59ac Apr 20 15:02:24.146508 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:24.146484 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 15:02:24.161181 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:24.161151 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fc3cee085866514e6ec8498fa3dbfcb.slice/crio-901c4a0a6a350ddb3d5d3f796a5e85055dfc585c76b12e57b61b4b6021839bc0 WatchSource:0}: Error finding container 901c4a0a6a350ddb3d5d3f796a5e85055dfc585c76b12e57b61b4b6021839bc0: Status 404 returned error can't find the container with id 901c4a0a6a350ddb3d5d3f796a5e85055dfc585c76b12e57b61b4b6021839bc0 Apr 20 15:02:24.180226 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:24.180175 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal" event={"ID":"9fc3cee085866514e6ec8498fa3dbfcb","Type":"ContainerStarted","Data":"901c4a0a6a350ddb3d5d3f796a5e85055dfc585c76b12e57b61b4b6021839bc0"} Apr 20 15:02:24.181183 ip-10-0-133-198 kubenswrapper[2575]: I0420 
15:02:24.181156 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-198.ec2.internal" event={"ID":"4cf8424ac7d796a78f96e62791daed1d","Type":"ContainerStarted","Data":"e4d41e8a217d1841293a062647d67006c5da93578eae8dd877298bb01b6a59ac"} Apr 20 15:02:24.241373 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:24.241335 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-198.ec2.internal\" not found" Apr 20 15:02:24.275454 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:24.275423 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 15:02:24.341727 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:24.341676 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-198.ec2.internal\" not found" Apr 20 15:02:24.400675 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:24.400620 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 15:02:24.447694 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:24.447664 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal" Apr 20 15:02:24.458614 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:24.458581 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 15:02:24.460928 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:24.460908 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-198.ec2.internal" Apr 20 15:02:24.468926 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:24.468904 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in 
surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 15:02:24.975940 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:24.975914 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 15:02:25.026150 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.026117 2575 apiserver.go:52] "Watching apiserver" Apr 20 15:02:25.034658 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.034631 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 15:02:25.037178 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.037148 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-133-198.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal","openshift-multus/network-metrics-daemon-lnqrj","openshift-network-diagnostics/network-check-target-fbms8","openshift-network-operator/iptables-alerter-lvgwg","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9pwx","openshift-cluster-node-tuning-operator/tuned-wcr5b","openshift-image-registry/node-ca-flrxv","openshift-multus/multus-additional-cni-plugins-5mqkf","openshift-multus/multus-z9bz2","openshift-ovn-kubernetes/ovnkube-node-sdgqx","kube-system/konnectivity-agent-t94xd"] Apr 20 15:02:25.039302 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.039260 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnqrj" Apr 20 15:02:25.039471 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:25.039355 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lnqrj" podUID="ee584f46-b9aa-46b2-a060-01c6f4e256e9" Apr 20 15:02:25.041571 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.041546 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fbms8" Apr 20 15:02:25.041677 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:25.041613 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fbms8" podUID="5e85baf4-757f-4b92-b2be-cb821cc1b33e" Apr 20 15:02:25.042691 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.042671 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-lvgwg" Apr 20 15:02:25.043935 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.043914 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9pwx" Apr 20 15:02:25.044966 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.044947 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 15:02:25.045466 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.045225 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 15:02:25.045466 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.045244 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 15:02:25.045466 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.045322 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-fgj9r\"" Apr 20 15:02:25.046044 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.046028 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 15:02:25.046995 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.046301 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.046995 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.046413 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-flrxv" Apr 20 15:02:25.046995 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.046929 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-jbg2c\"" Apr 20 15:02:25.047205 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.047077 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 15:02:25.047205 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.046932 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 15:02:25.048067 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.048048 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5mqkf" Apr 20 15:02:25.048811 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.048791 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 15:02:25.048903 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.048851 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-tfhgl\"" Apr 20 15:02:25.048960 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.048795 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 15:02:25.049257 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.049242 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.053374 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.050833 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 15:02:25.053374 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.051431 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 15:02:25.053374 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.051607 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-xrl2v\"" Apr 20 15:02:25.053374 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.051674 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 15:02:25.053374 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.052121 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 15:02:25.053374 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.052221 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 15:02:25.053374 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.052335 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-drcvv\"" Apr 20 15:02:25.053374 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.052592 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 15:02:25.053374 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.052769 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 15:02:25.053374 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.052852 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 15:02:25.053374 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.053210 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 15:02:25.055098 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.054334 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.056207 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.056154 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-z2vjb\"" Apr 20 15:02:25.056599 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.056560 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d5149160-8e4b-48af-aaf9-f76ba9a5abfb-host-slash\") pod \"iptables-alerter-lvgwg\" (UID: \"d5149160-8e4b-48af-aaf9-f76ba9a5abfb\") " pod="openshift-network-operator/iptables-alerter-lvgwg" Apr 20 15:02:25.056599 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.056595 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpdt5\" (UniqueName: \"kubernetes.io/projected/d5149160-8e4b-48af-aaf9-f76ba9a5abfb-kube-api-access-mpdt5\") pod \"iptables-alerter-lvgwg\" (UID: \"d5149160-8e4b-48af-aaf9-f76ba9a5abfb\") " pod="openshift-network-operator/iptables-alerter-lvgwg" Apr 20 15:02:25.056758 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.056623 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/f9489b29-6fd7-4b56-abbc-9557d736b76c-socket-dir\") pod \"aws-ebs-csi-driver-node-v9pwx\" (UID: \"f9489b29-6fd7-4b56-abbc-9557d736b76c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9pwx" Apr 20 15:02:25.056758 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.056647 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f9489b29-6fd7-4b56-abbc-9557d736b76c-etc-selinux\") pod \"aws-ebs-csi-driver-node-v9pwx\" (UID: \"f9489b29-6fd7-4b56-abbc-9557d736b76c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9pwx" Apr 20 15:02:25.056758 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.056671 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f9489b29-6fd7-4b56-abbc-9557d736b76c-sys-fs\") pod \"aws-ebs-csi-driver-node-v9pwx\" (UID: \"f9489b29-6fd7-4b56-abbc-9557d736b76c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9pwx" Apr 20 15:02:25.056758 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.056691 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f50b30d6-0d2f-4686-a3f8-128ef485713a-tmp\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.056758 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.056717 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-cg2g6\"" Apr 20 15:02:25.056758 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.056730 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/68cb8276-452d-4742-adc2-9eb67152cc05-cni-binary-copy\") pod \"multus-additional-cni-plugins-5mqkf\" (UID: \"68cb8276-452d-4742-adc2-9eb67152cc05\") " pod="openshift-multus/multus-additional-cni-plugins-5mqkf" Apr 20 15:02:25.056758 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.056757 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/86824264-d16e-4d82-854b-f1f5bc86483c-serviceca\") pod \"node-ca-flrxv\" (UID: \"86824264-d16e-4d82-854b-f1f5bc86483c\") " pod="openshift-image-registry/node-ca-flrxv" Apr 20 15:02:25.057095 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.056785 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d5149160-8e4b-48af-aaf9-f76ba9a5abfb-iptables-alerter-script\") pod \"iptables-alerter-lvgwg\" (UID: \"d5149160-8e4b-48af-aaf9-f76ba9a5abfb\") " pod="openshift-network-operator/iptables-alerter-lvgwg" Apr 20 15:02:25.057095 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.056810 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f9489b29-6fd7-4b56-abbc-9557d736b76c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-v9pwx\" (UID: \"f9489b29-6fd7-4b56-abbc-9557d736b76c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9pwx" Apr 20 15:02:25.057095 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.056833 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f50b30d6-0d2f-4686-a3f8-128ef485713a-etc-sysctl-conf\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.057095 ip-10-0-133-198 
kubenswrapper[2575]: I0420 15:02:25.056857 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f50b30d6-0d2f-4686-a3f8-128ef485713a-lib-modules\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.057095 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.056881 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f50b30d6-0d2f-4686-a3f8-128ef485713a-etc-tuned\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.057095 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.056905 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 15:02:25.057095 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.056904 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgf8f\" (UniqueName: \"kubernetes.io/projected/86824264-d16e-4d82-854b-f1f5bc86483c-kube-api-access-qgf8f\") pod \"node-ca-flrxv\" (UID: \"86824264-d16e-4d82-854b-f1f5bc86483c\") " pod="openshift-image-registry/node-ca-flrxv" Apr 20 15:02:25.057095 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.056966 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f50b30d6-0d2f-4686-a3f8-128ef485713a-etc-sysctl-d\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.057095 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.057000 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/68cb8276-452d-4742-adc2-9eb67152cc05-cnibin\") pod \"multus-additional-cni-plugins-5mqkf\" (UID: \"68cb8276-452d-4742-adc2-9eb67152cc05\") " pod="openshift-multus/multus-additional-cni-plugins-5mqkf" Apr 20 15:02:25.057095 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.057029 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhkgn\" (UniqueName: \"kubernetes.io/projected/5e85baf4-757f-4b92-b2be-cb821cc1b33e-kube-api-access-vhkgn\") pod \"network-check-target-fbms8\" (UID: \"5e85baf4-757f-4b92-b2be-cb821cc1b33e\") " pod="openshift-network-diagnostics/network-check-target-fbms8" Apr 20 15:02:25.057095 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.057088 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f9489b29-6fd7-4b56-abbc-9557d736b76c-device-dir\") pod \"aws-ebs-csi-driver-node-v9pwx\" (UID: \"f9489b29-6fd7-4b56-abbc-9557d736b76c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9pwx" Apr 20 15:02:25.057679 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.057115 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f50b30d6-0d2f-4686-a3f8-128ef485713a-etc-kubernetes\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.057679 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.057137 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/68cb8276-452d-4742-adc2-9eb67152cc05-system-cni-dir\") pod \"multus-additional-cni-plugins-5mqkf\" (UID: 
\"68cb8276-452d-4742-adc2-9eb67152cc05\") " pod="openshift-multus/multus-additional-cni-plugins-5mqkf" Apr 20 15:02:25.057679 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.057151 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f50b30d6-0d2f-4686-a3f8-128ef485713a-etc-systemd\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.057679 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.057166 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/68cb8276-452d-4742-adc2-9eb67152cc05-os-release\") pod \"multus-additional-cni-plugins-5mqkf\" (UID: \"68cb8276-452d-4742-adc2-9eb67152cc05\") " pod="openshift-multus/multus-additional-cni-plugins-5mqkf" Apr 20 15:02:25.057679 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.057190 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/68cb8276-452d-4742-adc2-9eb67152cc05-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5mqkf\" (UID: \"68cb8276-452d-4742-adc2-9eb67152cc05\") " pod="openshift-multus/multus-additional-cni-plugins-5mqkf" Apr 20 15:02:25.057679 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.057215 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f50b30d6-0d2f-4686-a3f8-128ef485713a-etc-modprobe-d\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.057679 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.057245 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f50b30d6-0d2f-4686-a3f8-128ef485713a-host\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.057679 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.057289 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/68cb8276-452d-4742-adc2-9eb67152cc05-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5mqkf\" (UID: \"68cb8276-452d-4742-adc2-9eb67152cc05\") " pod="openshift-multus/multus-additional-cni-plugins-5mqkf" Apr 20 15:02:25.057679 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.057317 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g24nx\" (UniqueName: \"kubernetes.io/projected/ee584f46-b9aa-46b2-a060-01c6f4e256e9-kube-api-access-g24nx\") pod \"network-metrics-daemon-lnqrj\" (UID: \"ee584f46-b9aa-46b2-a060-01c6f4e256e9\") " pod="openshift-multus/network-metrics-daemon-lnqrj" Apr 20 15:02:25.057679 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.057357 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f9489b29-6fd7-4b56-abbc-9557d736b76c-registration-dir\") pod \"aws-ebs-csi-driver-node-v9pwx\" (UID: \"f9489b29-6fd7-4b56-abbc-9557d736b76c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9pwx" Apr 20 15:02:25.057679 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.057381 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgb2k\" (UniqueName: \"kubernetes.io/projected/f9489b29-6fd7-4b56-abbc-9557d736b76c-kube-api-access-xgb2k\") pod \"aws-ebs-csi-driver-node-v9pwx\" 
(UID: \"f9489b29-6fd7-4b56-abbc-9557d736b76c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9pwx" Apr 20 15:02:25.057679 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.057405 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f50b30d6-0d2f-4686-a3f8-128ef485713a-run\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.057679 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.057426 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f50b30d6-0d2f-4686-a3f8-128ef485713a-sys\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.057679 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.057458 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f50b30d6-0d2f-4686-a3f8-128ef485713a-var-lib-kubelet\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.057679 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.057482 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/68cb8276-452d-4742-adc2-9eb67152cc05-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5mqkf\" (UID: \"68cb8276-452d-4742-adc2-9eb67152cc05\") " pod="openshift-multus/multus-additional-cni-plugins-5mqkf" Apr 20 15:02:25.057679 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.057505 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" 
(UniqueName: \"kubernetes.io/host-path/f50b30d6-0d2f-4686-a3f8-128ef485713a-etc-sysconfig\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.058413 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.057527 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb7qp\" (UniqueName: \"kubernetes.io/projected/f50b30d6-0d2f-4686-a3f8-128ef485713a-kube-api-access-kb7qp\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.058413 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.057604 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk96v\" (UniqueName: \"kubernetes.io/projected/68cb8276-452d-4742-adc2-9eb67152cc05-kube-api-access-gk96v\") pod \"multus-additional-cni-plugins-5mqkf\" (UID: \"68cb8276-452d-4742-adc2-9eb67152cc05\") " pod="openshift-multus/multus-additional-cni-plugins-5mqkf" Apr 20 15:02:25.058413 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.057632 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/86824264-d16e-4d82-854b-f1f5bc86483c-host\") pod \"node-ca-flrxv\" (UID: \"86824264-d16e-4d82-854b-f1f5bc86483c\") " pod="openshift-image-registry/node-ca-flrxv" Apr 20 15:02:25.058413 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.058054 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 15:02:25.058413 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.058138 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 15:02:25.058413 ip-10-0-133-198 kubenswrapper[2575]: 
I0420 15:02:25.058304 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-t94xd" Apr 20 15:02:25.058692 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.058502 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 15:02:25.058692 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.057667 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee584f46-b9aa-46b2-a060-01c6f4e256e9-metrics-certs\") pod \"network-metrics-daemon-lnqrj\" (UID: \"ee584f46-b9aa-46b2-a060-01c6f4e256e9\") " pod="openshift-multus/network-metrics-daemon-lnqrj" Apr 20 15:02:25.058692 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.058662 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 15:02:25.058692 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.058682 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 15:02:25.060656 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.060632 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-cb8qj\"" Apr 20 15:02:25.060830 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.060787 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 15:02:25.060902 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.060878 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 15:02:25.094026 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.093979 2575 certificate_manager.go:715] "Certificate rotation deadline determined" 
logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 14:57:24 +0000 UTC" deadline="2027-11-28 00:58:09.981204394 +0000 UTC" Apr 20 15:02:25.094026 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.094017 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14073h55m44.887191597s" Apr 20 15:02:25.148611 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.148581 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 15:02:25.159098 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.158957 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xpcp\" (UniqueName: \"kubernetes.io/projected/6318e638-1067-4e23-94f4-dad4de00297a-kube-api-access-5xpcp\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.159098 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.159006 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-etc-openvswitch\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.159098 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.159053 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f50b30d6-0d2f-4686-a3f8-128ef485713a-etc-modprobe-d\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.159401 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.159117 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g24nx\" (UniqueName: 
\"kubernetes.io/projected/ee584f46-b9aa-46b2-a060-01c6f4e256e9-kube-api-access-g24nx\") pod \"network-metrics-daemon-lnqrj\" (UID: \"ee584f46-b9aa-46b2-a060-01c6f4e256e9\") " pod="openshift-multus/network-metrics-daemon-lnqrj" Apr 20 15:02:25.159401 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.159154 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-host-cni-netd\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.159401 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.159186 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpgcg\" (UniqueName: \"kubernetes.io/projected/6d67f43f-b926-4199-ae7a-ccf686190d9b-kube-api-access-lpgcg\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.159401 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.159222 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xgb2k\" (UniqueName: \"kubernetes.io/projected/f9489b29-6fd7-4b56-abbc-9557d736b76c-kube-api-access-xgb2k\") pod \"aws-ebs-csi-driver-node-v9pwx\" (UID: \"f9489b29-6fd7-4b56-abbc-9557d736b76c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9pwx" Apr 20 15:02:25.159401 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.159241 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f50b30d6-0d2f-4686-a3f8-128ef485713a-etc-modprobe-d\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.159401 ip-10-0-133-198 kubenswrapper[2575]: I0420 
15:02:25.159306 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f50b30d6-0d2f-4686-a3f8-128ef485713a-sys\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.159401 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.159356 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f50b30d6-0d2f-4686-a3f8-128ef485713a-etc-sysconfig\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.159401 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.159398 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gk96v\" (UniqueName: \"kubernetes.io/projected/68cb8276-452d-4742-adc2-9eb67152cc05-kube-api-access-gk96v\") pod \"multus-additional-cni-plugins-5mqkf\" (UID: \"68cb8276-452d-4742-adc2-9eb67152cc05\") " pod="openshift-multus/multus-additional-cni-plugins-5mqkf" Apr 20 15:02:25.159834 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.159429 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/86824264-d16e-4d82-854b-f1f5bc86483c-host\") pod \"node-ca-flrxv\" (UID: \"86824264-d16e-4d82-854b-f1f5bc86483c\") " pod="openshift-image-registry/node-ca-flrxv" Apr 20 15:02:25.159834 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.159471 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee584f46-b9aa-46b2-a060-01c6f4e256e9-metrics-certs\") pod \"network-metrics-daemon-lnqrj\" (UID: \"ee584f46-b9aa-46b2-a060-01c6f4e256e9\") " pod="openshift-multus/network-metrics-daemon-lnqrj" Apr 20 15:02:25.159834 ip-10-0-133-198 
kubenswrapper[2575]: I0420 15:02:25.159503 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-host-var-lib-cni-multus\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.159834 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.159525 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f50b30d6-0d2f-4686-a3f8-128ef485713a-etc-sysconfig\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.159834 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.159537 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-log-socket\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.159834 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.159569 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6d67f43f-b926-4199-ae7a-ccf686190d9b-ovn-node-metrics-cert\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.159834 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:25.159705 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 15:02:25.159834 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.159775 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f50b30d6-0d2f-4686-a3f8-128ef485713a-sys\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.159834 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:25.159791 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee584f46-b9aa-46b2-a060-01c6f4e256e9-metrics-certs podName:ee584f46-b9aa-46b2-a060-01c6f4e256e9 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:25.659767536 +0000 UTC m=+3.032354241 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee584f46-b9aa-46b2-a060-01c6f4e256e9-metrics-certs") pod "network-metrics-daemon-lnqrj" (UID: "ee584f46-b9aa-46b2-a060-01c6f4e256e9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 15:02:25.160312 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.159845 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/86824264-d16e-4d82-854b-f1f5bc86483c-host\") pod \"node-ca-flrxv\" (UID: \"86824264-d16e-4d82-854b-f1f5bc86483c\") " pod="openshift-image-registry/node-ca-flrxv" Apr 20 15:02:25.160312 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.159600 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-host-var-lib-cni-bin\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.160312 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.160098 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d5149160-8e4b-48af-aaf9-f76ba9a5abfb-host-slash\") pod \"iptables-alerter-lvgwg\" (UID: 
\"d5149160-8e4b-48af-aaf9-f76ba9a5abfb\") " pod="openshift-network-operator/iptables-alerter-lvgwg" Apr 20 15:02:25.160312 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.160183 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mpdt5\" (UniqueName: \"kubernetes.io/projected/d5149160-8e4b-48af-aaf9-f76ba9a5abfb-kube-api-access-mpdt5\") pod \"iptables-alerter-lvgwg\" (UID: \"d5149160-8e4b-48af-aaf9-f76ba9a5abfb\") " pod="openshift-network-operator/iptables-alerter-lvgwg" Apr 20 15:02:25.160312 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.160249 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f9489b29-6fd7-4b56-abbc-9557d736b76c-socket-dir\") pod \"aws-ebs-csi-driver-node-v9pwx\" (UID: \"f9489b29-6fd7-4b56-abbc-9557d736b76c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9pwx" Apr 20 15:02:25.160669 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.160645 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f9489b29-6fd7-4b56-abbc-9557d736b76c-etc-selinux\") pod \"aws-ebs-csi-driver-node-v9pwx\" (UID: \"f9489b29-6fd7-4b56-abbc-9557d736b76c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9pwx" Apr 20 15:02:25.160773 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.160684 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f9489b29-6fd7-4b56-abbc-9557d736b76c-sys-fs\") pod \"aws-ebs-csi-driver-node-v9pwx\" (UID: \"f9489b29-6fd7-4b56-abbc-9557d736b76c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9pwx" Apr 20 15:02:25.160773 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.160709 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/d5149160-8e4b-48af-aaf9-f76ba9a5abfb-host-slash\") pod \"iptables-alerter-lvgwg\" (UID: \"d5149160-8e4b-48af-aaf9-f76ba9a5abfb\") " pod="openshift-network-operator/iptables-alerter-lvgwg" Apr 20 15:02:25.160773 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.160723 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-multus-cni-dir\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.160773 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.160745 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-os-release\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.160773 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.160767 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f9489b29-6fd7-4b56-abbc-9557d736b76c-socket-dir\") pod \"aws-ebs-csi-driver-node-v9pwx\" (UID: \"f9489b29-6fd7-4b56-abbc-9557d736b76c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9pwx" Apr 20 15:02:25.161009 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.160770 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d5149160-8e4b-48af-aaf9-f76ba9a5abfb-iptables-alerter-script\") pod \"iptables-alerter-lvgwg\" (UID: \"d5149160-8e4b-48af-aaf9-f76ba9a5abfb\") " pod="openshift-network-operator/iptables-alerter-lvgwg" Apr 20 15:02:25.161009 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.160808 2575 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f50b30d6-0d2f-4686-a3f8-128ef485713a-etc-sysctl-conf\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.161009 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.160812 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f9489b29-6fd7-4b56-abbc-9557d736b76c-sys-fs\") pod \"aws-ebs-csi-driver-node-v9pwx\" (UID: \"f9489b29-6fd7-4b56-abbc-9557d736b76c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9pwx" Apr 20 15:02:25.161009 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.160824 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f50b30d6-0d2f-4686-a3f8-128ef485713a-lib-modules\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.161009 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.160842 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qgf8f\" (UniqueName: \"kubernetes.io/projected/86824264-d16e-4d82-854b-f1f5bc86483c-kube-api-access-qgf8f\") pod \"node-ca-flrxv\" (UID: \"86824264-d16e-4d82-854b-f1f5bc86483c\") " pod="openshift-image-registry/node-ca-flrxv" Apr 20 15:02:25.161009 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.160861 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-host-var-lib-kubelet\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.161009 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.160874 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f9489b29-6fd7-4b56-abbc-9557d736b76c-etc-selinux\") pod \"aws-ebs-csi-driver-node-v9pwx\" (UID: \"f9489b29-6fd7-4b56-abbc-9557d736b76c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9pwx" Apr 20 15:02:25.161009 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.160889 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-hostroot\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.161009 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.160906 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-host-run-k8s-cni-cncf-io\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.161009 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.160930 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f50b30d6-0d2f-4686-a3f8-128ef485713a-etc-sysctl-d\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.161009 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.160954 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/68cb8276-452d-4742-adc2-9eb67152cc05-cnibin\") pod \"multus-additional-cni-plugins-5mqkf\" (UID: \"68cb8276-452d-4742-adc2-9eb67152cc05\") " pod="openshift-multus/multus-additional-cni-plugins-5mqkf" Apr 20 15:02:25.161009 ip-10-0-133-198 kubenswrapper[2575]: I0420 
15:02:25.160993 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-systemd-units\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.161009 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161003 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f50b30d6-0d2f-4686-a3f8-128ef485713a-lib-modules\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.161626 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161025 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-host-slash\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.161626 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161000 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/68cb8276-452d-4742-adc2-9eb67152cc05-cnibin\") pod \"multus-additional-cni-plugins-5mqkf\" (UID: \"68cb8276-452d-4742-adc2-9eb67152cc05\") " pod="openshift-multus/multus-additional-cni-plugins-5mqkf" Apr 20 15:02:25.161626 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161061 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f50b30d6-0d2f-4686-a3f8-128ef485713a-etc-sysctl-conf\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.161626 ip-10-0-133-198 
kubenswrapper[2575]: I0420 15:02:25.161111 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f50b30d6-0d2f-4686-a3f8-128ef485713a-etc-sysctl-d\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.161626 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161143 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-node-log\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.161626 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161174 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6d67f43f-b926-4199-ae7a-ccf686190d9b-ovnkube-config\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.161626 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161197 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6d67f43f-b926-4199-ae7a-ccf686190d9b-env-overrides\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.161626 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161246 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/86bce86e-ed68-41f9-be80-acf37bfb646f-agent-certs\") pod \"konnectivity-agent-t94xd\" (UID: \"86bce86e-ed68-41f9-be80-acf37bfb646f\") " 
pod="kube-system/konnectivity-agent-t94xd" Apr 20 15:02:25.161626 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161285 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-etc-kubernetes\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.161626 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161301 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d5149160-8e4b-48af-aaf9-f76ba9a5abfb-iptables-alerter-script\") pod \"iptables-alerter-lvgwg\" (UID: \"d5149160-8e4b-48af-aaf9-f76ba9a5abfb\") " pod="openshift-network-operator/iptables-alerter-lvgwg" Apr 20 15:02:25.161626 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161312 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-host-kubelet\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.161626 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161334 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-run-openvswitch\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.161626 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161351 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.161626 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161367 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-system-cni-dir\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.161626 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161398 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f50b30d6-0d2f-4686-a3f8-128ef485713a-host\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.161626 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161426 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/68cb8276-452d-4742-adc2-9eb67152cc05-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5mqkf\" (UID: \"68cb8276-452d-4742-adc2-9eb67152cc05\") " pod="openshift-multus/multus-additional-cni-plugins-5mqkf" Apr 20 15:02:25.161626 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161474 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6318e638-1067-4e23-94f4-dad4de00297a-cni-binary-copy\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.162453 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161478 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f50b30d6-0d2f-4686-a3f8-128ef485713a-host\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.162453 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161512 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/86bce86e-ed68-41f9-be80-acf37bfb646f-konnectivity-ca\") pod \"konnectivity-agent-t94xd\" (UID: \"86bce86e-ed68-41f9-be80-acf37bfb646f\") " pod="kube-system/konnectivity-agent-t94xd" Apr 20 15:02:25.162453 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161554 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-host-run-multus-certs\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.162453 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161588 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f9489b29-6fd7-4b56-abbc-9557d736b76c-registration-dir\") pod \"aws-ebs-csi-driver-node-v9pwx\" (UID: \"f9489b29-6fd7-4b56-abbc-9557d736b76c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9pwx" Apr 20 15:02:25.162453 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161614 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f50b30d6-0d2f-4686-a3f8-128ef485713a-run\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.162453 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161639 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f50b30d6-0d2f-4686-a3f8-128ef485713a-var-lib-kubelet\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.162453 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161663 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/68cb8276-452d-4742-adc2-9eb67152cc05-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5mqkf\" (UID: \"68cb8276-452d-4742-adc2-9eb67152cc05\") " pod="openshift-multus/multus-additional-cni-plugins-5mqkf" Apr 20 15:02:25.162453 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161682 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f9489b29-6fd7-4b56-abbc-9557d736b76c-registration-dir\") pod \"aws-ebs-csi-driver-node-v9pwx\" (UID: \"f9489b29-6fd7-4b56-abbc-9557d736b76c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9pwx" Apr 20 15:02:25.162453 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161691 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-multus-conf-dir\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.162453 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161716 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-var-lib-openvswitch\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.162453 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161736 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f50b30d6-0d2f-4686-a3f8-128ef485713a-var-lib-kubelet\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.162453 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161741 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-host-cni-bin\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.162453 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161785 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f50b30d6-0d2f-4686-a3f8-128ef485713a-run\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.162453 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161813 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kb7qp\" (UniqueName: \"kubernetes.io/projected/f50b30d6-0d2f-4686-a3f8-128ef485713a-kube-api-access-kb7qp\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.162453 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161841 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-cnibin\") pod \"multus-z9bz2\" (UID: 
\"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.162453 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161865 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6d67f43f-b926-4199-ae7a-ccf686190d9b-ovnkube-script-lib\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.162453 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161890 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f50b30d6-0d2f-4686-a3f8-128ef485713a-tmp\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.163144 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161895 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/68cb8276-452d-4742-adc2-9eb67152cc05-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5mqkf\" (UID: \"68cb8276-452d-4742-adc2-9eb67152cc05\") " pod="openshift-multus/multus-additional-cni-plugins-5mqkf" Apr 20 15:02:25.163144 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161914 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/68cb8276-452d-4742-adc2-9eb67152cc05-cni-binary-copy\") pod \"multus-additional-cni-plugins-5mqkf\" (UID: \"68cb8276-452d-4742-adc2-9eb67152cc05\") " pod="openshift-multus/multus-additional-cni-plugins-5mqkf" Apr 20 15:02:25.163144 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161935 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/68cb8276-452d-4742-adc2-9eb67152cc05-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5mqkf\" (UID: \"68cb8276-452d-4742-adc2-9eb67152cc05\") " pod="openshift-multus/multus-additional-cni-plugins-5mqkf" Apr 20 15:02:25.163144 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161939 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/86824264-d16e-4d82-854b-f1f5bc86483c-serviceca\") pod \"node-ca-flrxv\" (UID: \"86824264-d16e-4d82-854b-f1f5bc86483c\") " pod="openshift-image-registry/node-ca-flrxv" Apr 20 15:02:25.163144 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.161979 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-host-run-netns\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.163144 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.162009 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f9489b29-6fd7-4b56-abbc-9557d736b76c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-v9pwx\" (UID: \"f9489b29-6fd7-4b56-abbc-9557d736b76c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9pwx" Apr 20 15:02:25.163144 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.162036 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f50b30d6-0d2f-4686-a3f8-128ef485713a-etc-tuned\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.163144 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.162061 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6318e638-1067-4e23-94f4-dad4de00297a-multus-daemon-config\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.163144 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.162089 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-multus-socket-dir-parent\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.163144 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.162106 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f9489b29-6fd7-4b56-abbc-9557d736b76c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-v9pwx\" (UID: \"f9489b29-6fd7-4b56-abbc-9557d736b76c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9pwx" Apr 20 15:02:25.163144 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.162112 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-run-systemd\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.163144 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.162147 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-host-run-ovn-kubernetes\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 
20 15:02:25.163144 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.162174 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhkgn\" (UniqueName: \"kubernetes.io/projected/5e85baf4-757f-4b92-b2be-cb821cc1b33e-kube-api-access-vhkgn\") pod \"network-check-target-fbms8\" (UID: \"5e85baf4-757f-4b92-b2be-cb821cc1b33e\") " pod="openshift-network-diagnostics/network-check-target-fbms8" Apr 20 15:02:25.163144 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.162200 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f9489b29-6fd7-4b56-abbc-9557d736b76c-device-dir\") pod \"aws-ebs-csi-driver-node-v9pwx\" (UID: \"f9489b29-6fd7-4b56-abbc-9557d736b76c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9pwx" Apr 20 15:02:25.163144 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.162223 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f50b30d6-0d2f-4686-a3f8-128ef485713a-etc-kubernetes\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.163144 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.162247 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/68cb8276-452d-4742-adc2-9eb67152cc05-system-cni-dir\") pod \"multus-additional-cni-plugins-5mqkf\" (UID: \"68cb8276-452d-4742-adc2-9eb67152cc05\") " pod="openshift-multus/multus-additional-cni-plugins-5mqkf" Apr 20 15:02:25.163144 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.162266 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 15:02:25.163920 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.162290 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-host-run-netns\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.163920 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.162307 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f50b30d6-0d2f-4686-a3f8-128ef485713a-etc-kubernetes\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.163920 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.162326 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-run-ovn\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.163920 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.162328 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f9489b29-6fd7-4b56-abbc-9557d736b76c-device-dir\") pod \"aws-ebs-csi-driver-node-v9pwx\" (UID: \"f9489b29-6fd7-4b56-abbc-9557d736b76c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9pwx" Apr 20 15:02:25.163920 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.162352 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/f50b30d6-0d2f-4686-a3f8-128ef485713a-etc-systemd\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.163920 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.162356 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/86824264-d16e-4d82-854b-f1f5bc86483c-serviceca\") pod \"node-ca-flrxv\" (UID: \"86824264-d16e-4d82-854b-f1f5bc86483c\") " pod="openshift-image-registry/node-ca-flrxv" Apr 20 15:02:25.163920 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.162359 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/68cb8276-452d-4742-adc2-9eb67152cc05-system-cni-dir\") pod \"multus-additional-cni-plugins-5mqkf\" (UID: \"68cb8276-452d-4742-adc2-9eb67152cc05\") " pod="openshift-multus/multus-additional-cni-plugins-5mqkf" Apr 20 15:02:25.163920 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.162376 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/68cb8276-452d-4742-adc2-9eb67152cc05-os-release\") pod \"multus-additional-cni-plugins-5mqkf\" (UID: \"68cb8276-452d-4742-adc2-9eb67152cc05\") " pod="openshift-multus/multus-additional-cni-plugins-5mqkf" Apr 20 15:02:25.163920 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.162411 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/68cb8276-452d-4742-adc2-9eb67152cc05-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5mqkf\" (UID: \"68cb8276-452d-4742-adc2-9eb67152cc05\") " pod="openshift-multus/multus-additional-cni-plugins-5mqkf" Apr 20 15:02:25.163920 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.162412 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f50b30d6-0d2f-4686-a3f8-128ef485713a-etc-systemd\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.163920 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.162434 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/68cb8276-452d-4742-adc2-9eb67152cc05-os-release\") pod \"multus-additional-cni-plugins-5mqkf\" (UID: \"68cb8276-452d-4742-adc2-9eb67152cc05\") " pod="openshift-multus/multus-additional-cni-plugins-5mqkf" Apr 20 15:02:25.163920 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.162481 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/68cb8276-452d-4742-adc2-9eb67152cc05-cni-binary-copy\") pod \"multus-additional-cni-plugins-5mqkf\" (UID: \"68cb8276-452d-4742-adc2-9eb67152cc05\") " pod="openshift-multus/multus-additional-cni-plugins-5mqkf" Apr 20 15:02:25.163920 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.162864 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/68cb8276-452d-4742-adc2-9eb67152cc05-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5mqkf\" (UID: \"68cb8276-452d-4742-adc2-9eb67152cc05\") " pod="openshift-multus/multus-additional-cni-plugins-5mqkf" Apr 20 15:02:25.165674 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.165649 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f50b30d6-0d2f-4686-a3f8-128ef485713a-etc-tuned\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.165762 ip-10-0-133-198 
kubenswrapper[2575]: I0420 15:02:25.165698 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f50b30d6-0d2f-4686-a3f8-128ef485713a-tmp\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.168521 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.168501 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g24nx\" (UniqueName: \"kubernetes.io/projected/ee584f46-b9aa-46b2-a060-01c6f4e256e9-kube-api-access-g24nx\") pod \"network-metrics-daemon-lnqrj\" (UID: \"ee584f46-b9aa-46b2-a060-01c6f4e256e9\") " pod="openshift-multus/network-metrics-daemon-lnqrj" Apr 20 15:02:25.170578 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:25.170314 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 15:02:25.170578 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:25.170336 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 15:02:25.170578 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:25.170351 2575 projected.go:194] Error preparing data for projected volume kube-api-access-vhkgn for pod openshift-network-diagnostics/network-check-target-fbms8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 15:02:25.170578 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:25.170424 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5e85baf4-757f-4b92-b2be-cb821cc1b33e-kube-api-access-vhkgn podName:5e85baf4-757f-4b92-b2be-cb821cc1b33e nodeName:}" failed. 
No retries permitted until 2026-04-20 15:02:25.670405585 +0000 UTC m=+3.042992287 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-vhkgn" (UniqueName: "kubernetes.io/projected/5e85baf4-757f-4b92-b2be-cb821cc1b33e-kube-api-access-vhkgn") pod "network-check-target-fbms8" (UID: "5e85baf4-757f-4b92-b2be-cb821cc1b33e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 15:02:25.172253 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.172195 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgf8f\" (UniqueName: \"kubernetes.io/projected/86824264-d16e-4d82-854b-f1f5bc86483c-kube-api-access-qgf8f\") pod \"node-ca-flrxv\" (UID: \"86824264-d16e-4d82-854b-f1f5bc86483c\") " pod="openshift-image-registry/node-ca-flrxv" Apr 20 15:02:25.172856 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.172824 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk96v\" (UniqueName: \"kubernetes.io/projected/68cb8276-452d-4742-adc2-9eb67152cc05-kube-api-access-gk96v\") pod \"multus-additional-cni-plugins-5mqkf\" (UID: \"68cb8276-452d-4742-adc2-9eb67152cc05\") " pod="openshift-multus/multus-additional-cni-plugins-5mqkf" Apr 20 15:02:25.173064 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.173042 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpdt5\" (UniqueName: \"kubernetes.io/projected/d5149160-8e4b-48af-aaf9-f76ba9a5abfb-kube-api-access-mpdt5\") pod \"iptables-alerter-lvgwg\" (UID: \"d5149160-8e4b-48af-aaf9-f76ba9a5abfb\") " pod="openshift-network-operator/iptables-alerter-lvgwg" Apr 20 15:02:25.173490 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.173465 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgb2k\" (UniqueName: 
\"kubernetes.io/projected/f9489b29-6fd7-4b56-abbc-9557d736b76c-kube-api-access-xgb2k\") pod \"aws-ebs-csi-driver-node-v9pwx\" (UID: \"f9489b29-6fd7-4b56-abbc-9557d736b76c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9pwx" Apr 20 15:02:25.174098 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.174081 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb7qp\" (UniqueName: \"kubernetes.io/projected/f50b30d6-0d2f-4686-a3f8-128ef485713a-kube-api-access-kb7qp\") pod \"tuned-wcr5b\" (UID: \"f50b30d6-0d2f-4686-a3f8-128ef485713a\") " pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.263352 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263224 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-host-run-netns\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.263352 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263285 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6318e638-1067-4e23-94f4-dad4de00297a-multus-daemon-config\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.263352 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263311 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-multus-socket-dir-parent\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.263352 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263331 2575 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-run-systemd\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.263352 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263346 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-host-run-ovn-kubernetes\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.263694 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263352 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-host-run-netns\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.263694 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263407 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-host-run-netns\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.263694 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263407 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-run-systemd\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.263694 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263433 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-multus-socket-dir-parent\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.263694 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263442 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-host-run-netns\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.263694 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263502 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-host-run-ovn-kubernetes\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.263694 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263535 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-run-ovn\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.263694 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263581 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5xpcp\" (UniqueName: \"kubernetes.io/projected/6318e638-1067-4e23-94f4-dad4de00297a-kube-api-access-5xpcp\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.263694 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263609 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-etc-openvswitch\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.263694 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263620 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-run-ovn\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.263694 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263635 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-host-cni-netd\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.263694 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263661 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lpgcg\" (UniqueName: \"kubernetes.io/projected/6d67f43f-b926-4199-ae7a-ccf686190d9b-kube-api-access-lpgcg\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.264197 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263704 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-host-var-lib-cni-multus\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.264197 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263728 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-log-socket\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.264197 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263752 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6d67f43f-b926-4199-ae7a-ccf686190d9b-ovn-node-metrics-cert\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.264197 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263749 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-host-cni-netd\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.264197 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263780 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-host-var-lib-cni-bin\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.264197 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263811 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-host-var-lib-cni-bin\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.264197 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263817 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-multus-cni-dir\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.264197 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263784 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-etc-openvswitch\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.264197 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263824 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-log-socket\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.264197 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263841 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-os-release\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.264197 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263864 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-host-var-lib-cni-multus\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.264197 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263868 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-host-var-lib-kubelet\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.264197 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263894 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-host-var-lib-kubelet\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.264197 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263895 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-hostroot\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.264197 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263926 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-hostroot\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.264197 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263933 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-host-run-k8s-cni-cncf-io\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.264197 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263948 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-multus-cni-dir\") pod 
\"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.264197 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263963 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-systemd-units\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.264993 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263951 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6318e638-1067-4e23-94f4-dad4de00297a-multus-daemon-config\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.264993 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263990 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-host-slash\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.264993 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263995 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-host-run-k8s-cni-cncf-io\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.264993 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263992 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-os-release\") pod \"multus-z9bz2\" (UID: 
\"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.264993 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.263997 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-systemd-units\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.264993 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.264015 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-node-log\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.264993 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.264053 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-node-log\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.264993 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.264058 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6d67f43f-b926-4199-ae7a-ccf686190d9b-ovnkube-config\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.264993 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.264030 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-host-slash\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.264993 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.264083 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6d67f43f-b926-4199-ae7a-ccf686190d9b-env-overrides\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.264993 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.264109 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/86bce86e-ed68-41f9-be80-acf37bfb646f-agent-certs\") pod \"konnectivity-agent-t94xd\" (UID: \"86bce86e-ed68-41f9-be80-acf37bfb646f\") " pod="kube-system/konnectivity-agent-t94xd" Apr 20 15:02:25.264993 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.264134 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-etc-kubernetes\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.264993 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.264167 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-etc-kubernetes\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.264993 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.264185 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-host-kubelet\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.264993 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.264206 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-run-openvswitch\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.264993 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.264231 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.264993 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.264256 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-system-cni-dir\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.264993 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.264259 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-host-kubelet\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.265748 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.264312 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-run-openvswitch\") pod \"ovnkube-node-sdgqx\" (UID: 
\"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.265748 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.264328 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-system-cni-dir\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.265748 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.264349 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.265748 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.264355 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6318e638-1067-4e23-94f4-dad4de00297a-cni-binary-copy\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.265748 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.264382 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/86bce86e-ed68-41f9-be80-acf37bfb646f-konnectivity-ca\") pod \"konnectivity-agent-t94xd\" (UID: \"86bce86e-ed68-41f9-be80-acf37bfb646f\") " pod="kube-system/konnectivity-agent-t94xd" Apr 20 15:02:25.265748 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.264409 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-host-run-multus-certs\") pod 
\"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.265748 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.264435 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-multus-conf-dir\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.265748 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.264475 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-multus-conf-dir\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.265748 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.264478 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-var-lib-openvswitch\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.265748 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.264512 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-host-run-multus-certs\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.265748 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.264517 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-host-cni-bin\") pod \"ovnkube-node-sdgqx\" (UID: 
\"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.265748 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.264543 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-cnibin\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.265748 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.264551 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-var-lib-openvswitch\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.265748 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.264567 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6d67f43f-b926-4199-ae7a-ccf686190d9b-ovnkube-script-lib\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.265748 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.264587 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6d67f43f-b926-4199-ae7a-ccf686190d9b-host-cni-bin\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.265748 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.264625 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6d67f43f-b926-4199-ae7a-ccf686190d9b-env-overrides\") pod \"ovnkube-node-sdgqx\" (UID: 
\"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.265748 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.264633 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6318e638-1067-4e23-94f4-dad4de00297a-cnibin\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.265748 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.264655 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6d67f43f-b926-4199-ae7a-ccf686190d9b-ovnkube-config\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.266603 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.264886 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6318e638-1067-4e23-94f4-dad4de00297a-cni-binary-copy\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.266603 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.264911 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/86bce86e-ed68-41f9-be80-acf37bfb646f-konnectivity-ca\") pod \"konnectivity-agent-t94xd\" (UID: \"86bce86e-ed68-41f9-be80-acf37bfb646f\") " pod="kube-system/konnectivity-agent-t94xd" Apr 20 15:02:25.266603 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.265090 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6d67f43f-b926-4199-ae7a-ccf686190d9b-ovnkube-script-lib\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.266988 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.266965 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6d67f43f-b926-4199-ae7a-ccf686190d9b-ovn-node-metrics-cert\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.267101 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.266990 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/86bce86e-ed68-41f9-be80-acf37bfb646f-agent-certs\") pod \"konnectivity-agent-t94xd\" (UID: \"86bce86e-ed68-41f9-be80-acf37bfb646f\") " pod="kube-system/konnectivity-agent-t94xd" Apr 20 15:02:25.271695 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.271675 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xpcp\" (UniqueName: \"kubernetes.io/projected/6318e638-1067-4e23-94f4-dad4de00297a-kube-api-access-5xpcp\") pod \"multus-z9bz2\" (UID: \"6318e638-1067-4e23-94f4-dad4de00297a\") " pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.272051 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.272033 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpgcg\" (UniqueName: \"kubernetes.io/projected/6d67f43f-b926-4199-ae7a-ccf686190d9b-kube-api-access-lpgcg\") pod \"ovnkube-node-sdgqx\" (UID: \"6d67f43f-b926-4199-ae7a-ccf686190d9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.358129 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.358095 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-lvgwg" Apr 20 15:02:25.365726 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.365699 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9pwx" Apr 20 15:02:25.375342 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.375315 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" Apr 20 15:02:25.381952 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.381923 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-flrxv" Apr 20 15:02:25.388575 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.388548 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5mqkf" Apr 20 15:02:25.397284 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.397254 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-z9bz2" Apr 20 15:02:25.404055 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.404025 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:25.411701 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.411678 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-t94xd" Apr 20 15:02:25.667869 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.667788 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee584f46-b9aa-46b2-a060-01c6f4e256e9-metrics-certs\") pod \"network-metrics-daemon-lnqrj\" (UID: \"ee584f46-b9aa-46b2-a060-01c6f4e256e9\") " pod="openshift-multus/network-metrics-daemon-lnqrj" Apr 20 15:02:25.668036 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:25.667929 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 15:02:25.668036 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:25.667994 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee584f46-b9aa-46b2-a060-01c6f4e256e9-metrics-certs podName:ee584f46-b9aa-46b2-a060-01c6f4e256e9 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:26.667976719 +0000 UTC m=+4.040563416 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee584f46-b9aa-46b2-a060-01c6f4e256e9-metrics-certs") pod "network-metrics-daemon-lnqrj" (UID: "ee584f46-b9aa-46b2-a060-01c6f4e256e9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 15:02:25.768560 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:25.768530 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhkgn\" (UniqueName: \"kubernetes.io/projected/5e85baf4-757f-4b92-b2be-cb821cc1b33e-kube-api-access-vhkgn\") pod \"network-check-target-fbms8\" (UID: \"5e85baf4-757f-4b92-b2be-cb821cc1b33e\") " pod="openshift-network-diagnostics/network-check-target-fbms8" Apr 20 15:02:25.768744 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:25.768698 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 15:02:25.768744 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:25.768722 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 15:02:25.768744 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:25.768734 2575 projected.go:194] Error preparing data for projected volume kube-api-access-vhkgn for pod openshift-network-diagnostics/network-check-target-fbms8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 15:02:25.768898 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:25.768799 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5e85baf4-757f-4b92-b2be-cb821cc1b33e-kube-api-access-vhkgn podName:5e85baf4-757f-4b92-b2be-cb821cc1b33e nodeName:}" failed. 
No retries permitted until 2026-04-20 15:02:26.768778687 +0000 UTC m=+4.141365370 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-vhkgn" (UniqueName: "kubernetes.io/projected/5e85baf4-757f-4b92-b2be-cb821cc1b33e-kube-api-access-vhkgn") pod "network-check-target-fbms8" (UID: "5e85baf4-757f-4b92-b2be-cb821cc1b33e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 15:02:25.896183 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:25.896149 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86bce86e_ed68_41f9_be80_acf37bfb646f.slice/crio-c7735192e3cecb2d6ca09e828a6d07197be067074b4731eee2eb963f0f728bee WatchSource:0}: Error finding container c7735192e3cecb2d6ca09e828a6d07197be067074b4731eee2eb963f0f728bee: Status 404 returned error can't find the container with id c7735192e3cecb2d6ca09e828a6d07197be067074b4731eee2eb963f0f728bee Apr 20 15:02:25.897215 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:25.897187 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86824264_d16e_4d82_854b_f1f5bc86483c.slice/crio-3be7e47fb90b365640329df6ba63b5f2f6ad91155fe6daf9617b1aba554818ff WatchSource:0}: Error finding container 3be7e47fb90b365640329df6ba63b5f2f6ad91155fe6daf9617b1aba554818ff: Status 404 returned error can't find the container with id 3be7e47fb90b365640329df6ba63b5f2f6ad91155fe6daf9617b1aba554818ff Apr 20 15:02:25.898577 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:25.898551 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6318e638_1067_4e23_94f4_dad4de00297a.slice/crio-4f93ff6181db2096e3f5b0c5c62bee25708a7749572b9381e508f123404df7df WatchSource:0}: Error finding container 
4f93ff6181db2096e3f5b0c5c62bee25708a7749572b9381e508f123404df7df: Status 404 returned error can't find the container with id 4f93ff6181db2096e3f5b0c5c62bee25708a7749572b9381e508f123404df7df Apr 20 15:02:25.899936 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:25.899901 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9489b29_6fd7_4b56_abbc_9557d736b76c.slice/crio-50f258718b015b07b8bf8d719dc2d9dc317a62775fed8fdf46fe50585e1b094a WatchSource:0}: Error finding container 50f258718b015b07b8bf8d719dc2d9dc317a62775fed8fdf46fe50585e1b094a: Status 404 returned error can't find the container with id 50f258718b015b07b8bf8d719dc2d9dc317a62775fed8fdf46fe50585e1b094a Apr 20 15:02:25.901943 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:25.901915 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf50b30d6_0d2f_4686_a3f8_128ef485713a.slice/crio-676107f9ccafd34fabe65ee721aee68068387e82829576cfdefb700f5909660c WatchSource:0}: Error finding container 676107f9ccafd34fabe65ee721aee68068387e82829576cfdefb700f5909660c: Status 404 returned error can't find the container with id 676107f9ccafd34fabe65ee721aee68068387e82829576cfdefb700f5909660c Apr 20 15:02:25.903263 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:25.903236 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5149160_8e4b_48af_aaf9_f76ba9a5abfb.slice/crio-a3e7ffb66b66c9c5c2bff2642f9a300fce151c8da5a1eeafc2f3f766da566742 WatchSource:0}: Error finding container a3e7ffb66b66c9c5c2bff2642f9a300fce151c8da5a1eeafc2f3f766da566742: Status 404 returned error can't find the container with id a3e7ffb66b66c9c5c2bff2642f9a300fce151c8da5a1eeafc2f3f766da566742 Apr 20 15:02:25.904481 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:25.904358 2575 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68cb8276_452d_4742_adc2_9eb67152cc05.slice/crio-510e0f69679c214ab6bcc1a7622dfa24b82cbbe71faba8a98d2d99bcd708611b WatchSource:0}: Error finding container 510e0f69679c214ab6bcc1a7622dfa24b82cbbe71faba8a98d2d99bcd708611b: Status 404 returned error can't find the container with id 510e0f69679c214ab6bcc1a7622dfa24b82cbbe71faba8a98d2d99bcd708611b Apr 20 15:02:25.905089 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:25.904995 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d67f43f_b926_4199_ae7a_ccf686190d9b.slice/crio-18299e79c829c55f59b8884444837f35ff87c3bd12f59c7b59fbb3fb111cf58a WatchSource:0}: Error finding container 18299e79c829c55f59b8884444837f35ff87c3bd12f59c7b59fbb3fb111cf58a: Status 404 returned error can't find the container with id 18299e79c829c55f59b8884444837f35ff87c3bd12f59c7b59fbb3fb111cf58a Apr 20 15:02:26.094923 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:26.094874 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 14:57:24 +0000 UTC" deadline="2027-10-20 23:38:51.48080323 +0000 UTC" Apr 20 15:02:26.094923 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:26.094915 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13160h36m25.385891803s" Apr 20 15:02:26.177806 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:26.177720 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnqrj" Apr 20 15:02:26.177945 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:26.177830 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lnqrj" podUID="ee584f46-b9aa-46b2-a060-01c6f4e256e9" Apr 20 15:02:26.183935 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:26.183904 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-lvgwg" event={"ID":"d5149160-8e4b-48af-aaf9-f76ba9a5abfb","Type":"ContainerStarted","Data":"a3e7ffb66b66c9c5c2bff2642f9a300fce151c8da5a1eeafc2f3f766da566742"} Apr 20 15:02:26.184809 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:26.184780 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5mqkf" event={"ID":"68cb8276-452d-4742-adc2-9eb67152cc05","Type":"ContainerStarted","Data":"510e0f69679c214ab6bcc1a7622dfa24b82cbbe71faba8a98d2d99bcd708611b"} Apr 20 15:02:26.187030 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:26.187000 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-198.ec2.internal" event={"ID":"4cf8424ac7d796a78f96e62791daed1d","Type":"ContainerStarted","Data":"bb0ab3d9c4edc1e68d628a41810054b5a86748c13e36656e5c3c8e53edd7e442"} Apr 20 15:02:26.188013 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:26.187989 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" event={"ID":"6d67f43f-b926-4199-ae7a-ccf686190d9b","Type":"ContainerStarted","Data":"18299e79c829c55f59b8884444837f35ff87c3bd12f59c7b59fbb3fb111cf58a"} Apr 20 15:02:26.189058 ip-10-0-133-198 
kubenswrapper[2575]: I0420 15:02:26.189039 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" event={"ID":"f50b30d6-0d2f-4686-a3f8-128ef485713a","Type":"ContainerStarted","Data":"676107f9ccafd34fabe65ee721aee68068387e82829576cfdefb700f5909660c"} Apr 20 15:02:26.189908 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:26.189882 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9pwx" event={"ID":"f9489b29-6fd7-4b56-abbc-9557d736b76c","Type":"ContainerStarted","Data":"50f258718b015b07b8bf8d719dc2d9dc317a62775fed8fdf46fe50585e1b094a"} Apr 20 15:02:26.190783 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:26.190763 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-z9bz2" event={"ID":"6318e638-1067-4e23-94f4-dad4de00297a","Type":"ContainerStarted","Data":"4f93ff6181db2096e3f5b0c5c62bee25708a7749572b9381e508f123404df7df"} Apr 20 15:02:26.191728 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:26.191711 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-flrxv" event={"ID":"86824264-d16e-4d82-854b-f1f5bc86483c","Type":"ContainerStarted","Data":"3be7e47fb90b365640329df6ba63b5f2f6ad91155fe6daf9617b1aba554818ff"} Apr 20 15:02:26.192509 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:26.192490 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-t94xd" event={"ID":"86bce86e-ed68-41f9-be80-acf37bfb646f","Type":"ContainerStarted","Data":"c7735192e3cecb2d6ca09e828a6d07197be067074b4731eee2eb963f0f728bee"} Apr 20 15:02:26.199097 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:26.199053 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-198.ec2.internal" podStartSLOduration=2.199041334 podStartE2EDuration="2.199041334s" podCreationTimestamp="2026-04-20 15:02:24 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:02:26.198691426 +0000 UTC m=+3.571278143" watchObservedRunningTime="2026-04-20 15:02:26.199041334 +0000 UTC m=+3.571628119"
Apr 20 15:02:26.676357 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:26.676322 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee584f46-b9aa-46b2-a060-01c6f4e256e9-metrics-certs\") pod \"network-metrics-daemon-lnqrj\" (UID: \"ee584f46-b9aa-46b2-a060-01c6f4e256e9\") " pod="openshift-multus/network-metrics-daemon-lnqrj"
Apr 20 15:02:26.676529 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:26.676501 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 15:02:26.676597 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:26.676568 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee584f46-b9aa-46b2-a060-01c6f4e256e9-metrics-certs podName:ee584f46-b9aa-46b2-a060-01c6f4e256e9 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:28.67654996 +0000 UTC m=+6.049136651 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee584f46-b9aa-46b2-a060-01c6f4e256e9-metrics-certs") pod "network-metrics-daemon-lnqrj" (UID: "ee584f46-b9aa-46b2-a060-01c6f4e256e9") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 15:02:26.776775 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:26.776745 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhkgn\" (UniqueName: \"kubernetes.io/projected/5e85baf4-757f-4b92-b2be-cb821cc1b33e-kube-api-access-vhkgn\") pod \"network-check-target-fbms8\" (UID: \"5e85baf4-757f-4b92-b2be-cb821cc1b33e\") " pod="openshift-network-diagnostics/network-check-target-fbms8"
Apr 20 15:02:26.776950 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:26.776935 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 15:02:26.777023 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:26.776956 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 15:02:26.777023 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:26.776970 2575 projected.go:194] Error preparing data for projected volume kube-api-access-vhkgn for pod openshift-network-diagnostics/network-check-target-fbms8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 15:02:26.777132 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:26.777029 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5e85baf4-757f-4b92-b2be-cb821cc1b33e-kube-api-access-vhkgn podName:5e85baf4-757f-4b92-b2be-cb821cc1b33e nodeName:}" failed. No retries permitted until 2026-04-20 15:02:28.777010413 +0000 UTC m=+6.149597100 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-vhkgn" (UniqueName: "kubernetes.io/projected/5e85baf4-757f-4b92-b2be-cb821cc1b33e-kube-api-access-vhkgn") pod "network-check-target-fbms8" (UID: "5e85baf4-757f-4b92-b2be-cb821cc1b33e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 15:02:27.180508 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:27.180475 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fbms8"
Apr 20 15:02:27.180965 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:27.180609 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fbms8" podUID="5e85baf4-757f-4b92-b2be-cb821cc1b33e"
Apr 20 15:02:27.203641 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:27.203601 2575 generic.go:358] "Generic (PLEG): container finished" podID="9fc3cee085866514e6ec8498fa3dbfcb" containerID="800a176356af50d0ff3e5f6fa640bf8eb02f7be03419df963cc33f1d661ddd52" exitCode=0
Apr 20 15:02:27.203830 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:27.203772 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal" event={"ID":"9fc3cee085866514e6ec8498fa3dbfcb","Type":"ContainerDied","Data":"800a176356af50d0ff3e5f6fa640bf8eb02f7be03419df963cc33f1d661ddd52"}
Apr 20 15:02:28.177885 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:28.177839 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnqrj"
Apr 20 15:02:28.178097 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:28.177989 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lnqrj" podUID="ee584f46-b9aa-46b2-a060-01c6f4e256e9"
Apr 20 15:02:28.215910 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:28.215844 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal" event={"ID":"9fc3cee085866514e6ec8498fa3dbfcb","Type":"ContainerStarted","Data":"212ca54fae45796fc691fa0b44418096ada8c4cf662d25542c2057b6965a6dfd"}
Apr 20 15:02:28.229822 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:28.229614 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal" podStartSLOduration=4.229594314 podStartE2EDuration="4.229594314s" podCreationTimestamp="2026-04-20 15:02:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:02:28.22855825 +0000 UTC m=+5.601144955" watchObservedRunningTime="2026-04-20 15:02:28.229594314 +0000 UTC m=+5.602181018"
Apr 20 15:02:28.693550 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:28.693516 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee584f46-b9aa-46b2-a060-01c6f4e256e9-metrics-certs\") pod \"network-metrics-daemon-lnqrj\" (UID: \"ee584f46-b9aa-46b2-a060-01c6f4e256e9\") " pod="openshift-multus/network-metrics-daemon-lnqrj"
Apr 20 15:02:28.693751 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:28.693723 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 15:02:28.693812 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:28.693784 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee584f46-b9aa-46b2-a060-01c6f4e256e9-metrics-certs podName:ee584f46-b9aa-46b2-a060-01c6f4e256e9 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:32.693764954 +0000 UTC m=+10.066351639 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee584f46-b9aa-46b2-a060-01c6f4e256e9-metrics-certs") pod "network-metrics-daemon-lnqrj" (UID: "ee584f46-b9aa-46b2-a060-01c6f4e256e9") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 15:02:28.794757 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:28.794138 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhkgn\" (UniqueName: \"kubernetes.io/projected/5e85baf4-757f-4b92-b2be-cb821cc1b33e-kube-api-access-vhkgn\") pod \"network-check-target-fbms8\" (UID: \"5e85baf4-757f-4b92-b2be-cb821cc1b33e\") " pod="openshift-network-diagnostics/network-check-target-fbms8"
Apr 20 15:02:28.794757 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:28.794367 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 15:02:28.794757 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:28.794389 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 15:02:28.794757 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:28.794402 2575 projected.go:194] Error preparing data for projected volume kube-api-access-vhkgn for pod openshift-network-diagnostics/network-check-target-fbms8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 15:02:28.795109 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:28.794807 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5e85baf4-757f-4b92-b2be-cb821cc1b33e-kube-api-access-vhkgn podName:5e85baf4-757f-4b92-b2be-cb821cc1b33e nodeName:}" failed. No retries permitted until 2026-04-20 15:02:32.79478027 +0000 UTC m=+10.167366956 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-vhkgn" (UniqueName: "kubernetes.io/projected/5e85baf4-757f-4b92-b2be-cb821cc1b33e-kube-api-access-vhkgn") pod "network-check-target-fbms8" (UID: "5e85baf4-757f-4b92-b2be-cb821cc1b33e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 15:02:29.178129 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:29.178053 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fbms8"
Apr 20 15:02:29.178286 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:29.178178 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fbms8" podUID="5e85baf4-757f-4b92-b2be-cb821cc1b33e"
Apr 20 15:02:30.178471 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:30.178434 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnqrj"
Apr 20 15:02:30.178953 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:30.178579 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lnqrj" podUID="ee584f46-b9aa-46b2-a060-01c6f4e256e9"
Apr 20 15:02:30.369974 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:30.369222 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-2nhxg"]
Apr 20 15:02:30.371755 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:30.371729 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2nhxg"
Apr 20 15:02:30.371870 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:30.371806 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2nhxg" podUID="c7922475-7fe5-4f88-a1fd-a1bd0359f7c6"
Apr 20 15:02:30.406214 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:30.406044 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c7922475-7fe5-4f88-a1fd-a1bd0359f7c6-dbus\") pod \"global-pull-secret-syncer-2nhxg\" (UID: \"c7922475-7fe5-4f88-a1fd-a1bd0359f7c6\") " pod="kube-system/global-pull-secret-syncer-2nhxg"
Apr 20 15:02:30.406214 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:30.406094 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c7922475-7fe5-4f88-a1fd-a1bd0359f7c6-kubelet-config\") pod \"global-pull-secret-syncer-2nhxg\" (UID: \"c7922475-7fe5-4f88-a1fd-a1bd0359f7c6\") " pod="kube-system/global-pull-secret-syncer-2nhxg"
Apr 20 15:02:30.406214 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:30.406129 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c7922475-7fe5-4f88-a1fd-a1bd0359f7c6-original-pull-secret\") pod \"global-pull-secret-syncer-2nhxg\" (UID: \"c7922475-7fe5-4f88-a1fd-a1bd0359f7c6\") " pod="kube-system/global-pull-secret-syncer-2nhxg"
Apr 20 15:02:30.507886 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:30.507339 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c7922475-7fe5-4f88-a1fd-a1bd0359f7c6-dbus\") pod \"global-pull-secret-syncer-2nhxg\" (UID: \"c7922475-7fe5-4f88-a1fd-a1bd0359f7c6\") " pod="kube-system/global-pull-secret-syncer-2nhxg"
Apr 20 15:02:30.507886 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:30.507393 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c7922475-7fe5-4f88-a1fd-a1bd0359f7c6-kubelet-config\") pod \"global-pull-secret-syncer-2nhxg\" (UID: \"c7922475-7fe5-4f88-a1fd-a1bd0359f7c6\") " pod="kube-system/global-pull-secret-syncer-2nhxg"
Apr 20 15:02:30.507886 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:30.507421 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c7922475-7fe5-4f88-a1fd-a1bd0359f7c6-original-pull-secret\") pod \"global-pull-secret-syncer-2nhxg\" (UID: \"c7922475-7fe5-4f88-a1fd-a1bd0359f7c6\") " pod="kube-system/global-pull-secret-syncer-2nhxg"
Apr 20 15:02:30.507886 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:30.507516 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c7922475-7fe5-4f88-a1fd-a1bd0359f7c6-dbus\") pod \"global-pull-secret-syncer-2nhxg\" (UID: \"c7922475-7fe5-4f88-a1fd-a1bd0359f7c6\") " pod="kube-system/global-pull-secret-syncer-2nhxg"
Apr 20 15:02:30.507886 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:30.507571 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 15:02:30.507886 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:30.507588 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c7922475-7fe5-4f88-a1fd-a1bd0359f7c6-kubelet-config\") pod \"global-pull-secret-syncer-2nhxg\" (UID: \"c7922475-7fe5-4f88-a1fd-a1bd0359f7c6\") " pod="kube-system/global-pull-secret-syncer-2nhxg"
Apr 20 15:02:30.507886 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:30.507632 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7922475-7fe5-4f88-a1fd-a1bd0359f7c6-original-pull-secret podName:c7922475-7fe5-4f88-a1fd-a1bd0359f7c6 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:31.007612729 +0000 UTC m=+8.380199432 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c7922475-7fe5-4f88-a1fd-a1bd0359f7c6-original-pull-secret") pod "global-pull-secret-syncer-2nhxg" (UID: "c7922475-7fe5-4f88-a1fd-a1bd0359f7c6") : object "kube-system"/"original-pull-secret" not registered
Apr 20 15:02:31.011994 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:31.011907 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c7922475-7fe5-4f88-a1fd-a1bd0359f7c6-original-pull-secret\") pod \"global-pull-secret-syncer-2nhxg\" (UID: \"c7922475-7fe5-4f88-a1fd-a1bd0359f7c6\") " pod="kube-system/global-pull-secret-syncer-2nhxg"
Apr 20 15:02:31.012183 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:31.012048 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 15:02:31.012183 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:31.012115 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7922475-7fe5-4f88-a1fd-a1bd0359f7c6-original-pull-secret podName:c7922475-7fe5-4f88-a1fd-a1bd0359f7c6 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:32.012097614 +0000 UTC m=+9.384684298 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c7922475-7fe5-4f88-a1fd-a1bd0359f7c6-original-pull-secret") pod "global-pull-secret-syncer-2nhxg" (UID: "c7922475-7fe5-4f88-a1fd-a1bd0359f7c6") : object "kube-system"/"original-pull-secret" not registered
Apr 20 15:02:31.178049 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:31.177573 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fbms8"
Apr 20 15:02:31.178049 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:31.177702 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fbms8" podUID="5e85baf4-757f-4b92-b2be-cb821cc1b33e"
Apr 20 15:02:32.022065 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:32.021524 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c7922475-7fe5-4f88-a1fd-a1bd0359f7c6-original-pull-secret\") pod \"global-pull-secret-syncer-2nhxg\" (UID: \"c7922475-7fe5-4f88-a1fd-a1bd0359f7c6\") " pod="kube-system/global-pull-secret-syncer-2nhxg"
Apr 20 15:02:32.022065 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:32.021669 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 15:02:32.022065 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:32.021733 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7922475-7fe5-4f88-a1fd-a1bd0359f7c6-original-pull-secret podName:c7922475-7fe5-4f88-a1fd-a1bd0359f7c6 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:34.021714793 +0000 UTC m=+11.394301479 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c7922475-7fe5-4f88-a1fd-a1bd0359f7c6-original-pull-secret") pod "global-pull-secret-syncer-2nhxg" (UID: "c7922475-7fe5-4f88-a1fd-a1bd0359f7c6") : object "kube-system"/"original-pull-secret" not registered
Apr 20 15:02:32.179031 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:32.178519 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnqrj"
Apr 20 15:02:32.179031 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:32.178662 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lnqrj" podUID="ee584f46-b9aa-46b2-a060-01c6f4e256e9"
Apr 20 15:02:32.179031 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:32.178735 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2nhxg"
Apr 20 15:02:32.179031 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:32.178850 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2nhxg" podUID="c7922475-7fe5-4f88-a1fd-a1bd0359f7c6"
Apr 20 15:02:32.727332 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:32.726882 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee584f46-b9aa-46b2-a060-01c6f4e256e9-metrics-certs\") pod \"network-metrics-daemon-lnqrj\" (UID: \"ee584f46-b9aa-46b2-a060-01c6f4e256e9\") " pod="openshift-multus/network-metrics-daemon-lnqrj"
Apr 20 15:02:32.727332 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:32.727072 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 15:02:32.727332 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:32.727142 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee584f46-b9aa-46b2-a060-01c6f4e256e9-metrics-certs podName:ee584f46-b9aa-46b2-a060-01c6f4e256e9 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:40.72712308 +0000 UTC m=+18.099709776 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee584f46-b9aa-46b2-a060-01c6f4e256e9-metrics-certs") pod "network-metrics-daemon-lnqrj" (UID: "ee584f46-b9aa-46b2-a060-01c6f4e256e9") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 15:02:32.827944 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:32.827894 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhkgn\" (UniqueName: \"kubernetes.io/projected/5e85baf4-757f-4b92-b2be-cb821cc1b33e-kube-api-access-vhkgn\") pod \"network-check-target-fbms8\" (UID: \"5e85baf4-757f-4b92-b2be-cb821cc1b33e\") " pod="openshift-network-diagnostics/network-check-target-fbms8"
Apr 20 15:02:32.828135 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:32.828073 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 15:02:32.828135 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:32.828094 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 15:02:32.828135 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:32.828106 2575 projected.go:194] Error preparing data for projected volume kube-api-access-vhkgn for pod openshift-network-diagnostics/network-check-target-fbms8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 15:02:32.828303 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:32.828161 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5e85baf4-757f-4b92-b2be-cb821cc1b33e-kube-api-access-vhkgn podName:5e85baf4-757f-4b92-b2be-cb821cc1b33e nodeName:}" failed. No retries permitted until 2026-04-20 15:02:40.828142615 +0000 UTC m=+18.200729520 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-vhkgn" (UniqueName: "kubernetes.io/projected/5e85baf4-757f-4b92-b2be-cb821cc1b33e-kube-api-access-vhkgn") pod "network-check-target-fbms8" (UID: "5e85baf4-757f-4b92-b2be-cb821cc1b33e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 15:02:33.179293 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:33.178987 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fbms8"
Apr 20 15:02:33.179693 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:33.179337 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fbms8" podUID="5e85baf4-757f-4b92-b2be-cb821cc1b33e"
Apr 20 15:02:34.037191 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:34.037150 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c7922475-7fe5-4f88-a1fd-a1bd0359f7c6-original-pull-secret\") pod \"global-pull-secret-syncer-2nhxg\" (UID: \"c7922475-7fe5-4f88-a1fd-a1bd0359f7c6\") " pod="kube-system/global-pull-secret-syncer-2nhxg"
Apr 20 15:02:34.037413 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:34.037338 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 15:02:34.037475 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:34.037421 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7922475-7fe5-4f88-a1fd-a1bd0359f7c6-original-pull-secret podName:c7922475-7fe5-4f88-a1fd-a1bd0359f7c6 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:38.037397532 +0000 UTC m=+15.409984236 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c7922475-7fe5-4f88-a1fd-a1bd0359f7c6-original-pull-secret") pod "global-pull-secret-syncer-2nhxg" (UID: "c7922475-7fe5-4f88-a1fd-a1bd0359f7c6") : object "kube-system"/"original-pull-secret" not registered
Apr 20 15:02:34.177800 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:34.177761 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnqrj"
Apr 20 15:02:34.177960 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:34.177759 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2nhxg"
Apr 20 15:02:34.177960 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:34.177934 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lnqrj" podUID="ee584f46-b9aa-46b2-a060-01c6f4e256e9"
Apr 20 15:02:34.178080 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:34.178064 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2nhxg" podUID="c7922475-7fe5-4f88-a1fd-a1bd0359f7c6"
Apr 20 15:02:35.178194 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:35.178109 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fbms8"
Apr 20 15:02:35.178650 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:35.178234 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fbms8" podUID="5e85baf4-757f-4b92-b2be-cb821cc1b33e"
Apr 20 15:02:36.178059 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:36.178018 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2nhxg"
Apr 20 15:02:36.178228 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:36.178150 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2nhxg" podUID="c7922475-7fe5-4f88-a1fd-a1bd0359f7c6"
Apr 20 15:02:36.178228 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:36.178184 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnqrj"
Apr 20 15:02:36.178621 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:36.178289 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lnqrj" podUID="ee584f46-b9aa-46b2-a060-01c6f4e256e9"
Apr 20 15:02:37.178220 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:37.178186 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fbms8"
Apr 20 15:02:37.178678 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:37.178328 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fbms8" podUID="5e85baf4-757f-4b92-b2be-cb821cc1b33e"
Apr 20 15:02:38.066594 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:38.066556 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c7922475-7fe5-4f88-a1fd-a1bd0359f7c6-original-pull-secret\") pod \"global-pull-secret-syncer-2nhxg\" (UID: \"c7922475-7fe5-4f88-a1fd-a1bd0359f7c6\") " pod="kube-system/global-pull-secret-syncer-2nhxg"
Apr 20 15:02:38.066769 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:38.066725 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 15:02:38.066818 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:38.066793 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7922475-7fe5-4f88-a1fd-a1bd0359f7c6-original-pull-secret podName:c7922475-7fe5-4f88-a1fd-a1bd0359f7c6 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:46.066777459 +0000 UTC m=+23.439364141 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c7922475-7fe5-4f88-a1fd-a1bd0359f7c6-original-pull-secret") pod "global-pull-secret-syncer-2nhxg" (UID: "c7922475-7fe5-4f88-a1fd-a1bd0359f7c6") : object "kube-system"/"original-pull-secret" not registered
Apr 20 15:02:38.177746 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:38.177708 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2nhxg"
Apr 20 15:02:38.177910 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:38.177801 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnqrj"
Apr 20 15:02:38.177953 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:38.177926 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2nhxg" podUID="c7922475-7fe5-4f88-a1fd-a1bd0359f7c6"
Apr 20 15:02:38.178079 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:38.178057 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lnqrj" podUID="ee584f46-b9aa-46b2-a060-01c6f4e256e9"
Apr 20 15:02:39.178428 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:39.178392 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fbms8"
Apr 20 15:02:39.178841 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:39.178532 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fbms8" podUID="5e85baf4-757f-4b92-b2be-cb821cc1b33e"
Apr 20 15:02:40.177795 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:40.177764 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnqrj"
Apr 20 15:02:40.178096 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:40.177779 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2nhxg"
Apr 20 15:02:40.178096 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:40.177901 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lnqrj" podUID="ee584f46-b9aa-46b2-a060-01c6f4e256e9"
Apr 20 15:02:40.178096 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:40.177976 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2nhxg" podUID="c7922475-7fe5-4f88-a1fd-a1bd0359f7c6"
Apr 20 15:02:40.784363 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:40.784319 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee584f46-b9aa-46b2-a060-01c6f4e256e9-metrics-certs\") pod \"network-metrics-daemon-lnqrj\" (UID: \"ee584f46-b9aa-46b2-a060-01c6f4e256e9\") " pod="openshift-multus/network-metrics-daemon-lnqrj"
Apr 20 15:02:40.784769 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:40.784499 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 15:02:40.784769 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:40.784563 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee584f46-b9aa-46b2-a060-01c6f4e256e9-metrics-certs podName:ee584f46-b9aa-46b2-a060-01c6f4e256e9 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:56.784547275 +0000 UTC m=+34.157133957 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee584f46-b9aa-46b2-a060-01c6f4e256e9-metrics-certs") pod "network-metrics-daemon-lnqrj" (UID: "ee584f46-b9aa-46b2-a060-01c6f4e256e9") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 15:02:40.793890 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:40.793855 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-wfw5f"]
Apr 20 15:02:40.882457 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:40.882421 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-wfw5f"
Apr 20 15:02:40.884795 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:40.884763 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhkgn\" (UniqueName: \"kubernetes.io/projected/5e85baf4-757f-4b92-b2be-cb821cc1b33e-kube-api-access-vhkgn\") pod \"network-check-target-fbms8\" (UID: \"5e85baf4-757f-4b92-b2be-cb821cc1b33e\") " pod="openshift-network-diagnostics/network-check-target-fbms8"
Apr 20 15:02:40.884921 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:40.884892 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 15:02:40.884921 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:40.884908 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 15:02:40.884921 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:40.884919 2575 projected.go:194] Error preparing data for projected volume kube-api-access-vhkgn for pod openshift-network-diagnostics/network-check-target-fbms8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 15:02:40.885029 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:40.884970 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5e85baf4-757f-4b92-b2be-cb821cc1b33e-kube-api-access-vhkgn podName:5e85baf4-757f-4b92-b2be-cb821cc1b33e nodeName:}" failed. No retries permitted until 2026-04-20 15:02:56.884957707 +0000 UTC m=+34.257544390 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "kube-api-access-vhkgn" (UniqueName: "kubernetes.io/projected/5e85baf4-757f-4b92-b2be-cb821cc1b33e-kube-api-access-vhkgn") pod "network-check-target-fbms8" (UID: "5e85baf4-757f-4b92-b2be-cb821cc1b33e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 15:02:40.885320 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:40.885301 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 15:02:40.885320 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:40.885317 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-wt7fq\"" Apr 20 15:02:40.885443 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:40.885319 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 15:02:40.985448 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:40.985410 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2e79beec-c5db-41c7-a60e-c759696b1d60-tmp-dir\") pod \"node-resolver-wfw5f\" (UID: \"2e79beec-c5db-41c7-a60e-c759696b1d60\") " pod="openshift-dns/node-resolver-wfw5f" Apr 20 15:02:40.985637 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:40.985512 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2e79beec-c5db-41c7-a60e-c759696b1d60-hosts-file\") pod \"node-resolver-wfw5f\" (UID: \"2e79beec-c5db-41c7-a60e-c759696b1d60\") " pod="openshift-dns/node-resolver-wfw5f" Apr 20 15:02:40.985637 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:40.985551 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-vmr58\" (UniqueName: \"kubernetes.io/projected/2e79beec-c5db-41c7-a60e-c759696b1d60-kube-api-access-vmr58\") pod \"node-resolver-wfw5f\" (UID: \"2e79beec-c5db-41c7-a60e-c759696b1d60\") " pod="openshift-dns/node-resolver-wfw5f" Apr 20 15:02:41.086088 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:41.086044 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2e79beec-c5db-41c7-a60e-c759696b1d60-hosts-file\") pod \"node-resolver-wfw5f\" (UID: \"2e79beec-c5db-41c7-a60e-c759696b1d60\") " pod="openshift-dns/node-resolver-wfw5f" Apr 20 15:02:41.086300 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:41.086106 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vmr58\" (UniqueName: \"kubernetes.io/projected/2e79beec-c5db-41c7-a60e-c759696b1d60-kube-api-access-vmr58\") pod \"node-resolver-wfw5f\" (UID: \"2e79beec-c5db-41c7-a60e-c759696b1d60\") " pod="openshift-dns/node-resolver-wfw5f" Apr 20 15:02:41.086300 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:41.086156 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2e79beec-c5db-41c7-a60e-c759696b1d60-tmp-dir\") pod \"node-resolver-wfw5f\" (UID: \"2e79beec-c5db-41c7-a60e-c759696b1d60\") " pod="openshift-dns/node-resolver-wfw5f" Apr 20 15:02:41.086300 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:41.086194 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2e79beec-c5db-41c7-a60e-c759696b1d60-hosts-file\") pod \"node-resolver-wfw5f\" (UID: \"2e79beec-c5db-41c7-a60e-c759696b1d60\") " pod="openshift-dns/node-resolver-wfw5f" Apr 20 15:02:41.086591 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:41.086568 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/2e79beec-c5db-41c7-a60e-c759696b1d60-tmp-dir\") pod \"node-resolver-wfw5f\" (UID: \"2e79beec-c5db-41c7-a60e-c759696b1d60\") " pod="openshift-dns/node-resolver-wfw5f" Apr 20 15:02:41.098572 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:41.098531 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmr58\" (UniqueName: \"kubernetes.io/projected/2e79beec-c5db-41c7-a60e-c759696b1d60-kube-api-access-vmr58\") pod \"node-resolver-wfw5f\" (UID: \"2e79beec-c5db-41c7-a60e-c759696b1d60\") " pod="openshift-dns/node-resolver-wfw5f" Apr 20 15:02:41.177799 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:41.177766 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fbms8" Apr 20 15:02:41.177971 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:41.177884 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fbms8" podUID="5e85baf4-757f-4b92-b2be-cb821cc1b33e" Apr 20 15:02:41.190962 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:41.190924 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-wfw5f" Apr 20 15:02:42.177578 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:42.177536 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2nhxg" Apr 20 15:02:42.177578 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:42.177556 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnqrj" Apr 20 15:02:42.178160 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:42.177664 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2nhxg" podUID="c7922475-7fe5-4f88-a1fd-a1bd0359f7c6" Apr 20 15:02:42.178160 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:42.177798 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lnqrj" podUID="ee584f46-b9aa-46b2-a060-01c6f4e256e9" Apr 20 15:02:42.603989 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:02:42.603943 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e79beec_c5db_41c7_a60e_c759696b1d60.slice/crio-92bd1406b58607955a1a955d980e898da629eaebc0c7cf18ed62a08e266f0bc6 WatchSource:0}: Error finding container 92bd1406b58607955a1a955d980e898da629eaebc0c7cf18ed62a08e266f0bc6: Status 404 returned error can't find the container with id 92bd1406b58607955a1a955d980e898da629eaebc0c7cf18ed62a08e266f0bc6 Apr 20 15:02:43.179265 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:43.178962 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fbms8" Apr 20 15:02:43.179902 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:43.179357 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fbms8" podUID="5e85baf4-757f-4b92-b2be-cb821cc1b33e" Apr 20 15:02:43.242578 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:43.242549 2575 generic.go:358] "Generic (PLEG): container finished" podID="68cb8276-452d-4742-adc2-9eb67152cc05" containerID="2490bb736cc38be635df0f84a86f81bdd916b270981ae27afebfd7e62a9a4eec" exitCode=0 Apr 20 15:02:43.242713 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:43.242632 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5mqkf" event={"ID":"68cb8276-452d-4742-adc2-9eb67152cc05","Type":"ContainerDied","Data":"2490bb736cc38be635df0f84a86f81bdd916b270981ae27afebfd7e62a9a4eec"} Apr 20 15:02:43.244071 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:43.244005 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wfw5f" event={"ID":"2e79beec-c5db-41c7-a60e-c759696b1d60","Type":"ContainerStarted","Data":"a79ef88d248df5f05ebb38a3ed931774357fa85ce2ac1755867867cb79ea74ad"} Apr 20 15:02:43.244071 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:43.244042 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wfw5f" event={"ID":"2e79beec-c5db-41c7-a60e-c759696b1d60","Type":"ContainerStarted","Data":"92bd1406b58607955a1a955d980e898da629eaebc0c7cf18ed62a08e266f0bc6"} Apr 20 15:02:43.246216 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:43.246196 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" event={"ID":"6d67f43f-b926-4199-ae7a-ccf686190d9b","Type":"ContainerStarted","Data":"5633e11d7a7af4eeb291cd80194230c55995467777ce988cbf322a76f694ef65"} Apr 20 15:02:43.246327 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:43.246225 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" event={"ID":"6d67f43f-b926-4199-ae7a-ccf686190d9b","Type":"ContainerStarted","Data":"cd1292b4921d1613656b8a80743e0b8f3c7a144573d35e736d56f8571db43eab"} Apr 20 15:02:43.246327 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:43.246239 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" event={"ID":"6d67f43f-b926-4199-ae7a-ccf686190d9b","Type":"ContainerStarted","Data":"d38fcf2bbb154a0ab1afedf021477391fca5092f6e8b927a4f42ee96bbfbc0ea"} Apr 20 15:02:43.246327 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:43.246251 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" event={"ID":"6d67f43f-b926-4199-ae7a-ccf686190d9b","Type":"ContainerStarted","Data":"8f9b82da1278898306e4ec4eeb7d589ca6fcad145942aba1d22ea746c3fa0f17"} Apr 20 15:02:43.247421 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:43.247394 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" event={"ID":"f50b30d6-0d2f-4686-a3f8-128ef485713a","Type":"ContainerStarted","Data":"97ead392a46d2b074ce2259ead08a7b5c5d3dbf4fda08e0293f0ff76e5d9defd"} Apr 20 15:02:43.248591 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:43.248570 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9pwx" event={"ID":"f9489b29-6fd7-4b56-abbc-9557d736b76c","Type":"ContainerStarted","Data":"b45f6aabb3054d0156e59dd74e0eb6f380e50a29dc26d28e570a30c90ac48157"} Apr 20 15:02:43.249906 ip-10-0-133-198 kubenswrapper[2575]: I0420 
15:02:43.249884 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-z9bz2" event={"ID":"6318e638-1067-4e23-94f4-dad4de00297a","Type":"ContainerStarted","Data":"50c3157dfa0d23f2bb4d0aee42392656e9b84ee80c93c6a26b8d40408136d0fa"} Apr 20 15:02:43.251134 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:43.251114 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-flrxv" event={"ID":"86824264-d16e-4d82-854b-f1f5bc86483c","Type":"ContainerStarted","Data":"46e14c0f4929fd969b6edae374e40995986e9f149b0913a50af484650c4dc8b2"} Apr 20 15:02:43.252348 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:43.252328 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-t94xd" event={"ID":"86bce86e-ed68-41f9-be80-acf37bfb646f","Type":"ContainerStarted","Data":"7682732cc60f7fb7f8fac32ec050927f752801e15a132ac050029afe0f00282e"} Apr 20 15:02:43.278817 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:43.278774 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-wcr5b" podStartSLOduration=3.581857682 podStartE2EDuration="20.278758238s" podCreationTimestamp="2026-04-20 15:02:23 +0000 UTC" firstStartedPulling="2026-04-20 15:02:25.903868941 +0000 UTC m=+3.276455638" lastFinishedPulling="2026-04-20 15:02:42.600769511 +0000 UTC m=+19.973356194" observedRunningTime="2026-04-20 15:02:43.278676018 +0000 UTC m=+20.651262721" watchObservedRunningTime="2026-04-20 15:02:43.278758238 +0000 UTC m=+20.651344943" Apr 20 15:02:43.284733 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:43.284708 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-t94xd" Apr 20 15:02:43.285263 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:43.285245 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-t94xd" Apr 20 15:02:43.307820 
ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:43.307780 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-flrxv" podStartSLOduration=3.63653405 podStartE2EDuration="20.307765252s" podCreationTimestamp="2026-04-20 15:02:23 +0000 UTC" firstStartedPulling="2026-04-20 15:02:25.899496718 +0000 UTC m=+3.272083413" lastFinishedPulling="2026-04-20 15:02:42.570727928 +0000 UTC m=+19.943314615" observedRunningTime="2026-04-20 15:02:43.291405364 +0000 UTC m=+20.663992067" watchObservedRunningTime="2026-04-20 15:02:43.307765252 +0000 UTC m=+20.680351956" Apr 20 15:02:43.307982 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:43.307957 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-z9bz2" podStartSLOduration=3.597958973 podStartE2EDuration="20.307950615s" podCreationTimestamp="2026-04-20 15:02:23 +0000 UTC" firstStartedPulling="2026-04-20 15:02:25.900876848 +0000 UTC m=+3.273463531" lastFinishedPulling="2026-04-20 15:02:42.610868476 +0000 UTC m=+19.983455173" observedRunningTime="2026-04-20 15:02:43.307485509 +0000 UTC m=+20.680072213" watchObservedRunningTime="2026-04-20 15:02:43.307950615 +0000 UTC m=+20.680537320" Apr 20 15:02:43.321014 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:43.320967 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-t94xd" podStartSLOduration=3.648492466 podStartE2EDuration="20.320950594s" podCreationTimestamp="2026-04-20 15:02:23 +0000 UTC" firstStartedPulling="2026-04-20 15:02:25.898290196 +0000 UTC m=+3.270876881" lastFinishedPulling="2026-04-20 15:02:42.570748312 +0000 UTC m=+19.943335009" observedRunningTime="2026-04-20 15:02:43.32091955 +0000 UTC m=+20.693506254" watchObservedRunningTime="2026-04-20 15:02:43.320950594 +0000 UTC m=+20.693537280" Apr 20 15:02:43.338292 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:43.338228 2575 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-dns/node-resolver-wfw5f" podStartSLOduration=3.338213157 podStartE2EDuration="3.338213157s" podCreationTimestamp="2026-04-20 15:02:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:02:43.337846212 +0000 UTC m=+20.710432916" watchObservedRunningTime="2026-04-20 15:02:43.338213157 +0000 UTC m=+20.710799860" Apr 20 15:02:43.863767 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:43.863741 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 15:02:44.120506 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:44.120394 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T15:02:43.863763927Z","UUID":"26735964-357e-44c1-84a6-2faea7e31bd0","Handler":null,"Name":"","Endpoint":""} Apr 20 15:02:44.123710 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:44.123684 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 15:02:44.123858 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:44.123720 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 15:02:44.178387 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:44.178358 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2nhxg" Apr 20 15:02:44.178569 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:44.178360 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnqrj" Apr 20 15:02:44.178569 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:44.178481 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2nhxg" podUID="c7922475-7fe5-4f88-a1fd-a1bd0359f7c6" Apr 20 15:02:44.178683 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:44.178580 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lnqrj" podUID="ee584f46-b9aa-46b2-a060-01c6f4e256e9" Apr 20 15:02:44.256300 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:44.256250 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-lvgwg" event={"ID":"d5149160-8e4b-48af-aaf9-f76ba9a5abfb","Type":"ContainerStarted","Data":"e03b4a65ad204e41287f81d6b4b549344ff3602072412334d0a9e141a00b0cf0"} Apr 20 15:02:44.259430 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:44.259394 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" event={"ID":"6d67f43f-b926-4199-ae7a-ccf686190d9b","Type":"ContainerStarted","Data":"cd3bb0f1671f7a6b903297a741532f6e90bbceb42d7f3230e8d3177b50412627"} Apr 20 15:02:44.259560 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:44.259435 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" 
event={"ID":"6d67f43f-b926-4199-ae7a-ccf686190d9b","Type":"ContainerStarted","Data":"4c32e3347bb6b7aa12083683c8fcf3d31bfef46865a13a91ca586db77c280a68"} Apr 20 15:02:44.261481 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:44.261441 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9pwx" event={"ID":"f9489b29-6fd7-4b56-abbc-9557d736b76c","Type":"ContainerStarted","Data":"38a385c633a650b65989fa50c66f102cf2a5f70343cf687e86c7f828e0e7a808"} Apr 20 15:02:44.262108 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:44.262089 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-t94xd" Apr 20 15:02:44.262790 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:44.262759 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-t94xd" Apr 20 15:02:44.284661 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:44.284613 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-lvgwg" podStartSLOduration=4.590286182 podStartE2EDuration="21.284598429s" podCreationTimestamp="2026-04-20 15:02:23 +0000 UTC" firstStartedPulling="2026-04-20 15:02:25.904804216 +0000 UTC m=+3.277390898" lastFinishedPulling="2026-04-20 15:02:42.59911646 +0000 UTC m=+19.971703145" observedRunningTime="2026-04-20 15:02:44.270759361 +0000 UTC m=+21.643346068" watchObservedRunningTime="2026-04-20 15:02:44.284598429 +0000 UTC m=+21.657185127" Apr 20 15:02:45.178348 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:45.178316 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fbms8" Apr 20 15:02:45.178536 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:45.178442 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fbms8" podUID="5e85baf4-757f-4b92-b2be-cb821cc1b33e" Apr 20 15:02:45.265832 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:45.265751 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9pwx" event={"ID":"f9489b29-6fd7-4b56-abbc-9557d736b76c","Type":"ContainerStarted","Data":"dfb95035526b0d7e00269447a0c172444c679ab79a998568970126655d8bb4ba"} Apr 20 15:02:45.283330 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:45.283283 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9pwx" podStartSLOduration=3.248333727 podStartE2EDuration="22.283255262s" podCreationTimestamp="2026-04-20 15:02:23 +0000 UTC" firstStartedPulling="2026-04-20 15:02:25.901773846 +0000 UTC m=+3.274360531" lastFinishedPulling="2026-04-20 15:02:44.93669537 +0000 UTC m=+22.309282066" observedRunningTime="2026-04-20 15:02:45.282797036 +0000 UTC m=+22.655383742" watchObservedRunningTime="2026-04-20 15:02:45.283255262 +0000 UTC m=+22.655842000" Apr 20 15:02:46.124602 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:46.124561 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c7922475-7fe5-4f88-a1fd-a1bd0359f7c6-original-pull-secret\") pod \"global-pull-secret-syncer-2nhxg\" (UID: \"c7922475-7fe5-4f88-a1fd-a1bd0359f7c6\") " 
pod="kube-system/global-pull-secret-syncer-2nhxg" Apr 20 15:02:46.124811 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:46.124736 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 15:02:46.124870 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:46.124818 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7922475-7fe5-4f88-a1fd-a1bd0359f7c6-original-pull-secret podName:c7922475-7fe5-4f88-a1fd-a1bd0359f7c6 nodeName:}" failed. No retries permitted until 2026-04-20 15:03:02.124798542 +0000 UTC m=+39.497385228 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c7922475-7fe5-4f88-a1fd-a1bd0359f7c6-original-pull-secret") pod "global-pull-secret-syncer-2nhxg" (UID: "c7922475-7fe5-4f88-a1fd-a1bd0359f7c6") : object "kube-system"/"original-pull-secret" not registered Apr 20 15:02:46.177744 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:46.177704 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnqrj" Apr 20 15:02:46.177912 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:46.177704 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2nhxg" Apr 20 15:02:46.177912 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:46.177848 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lnqrj" podUID="ee584f46-b9aa-46b2-a060-01c6f4e256e9" Apr 20 15:02:46.178035 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:46.177914 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2nhxg" podUID="c7922475-7fe5-4f88-a1fd-a1bd0359f7c6" Apr 20 15:02:46.273095 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:46.273040 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" event={"ID":"6d67f43f-b926-4199-ae7a-ccf686190d9b","Type":"ContainerStarted","Data":"c8d5666e8b4740286bc1384bc1d6247181403467eaafb79d9062322dd143e0b2"} Apr 20 15:02:47.181635 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:47.181437 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fbms8" Apr 20 15:02:47.181795 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:47.181725 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fbms8" podUID="5e85baf4-757f-4b92-b2be-cb821cc1b33e" Apr 20 15:02:48.178512 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:48.178476 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnqrj" Apr 20 15:02:48.178998 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:48.178476 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-2nhxg" Apr 20 15:02:48.178998 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:48.178615 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lnqrj" podUID="ee584f46-b9aa-46b2-a060-01c6f4e256e9" Apr 20 15:02:48.178998 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:48.178648 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2nhxg" podUID="c7922475-7fe5-4f88-a1fd-a1bd0359f7c6" Apr 20 15:02:48.280283 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:48.280242 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" event={"ID":"6d67f43f-b926-4199-ae7a-ccf686190d9b","Type":"ContainerStarted","Data":"95346a46472a101246f0f948288f2cf69fb0f868e1e2f3ce36dc85c9b61e47ae"} Apr 20 15:02:48.280630 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:48.280596 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:48.281922 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:48.281893 2575 generic.go:358] "Generic (PLEG): container finished" podID="68cb8276-452d-4742-adc2-9eb67152cc05" containerID="41d90d603489b67b2bd499d9d8d92151a7a6816f36fed961cabe18a428eebc62" exitCode=0 Apr 20 15:02:48.282051 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:48.281929 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-5mqkf" event={"ID":"68cb8276-452d-4742-adc2-9eb67152cc05","Type":"ContainerDied","Data":"41d90d603489b67b2bd499d9d8d92151a7a6816f36fed961cabe18a428eebc62"} Apr 20 15:02:48.296388 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:48.296353 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:48.310517 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:48.310473 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" podStartSLOduration=8.573857475 podStartE2EDuration="25.310460863s" podCreationTimestamp="2026-04-20 15:02:23 +0000 UTC" firstStartedPulling="2026-04-20 15:02:25.90743972 +0000 UTC m=+3.280026401" lastFinishedPulling="2026-04-20 15:02:42.644043102 +0000 UTC m=+20.016629789" observedRunningTime="2026-04-20 15:02:48.310373336 +0000 UTC m=+25.682960039" watchObservedRunningTime="2026-04-20 15:02:48.310460863 +0000 UTC m=+25.683047575" Apr 20 15:02:49.178329 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:49.178128 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fbms8" Apr 20 15:02:49.178488 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:49.178408 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-fbms8" podUID="5e85baf4-757f-4b92-b2be-cb821cc1b33e" Apr 20 15:02:49.284337 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:49.284312 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 20 15:02:49.284742 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:49.284726 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:49.299143 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:49.299111 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:49.658368 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:49.658341 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-fbms8"] Apr 20 15:02:49.658510 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:49.658432 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fbms8" Apr 20 15:02:49.658559 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:49.658519 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fbms8" podUID="5e85baf4-757f-4b92-b2be-cb821cc1b33e" Apr 20 15:02:49.661443 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:49.661406 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-2nhxg"] Apr 20 15:02:49.661564 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:49.661508 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-2nhxg" Apr 20 15:02:49.661619 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:49.661588 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2nhxg" podUID="c7922475-7fe5-4f88-a1fd-a1bd0359f7c6" Apr 20 15:02:49.662120 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:49.662097 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lnqrj"] Apr 20 15:02:49.662205 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:49.662191 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnqrj" Apr 20 15:02:49.662302 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:49.662268 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lnqrj" podUID="ee584f46-b9aa-46b2-a060-01c6f4e256e9" Apr 20 15:02:50.288185 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:50.288150 2575 generic.go:358] "Generic (PLEG): container finished" podID="68cb8276-452d-4742-adc2-9eb67152cc05" containerID="68b7a401c10b227cfe521d43d34408dc4e723c13b34c5bbb1b8bac1724e938f8" exitCode=0 Apr 20 15:02:50.288898 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:50.288238 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5mqkf" event={"ID":"68cb8276-452d-4742-adc2-9eb67152cc05","Type":"ContainerDied","Data":"68b7a401c10b227cfe521d43d34408dc4e723c13b34c5bbb1b8bac1724e938f8"} Apr 20 15:02:50.288898 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:50.288478 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 20 15:02:50.898449 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:50.898413 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:02:51.178609 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:51.178529 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnqrj" Apr 20 15:02:51.178758 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:51.178529 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2nhxg" Apr 20 15:02:51.178758 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:51.178654 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lnqrj" podUID="ee584f46-b9aa-46b2-a060-01c6f4e256e9" Apr 20 15:02:51.178758 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:51.178529 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fbms8" Apr 20 15:02:51.178758 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:51.178725 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2nhxg" podUID="c7922475-7fe5-4f88-a1fd-a1bd0359f7c6" Apr 20 15:02:51.178890 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:51.178802 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-fbms8" podUID="5e85baf4-757f-4b92-b2be-cb821cc1b33e" Apr 20 15:02:51.292920 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:51.292888 2575 generic.go:358] "Generic (PLEG): container finished" podID="68cb8276-452d-4742-adc2-9eb67152cc05" containerID="8964ad264b6cd25b261a52b47415f7ef9d47c31b73e85e8cfd2fcb8d19c1d9bc" exitCode=0 Apr 20 15:02:51.293518 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:51.292963 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5mqkf" event={"ID":"68cb8276-452d-4742-adc2-9eb67152cc05","Type":"ContainerDied","Data":"8964ad264b6cd25b261a52b47415f7ef9d47c31b73e85e8cfd2fcb8d19c1d9bc"} Apr 20 15:02:52.308040 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:52.307984 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" podUID="6d67f43f-b926-4199-ae7a-ccf686190d9b" containerName="ovnkube-controller" probeResult="failure" output="" Apr 20 15:02:53.179262 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:53.179221 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fbms8" Apr 20 15:02:53.179467 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:53.179329 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2nhxg" Apr 20 15:02:53.179467 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:53.179371 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-fbms8" podUID="5e85baf4-757f-4b92-b2be-cb821cc1b33e" Apr 20 15:02:53.179467 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:53.179409 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2nhxg" podUID="c7922475-7fe5-4f88-a1fd-a1bd0359f7c6" Apr 20 15:02:53.179467 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:53.179450 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnqrj" Apr 20 15:02:53.179699 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:53.179532 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lnqrj" podUID="ee584f46-b9aa-46b2-a060-01c6f4e256e9" Apr 20 15:02:55.177560 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:55.177525 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fbms8" Apr 20 15:02:55.178094 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:55.177584 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-2nhxg" Apr 20 15:02:55.178094 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:55.177689 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2nhxg" podUID="c7922475-7fe5-4f88-a1fd-a1bd0359f7c6" Apr 20 15:02:55.178094 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:55.177773 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnqrj" Apr 20 15:02:55.178094 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:55.177901 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lnqrj" podUID="ee584f46-b9aa-46b2-a060-01c6f4e256e9" Apr 20 15:02:55.178094 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:55.177994 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-fbms8" podUID="5e85baf4-757f-4b92-b2be-cb821cc1b33e" Apr 20 15:02:55.432747 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:55.432659 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-198.ec2.internal" event="NodeReady" Apr 20 15:02:55.432963 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:55.432830 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 15:02:55.478411 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:55.478375 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-xp9tj"] Apr 20 15:02:55.505980 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:55.505948 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-gcg7h"] Apr 20 15:02:55.506157 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:55.506132 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xp9tj" Apr 20 15:02:55.510479 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:55.510452 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 15:02:55.510479 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:55.510464 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 15:02:55.510801 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:55.510783 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vdrzr\"" Apr 20 15:02:55.525416 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:55.525390 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xp9tj"] Apr 20 15:02:55.525416 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:55.525421 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-canary/ingress-canary-gcg7h"] Apr 20 15:02:55.525595 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:55.525552 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gcg7h" Apr 20 15:02:55.527867 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:55.527846 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 15:02:55.528067 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:55.527846 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 15:02:55.528067 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:55.527875 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-4l2x2\"" Apr 20 15:02:55.528236 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:55.528151 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 15:02:55.594260 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:55.594218 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfea901a-e1df-46c7-b211-b94f978562b5-metrics-tls\") pod \"dns-default-xp9tj\" (UID: \"bfea901a-e1df-46c7-b211-b94f978562b5\") " pod="openshift-dns/dns-default-xp9tj" Apr 20 15:02:55.594471 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:55.594299 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bfea901a-e1df-46c7-b211-b94f978562b5-tmp-dir\") pod \"dns-default-xp9tj\" (UID: \"bfea901a-e1df-46c7-b211-b94f978562b5\") " pod="openshift-dns/dns-default-xp9tj" Apr 20 15:02:55.594471 ip-10-0-133-198 kubenswrapper[2575]: I0420 
15:02:55.594420 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfea901a-e1df-46c7-b211-b94f978562b5-config-volume\") pod \"dns-default-xp9tj\" (UID: \"bfea901a-e1df-46c7-b211-b94f978562b5\") " pod="openshift-dns/dns-default-xp9tj" Apr 20 15:02:55.594471 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:55.594462 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg94p\" (UniqueName: \"kubernetes.io/projected/bfea901a-e1df-46c7-b211-b94f978562b5-kube-api-access-dg94p\") pod \"dns-default-xp9tj\" (UID: \"bfea901a-e1df-46c7-b211-b94f978562b5\") " pod="openshift-dns/dns-default-xp9tj" Apr 20 15:02:55.695632 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:55.695545 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfea901a-e1df-46c7-b211-b94f978562b5-metrics-tls\") pod \"dns-default-xp9tj\" (UID: \"bfea901a-e1df-46c7-b211-b94f978562b5\") " pod="openshift-dns/dns-default-xp9tj" Apr 20 15:02:55.695632 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:55.695599 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bfea901a-e1df-46c7-b211-b94f978562b5-tmp-dir\") pod \"dns-default-xp9tj\" (UID: \"bfea901a-e1df-46c7-b211-b94f978562b5\") " pod="openshift-dns/dns-default-xp9tj" Apr 20 15:02:55.695632 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:55.695636 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfmrs\" (UniqueName: \"kubernetes.io/projected/cfa73a34-d39a-4a89-b936-de5c6399f787-kube-api-access-mfmrs\") pod \"ingress-canary-gcg7h\" (UID: \"cfa73a34-d39a-4a89-b936-de5c6399f787\") " pod="openshift-ingress-canary/ingress-canary-gcg7h" Apr 20 15:02:55.695915 ip-10-0-133-198 
kubenswrapper[2575]: I0420 15:02:55.695655 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfa73a34-d39a-4a89-b936-de5c6399f787-cert\") pod \"ingress-canary-gcg7h\" (UID: \"cfa73a34-d39a-4a89-b936-de5c6399f787\") " pod="openshift-ingress-canary/ingress-canary-gcg7h" Apr 20 15:02:55.695915 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:55.695685 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfea901a-e1df-46c7-b211-b94f978562b5-config-volume\") pod \"dns-default-xp9tj\" (UID: \"bfea901a-e1df-46c7-b211-b94f978562b5\") " pod="openshift-dns/dns-default-xp9tj" Apr 20 15:02:55.695915 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:55.695703 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dg94p\" (UniqueName: \"kubernetes.io/projected/bfea901a-e1df-46c7-b211-b94f978562b5-kube-api-access-dg94p\") pod \"dns-default-xp9tj\" (UID: \"bfea901a-e1df-46c7-b211-b94f978562b5\") " pod="openshift-dns/dns-default-xp9tj" Apr 20 15:02:55.695915 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:55.695702 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 15:02:55.695915 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:55.695791 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfea901a-e1df-46c7-b211-b94f978562b5-metrics-tls podName:bfea901a-e1df-46c7-b211-b94f978562b5 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:56.195767249 +0000 UTC m=+33.568353937 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bfea901a-e1df-46c7-b211-b94f978562b5-metrics-tls") pod "dns-default-xp9tj" (UID: "bfea901a-e1df-46c7-b211-b94f978562b5") : secret "dns-default-metrics-tls" not found Apr 20 15:02:55.696708 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:55.696676 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bfea901a-e1df-46c7-b211-b94f978562b5-tmp-dir\") pod \"dns-default-xp9tj\" (UID: \"bfea901a-e1df-46c7-b211-b94f978562b5\") " pod="openshift-dns/dns-default-xp9tj" Apr 20 15:02:55.696985 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:55.696969 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfea901a-e1df-46c7-b211-b94f978562b5-config-volume\") pod \"dns-default-xp9tj\" (UID: \"bfea901a-e1df-46c7-b211-b94f978562b5\") " pod="openshift-dns/dns-default-xp9tj" Apr 20 15:02:55.706715 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:55.706683 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg94p\" (UniqueName: \"kubernetes.io/projected/bfea901a-e1df-46c7-b211-b94f978562b5-kube-api-access-dg94p\") pod \"dns-default-xp9tj\" (UID: \"bfea901a-e1df-46c7-b211-b94f978562b5\") " pod="openshift-dns/dns-default-xp9tj" Apr 20 15:02:55.796763 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:55.796725 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mfmrs\" (UniqueName: \"kubernetes.io/projected/cfa73a34-d39a-4a89-b936-de5c6399f787-kube-api-access-mfmrs\") pod \"ingress-canary-gcg7h\" (UID: \"cfa73a34-d39a-4a89-b936-de5c6399f787\") " pod="openshift-ingress-canary/ingress-canary-gcg7h" Apr 20 15:02:55.796763 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:55.796767 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/cfa73a34-d39a-4a89-b936-de5c6399f787-cert\") pod \"ingress-canary-gcg7h\" (UID: \"cfa73a34-d39a-4a89-b936-de5c6399f787\") " pod="openshift-ingress-canary/ingress-canary-gcg7h" Apr 20 15:02:55.797000 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:55.796879 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 15:02:55.797000 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:55.796957 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfa73a34-d39a-4a89-b936-de5c6399f787-cert podName:cfa73a34-d39a-4a89-b936-de5c6399f787 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:56.296936751 +0000 UTC m=+33.669523434 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cfa73a34-d39a-4a89-b936-de5c6399f787-cert") pod "ingress-canary-gcg7h" (UID: "cfa73a34-d39a-4a89-b936-de5c6399f787") : secret "canary-serving-cert" not found Apr 20 15:02:55.805725 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:55.805702 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfmrs\" (UniqueName: \"kubernetes.io/projected/cfa73a34-d39a-4a89-b936-de5c6399f787-kube-api-access-mfmrs\") pod \"ingress-canary-gcg7h\" (UID: \"cfa73a34-d39a-4a89-b936-de5c6399f787\") " pod="openshift-ingress-canary/ingress-canary-gcg7h" Apr 20 15:02:56.199891 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:56.199848 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfea901a-e1df-46c7-b211-b94f978562b5-metrics-tls\") pod \"dns-default-xp9tj\" (UID: \"bfea901a-e1df-46c7-b211-b94f978562b5\") " pod="openshift-dns/dns-default-xp9tj" Apr 20 15:02:56.200428 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:56.200000 2575 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 15:02:56.200428 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:56.200073 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfea901a-e1df-46c7-b211-b94f978562b5-metrics-tls podName:bfea901a-e1df-46c7-b211-b94f978562b5 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:57.200055982 +0000 UTC m=+34.572642664 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bfea901a-e1df-46c7-b211-b94f978562b5-metrics-tls") pod "dns-default-xp9tj" (UID: "bfea901a-e1df-46c7-b211-b94f978562b5") : secret "dns-default-metrics-tls" not found Apr 20 15:02:56.301149 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:56.301113 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfa73a34-d39a-4a89-b936-de5c6399f787-cert\") pod \"ingress-canary-gcg7h\" (UID: \"cfa73a34-d39a-4a89-b936-de5c6399f787\") " pod="openshift-ingress-canary/ingress-canary-gcg7h" Apr 20 15:02:56.301363 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:56.301267 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 15:02:56.301442 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:56.301382 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfa73a34-d39a-4a89-b936-de5c6399f787-cert podName:cfa73a34-d39a-4a89-b936-de5c6399f787 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:57.301359691 +0000 UTC m=+34.673946375 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cfa73a34-d39a-4a89-b936-de5c6399f787-cert") pod "ingress-canary-gcg7h" (UID: "cfa73a34-d39a-4a89-b936-de5c6399f787") : secret "canary-serving-cert" not found Apr 20 15:02:56.804086 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:56.803979 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee584f46-b9aa-46b2-a060-01c6f4e256e9-metrics-certs\") pod \"network-metrics-daemon-lnqrj\" (UID: \"ee584f46-b9aa-46b2-a060-01c6f4e256e9\") " pod="openshift-multus/network-metrics-daemon-lnqrj" Apr 20 15:02:56.804251 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:56.804173 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 15:02:56.804354 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:56.804255 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee584f46-b9aa-46b2-a060-01c6f4e256e9-metrics-certs podName:ee584f46-b9aa-46b2-a060-01c6f4e256e9 nodeName:}" failed. No retries permitted until 2026-04-20 15:03:28.804235484 +0000 UTC m=+66.176822189 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee584f46-b9aa-46b2-a060-01c6f4e256e9-metrics-certs") pod "network-metrics-daemon-lnqrj" (UID: "ee584f46-b9aa-46b2-a060-01c6f4e256e9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 15:02:56.904516 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:56.904471 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhkgn\" (UniqueName: \"kubernetes.io/projected/5e85baf4-757f-4b92-b2be-cb821cc1b33e-kube-api-access-vhkgn\") pod \"network-check-target-fbms8\" (UID: \"5e85baf4-757f-4b92-b2be-cb821cc1b33e\") " pod="openshift-network-diagnostics/network-check-target-fbms8" Apr 20 15:02:56.904681 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:56.904637 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 15:02:56.904681 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:56.904660 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 15:02:56.904681 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:56.904671 2575 projected.go:194] Error preparing data for projected volume kube-api-access-vhkgn for pod openshift-network-diagnostics/network-check-target-fbms8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 15:02:56.904801 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:56.904723 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5e85baf4-757f-4b92-b2be-cb821cc1b33e-kube-api-access-vhkgn podName:5e85baf4-757f-4b92-b2be-cb821cc1b33e nodeName:}" failed. 
No retries permitted until 2026-04-20 15:03:28.904709689 +0000 UTC m=+66.277296372 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-vhkgn" (UniqueName: "kubernetes.io/projected/5e85baf4-757f-4b92-b2be-cb821cc1b33e-kube-api-access-vhkgn") pod "network-check-target-fbms8" (UID: "5e85baf4-757f-4b92-b2be-cb821cc1b33e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 15:02:57.178069 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:57.178028 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2nhxg" Apr 20 15:02:57.178069 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:57.178074 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fbms8" Apr 20 15:02:57.178333 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:57.178213 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnqrj" Apr 20 15:02:57.180814 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:57.180785 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 15:02:57.180975 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:57.180951 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 15:02:57.181088 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:57.180956 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-dhzk6\"" Apr 20 15:02:57.181896 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:57.181880 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 15:02:57.181987 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:57.181906 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-64ff7\"" Apr 20 15:02:57.181987 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:57.181915 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 15:02:57.207239 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:57.207210 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfea901a-e1df-46c7-b211-b94f978562b5-metrics-tls\") pod \"dns-default-xp9tj\" (UID: \"bfea901a-e1df-46c7-b211-b94f978562b5\") " pod="openshift-dns/dns-default-xp9tj" Apr 20 15:02:57.207611 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:57.207349 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 15:02:57.207611 ip-10-0-133-198 
kubenswrapper[2575]: E0420 15:02:57.207401 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfea901a-e1df-46c7-b211-b94f978562b5-metrics-tls podName:bfea901a-e1df-46c7-b211-b94f978562b5 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:59.207389571 +0000 UTC m=+36.579976254 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bfea901a-e1df-46c7-b211-b94f978562b5-metrics-tls") pod "dns-default-xp9tj" (UID: "bfea901a-e1df-46c7-b211-b94f978562b5") : secret "dns-default-metrics-tls" not found Apr 20 15:02:57.307682 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:57.307651 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfa73a34-d39a-4a89-b936-de5c6399f787-cert\") pod \"ingress-canary-gcg7h\" (UID: \"cfa73a34-d39a-4a89-b936-de5c6399f787\") " pod="openshift-ingress-canary/ingress-canary-gcg7h" Apr 20 15:02:57.307850 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:57.307790 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 15:02:57.307850 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:57.307847 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfa73a34-d39a-4a89-b936-de5c6399f787-cert podName:cfa73a34-d39a-4a89-b936-de5c6399f787 nodeName:}" failed. No retries permitted until 2026-04-20 15:02:59.307832088 +0000 UTC m=+36.680418769 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cfa73a34-d39a-4a89-b936-de5c6399f787-cert") pod "ingress-canary-gcg7h" (UID: "cfa73a34-d39a-4a89-b936-de5c6399f787") : secret "canary-serving-cert" not found Apr 20 15:02:58.308999 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:58.308963 2575 generic.go:358] "Generic (PLEG): container finished" podID="68cb8276-452d-4742-adc2-9eb67152cc05" containerID="6dca8ab3accfb00010c7746f5ea743e23af579b57330fcedad536d4aacbb79c1" exitCode=0 Apr 20 15:02:58.309382 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:58.309011 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5mqkf" event={"ID":"68cb8276-452d-4742-adc2-9eb67152cc05","Type":"ContainerDied","Data":"6dca8ab3accfb00010c7746f5ea743e23af579b57330fcedad536d4aacbb79c1"} Apr 20 15:02:59.222652 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:59.222619 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfea901a-e1df-46c7-b211-b94f978562b5-metrics-tls\") pod \"dns-default-xp9tj\" (UID: \"bfea901a-e1df-46c7-b211-b94f978562b5\") " pod="openshift-dns/dns-default-xp9tj" Apr 20 15:02:59.222782 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:59.222728 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 15:02:59.222817 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:59.222785 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfea901a-e1df-46c7-b211-b94f978562b5-metrics-tls podName:bfea901a-e1df-46c7-b211-b94f978562b5 nodeName:}" failed. No retries permitted until 2026-04-20 15:03:03.222770134 +0000 UTC m=+40.595356837 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bfea901a-e1df-46c7-b211-b94f978562b5-metrics-tls") pod "dns-default-xp9tj" (UID: "bfea901a-e1df-46c7-b211-b94f978562b5") : secret "dns-default-metrics-tls" not found Apr 20 15:02:59.313818 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:59.313780 2575 generic.go:358] "Generic (PLEG): container finished" podID="68cb8276-452d-4742-adc2-9eb67152cc05" containerID="42cdc489f3c599823d3d121048983f49e7950796b1db0e6c724e02f43b633864" exitCode=0 Apr 20 15:02:59.314155 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:59.313826 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5mqkf" event={"ID":"68cb8276-452d-4742-adc2-9eb67152cc05","Type":"ContainerDied","Data":"42cdc489f3c599823d3d121048983f49e7950796b1db0e6c724e02f43b633864"} Apr 20 15:02:59.323867 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:02:59.323844 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfa73a34-d39a-4a89-b936-de5c6399f787-cert\") pod \"ingress-canary-gcg7h\" (UID: \"cfa73a34-d39a-4a89-b936-de5c6399f787\") " pod="openshift-ingress-canary/ingress-canary-gcg7h" Apr 20 15:02:59.323995 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:59.323977 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 15:02:59.324036 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:02:59.324024 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfa73a34-d39a-4a89-b936-de5c6399f787-cert podName:cfa73a34-d39a-4a89-b936-de5c6399f787 nodeName:}" failed. No retries permitted until 2026-04-20 15:03:03.32401185 +0000 UTC m=+40.696598536 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cfa73a34-d39a-4a89-b936-de5c6399f787-cert") pod "ingress-canary-gcg7h" (UID: "cfa73a34-d39a-4a89-b936-de5c6399f787") : secret "canary-serving-cert" not found Apr 20 15:03:00.319441 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:03:00.319404 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5mqkf" event={"ID":"68cb8276-452d-4742-adc2-9eb67152cc05","Type":"ContainerStarted","Data":"9effc2af6af8d3a5474b31e8ea32525e438d135bf41df7b32ddb50996bc2359f"} Apr 20 15:03:00.342323 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:03:00.342248 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-5mqkf" podStartSLOduration=5.736177828 podStartE2EDuration="37.342217288s" podCreationTimestamp="2026-04-20 15:02:23 +0000 UTC" firstStartedPulling="2026-04-20 15:02:25.906589641 +0000 UTC m=+3.279176326" lastFinishedPulling="2026-04-20 15:02:57.512629086 +0000 UTC m=+34.885215786" observedRunningTime="2026-04-20 15:03:00.34055896 +0000 UTC m=+37.713145663" watchObservedRunningTime="2026-04-20 15:03:00.342217288 +0000 UTC m=+37.714803991" Apr 20 15:03:02.143905 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:03:02.143858 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c7922475-7fe5-4f88-a1fd-a1bd0359f7c6-original-pull-secret\") pod \"global-pull-secret-syncer-2nhxg\" (UID: \"c7922475-7fe5-4f88-a1fd-a1bd0359f7c6\") " pod="kube-system/global-pull-secret-syncer-2nhxg" Apr 20 15:03:02.147210 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:03:02.147178 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c7922475-7fe5-4f88-a1fd-a1bd0359f7c6-original-pull-secret\") pod \"global-pull-secret-syncer-2nhxg\" (UID: 
\"c7922475-7fe5-4f88-a1fd-a1bd0359f7c6\") " pod="kube-system/global-pull-secret-syncer-2nhxg" Apr 20 15:03:02.288786 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:03:02.288734 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2nhxg" Apr 20 15:03:02.409615 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:03:02.409553 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-2nhxg"] Apr 20 15:03:02.412697 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:03:02.412672 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7922475_7fe5_4f88_a1fd_a1bd0359f7c6.slice/crio-df7edea5eb9e4e31808da05877a3cc2aa3add6d4f7ec5c116bf7afd804cecc96 WatchSource:0}: Error finding container df7edea5eb9e4e31808da05877a3cc2aa3add6d4f7ec5c116bf7afd804cecc96: Status 404 returned error can't find the container with id df7edea5eb9e4e31808da05877a3cc2aa3add6d4f7ec5c116bf7afd804cecc96 Apr 20 15:03:03.252176 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:03:03.252131 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfea901a-e1df-46c7-b211-b94f978562b5-metrics-tls\") pod \"dns-default-xp9tj\" (UID: \"bfea901a-e1df-46c7-b211-b94f978562b5\") " pod="openshift-dns/dns-default-xp9tj" Apr 20 15:03:03.252575 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:03:03.252300 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 15:03:03.252575 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:03:03.252369 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfea901a-e1df-46c7-b211-b94f978562b5-metrics-tls podName:bfea901a-e1df-46c7-b211-b94f978562b5 nodeName:}" failed. 
No retries permitted until 2026-04-20 15:03:11.252353548 +0000 UTC m=+48.624940230 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bfea901a-e1df-46c7-b211-b94f978562b5-metrics-tls") pod "dns-default-xp9tj" (UID: "bfea901a-e1df-46c7-b211-b94f978562b5") : secret "dns-default-metrics-tls" not found Apr 20 15:03:03.325890 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:03:03.325852 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-2nhxg" event={"ID":"c7922475-7fe5-4f88-a1fd-a1bd0359f7c6","Type":"ContainerStarted","Data":"df7edea5eb9e4e31808da05877a3cc2aa3add6d4f7ec5c116bf7afd804cecc96"} Apr 20 15:03:03.353453 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:03:03.353418 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfa73a34-d39a-4a89-b936-de5c6399f787-cert\") pod \"ingress-canary-gcg7h\" (UID: \"cfa73a34-d39a-4a89-b936-de5c6399f787\") " pod="openshift-ingress-canary/ingress-canary-gcg7h" Apr 20 15:03:03.353644 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:03:03.353534 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 15:03:03.353644 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:03:03.353591 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfa73a34-d39a-4a89-b936-de5c6399f787-cert podName:cfa73a34-d39a-4a89-b936-de5c6399f787 nodeName:}" failed. No retries permitted until 2026-04-20 15:03:11.35357208 +0000 UTC m=+48.726158779 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cfa73a34-d39a-4a89-b936-de5c6399f787-cert") pod "ingress-canary-gcg7h" (UID: "cfa73a34-d39a-4a89-b936-de5c6399f787") : secret "canary-serving-cert" not found Apr 20 15:03:07.334610 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:03:07.334573 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-2nhxg" event={"ID":"c7922475-7fe5-4f88-a1fd-a1bd0359f7c6","Type":"ContainerStarted","Data":"8f0bb370d1af91a93698376b876c03d63ea537e7f5a964c69ce26b1171c82e3c"} Apr 20 15:03:07.349424 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:03:07.349382 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-2nhxg" podStartSLOduration=33.444419167 podStartE2EDuration="37.349369335s" podCreationTimestamp="2026-04-20 15:02:30 +0000 UTC" firstStartedPulling="2026-04-20 15:03:02.41719629 +0000 UTC m=+39.789782971" lastFinishedPulling="2026-04-20 15:03:06.322146457 +0000 UTC m=+43.694733139" observedRunningTime="2026-04-20 15:03:07.348665469 +0000 UTC m=+44.721252173" watchObservedRunningTime="2026-04-20 15:03:07.349369335 +0000 UTC m=+44.721956038" Apr 20 15:03:11.313508 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:03:11.313470 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfea901a-e1df-46c7-b211-b94f978562b5-metrics-tls\") pod \"dns-default-xp9tj\" (UID: \"bfea901a-e1df-46c7-b211-b94f978562b5\") " pod="openshift-dns/dns-default-xp9tj" Apr 20 15:03:11.313907 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:03:11.313587 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 15:03:11.313907 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:03:11.313638 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/bfea901a-e1df-46c7-b211-b94f978562b5-metrics-tls podName:bfea901a-e1df-46c7-b211-b94f978562b5 nodeName:}" failed. No retries permitted until 2026-04-20 15:03:27.313625223 +0000 UTC m=+64.686211908 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bfea901a-e1df-46c7-b211-b94f978562b5-metrics-tls") pod "dns-default-xp9tj" (UID: "bfea901a-e1df-46c7-b211-b94f978562b5") : secret "dns-default-metrics-tls" not found Apr 20 15:03:11.414295 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:03:11.414237 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfa73a34-d39a-4a89-b936-de5c6399f787-cert\") pod \"ingress-canary-gcg7h\" (UID: \"cfa73a34-d39a-4a89-b936-de5c6399f787\") " pod="openshift-ingress-canary/ingress-canary-gcg7h" Apr 20 15:03:11.414468 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:03:11.414387 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 15:03:11.414468 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:03:11.414453 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfa73a34-d39a-4a89-b936-de5c6399f787-cert podName:cfa73a34-d39a-4a89-b936-de5c6399f787 nodeName:}" failed. No retries permitted until 2026-04-20 15:03:27.414437986 +0000 UTC m=+64.787024672 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cfa73a34-d39a-4a89-b936-de5c6399f787-cert") pod "ingress-canary-gcg7h" (UID: "cfa73a34-d39a-4a89-b936-de5c6399f787") : secret "canary-serving-cert" not found Apr 20 15:03:22.305126 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:03:22.305092 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sdgqx" Apr 20 15:03:27.321388 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:03:27.321343 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfea901a-e1df-46c7-b211-b94f978562b5-metrics-tls\") pod \"dns-default-xp9tj\" (UID: \"bfea901a-e1df-46c7-b211-b94f978562b5\") " pod="openshift-dns/dns-default-xp9tj" Apr 20 15:03:27.321769 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:03:27.321490 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 15:03:27.321769 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:03:27.321566 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfea901a-e1df-46c7-b211-b94f978562b5-metrics-tls podName:bfea901a-e1df-46c7-b211-b94f978562b5 nodeName:}" failed. No retries permitted until 2026-04-20 15:03:59.321549057 +0000 UTC m=+96.694135739 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bfea901a-e1df-46c7-b211-b94f978562b5-metrics-tls") pod "dns-default-xp9tj" (UID: "bfea901a-e1df-46c7-b211-b94f978562b5") : secret "dns-default-metrics-tls" not found Apr 20 15:03:27.422155 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:03:27.422111 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfa73a34-d39a-4a89-b936-de5c6399f787-cert\") pod \"ingress-canary-gcg7h\" (UID: \"cfa73a34-d39a-4a89-b936-de5c6399f787\") " pod="openshift-ingress-canary/ingress-canary-gcg7h" Apr 20 15:03:27.422383 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:03:27.422287 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 15:03:27.422383 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:03:27.422362 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfa73a34-d39a-4a89-b936-de5c6399f787-cert podName:cfa73a34-d39a-4a89-b936-de5c6399f787 nodeName:}" failed. No retries permitted until 2026-04-20 15:03:59.422341392 +0000 UTC m=+96.794928073 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cfa73a34-d39a-4a89-b936-de5c6399f787-cert") pod "ingress-canary-gcg7h" (UID: "cfa73a34-d39a-4a89-b936-de5c6399f787") : secret "canary-serving-cert" not found Apr 20 15:03:28.831667 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:03:28.831624 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee584f46-b9aa-46b2-a060-01c6f4e256e9-metrics-certs\") pod \"network-metrics-daemon-lnqrj\" (UID: \"ee584f46-b9aa-46b2-a060-01c6f4e256e9\") " pod="openshift-multus/network-metrics-daemon-lnqrj" Apr 20 15:03:28.834435 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:03:28.834415 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 15:03:28.841860 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:03:28.841839 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 15:03:28.841937 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:03:28.841914 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee584f46-b9aa-46b2-a060-01c6f4e256e9-metrics-certs podName:ee584f46-b9aa-46b2-a060-01c6f4e256e9 nodeName:}" failed. No retries permitted until 2026-04-20 15:04:32.841896422 +0000 UTC m=+130.214483104 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee584f46-b9aa-46b2-a060-01c6f4e256e9-metrics-certs") pod "network-metrics-daemon-lnqrj" (UID: "ee584f46-b9aa-46b2-a060-01c6f4e256e9") : secret "metrics-daemon-secret" not found Apr 20 15:03:28.932200 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:03:28.932169 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhkgn\" (UniqueName: \"kubernetes.io/projected/5e85baf4-757f-4b92-b2be-cb821cc1b33e-kube-api-access-vhkgn\") pod \"network-check-target-fbms8\" (UID: \"5e85baf4-757f-4b92-b2be-cb821cc1b33e\") " pod="openshift-network-diagnostics/network-check-target-fbms8" Apr 20 15:03:28.935049 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:03:28.935028 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 15:03:28.944974 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:03:28.944954 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 15:03:28.955538 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:03:28.955516 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhkgn\" (UniqueName: \"kubernetes.io/projected/5e85baf4-757f-4b92-b2be-cb821cc1b33e-kube-api-access-vhkgn\") pod \"network-check-target-fbms8\" (UID: \"5e85baf4-757f-4b92-b2be-cb821cc1b33e\") " pod="openshift-network-diagnostics/network-check-target-fbms8" Apr 20 15:03:28.996548 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:03:28.996518 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-dhzk6\"" Apr 20 15:03:29.003799 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:03:29.003777 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fbms8" Apr 20 15:03:29.136299 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:03:29.136186 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-fbms8"] Apr 20 15:03:29.140796 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:03:29.140769 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e85baf4_757f_4b92_b2be_cb821cc1b33e.slice/crio-22873e10116871d5962df26fd891ffb16ef3dbd36bb480a35413b1b475762377 WatchSource:0}: Error finding container 22873e10116871d5962df26fd891ffb16ef3dbd36bb480a35413b1b475762377: Status 404 returned error can't find the container with id 22873e10116871d5962df26fd891ffb16ef3dbd36bb480a35413b1b475762377 Apr 20 15:03:29.375821 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:03:29.375783 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-fbms8" event={"ID":"5e85baf4-757f-4b92-b2be-cb821cc1b33e","Type":"ContainerStarted","Data":"22873e10116871d5962df26fd891ffb16ef3dbd36bb480a35413b1b475762377"} Apr 20 15:03:32.382723 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:03:32.382689 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-fbms8" event={"ID":"5e85baf4-757f-4b92-b2be-cb821cc1b33e","Type":"ContainerStarted","Data":"19f74f694bdf8c27ba0f654893b7ca253354dd9056c87052926ebf7a8518feca"} Apr 20 15:03:32.383120 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:03:32.382803 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-fbms8" Apr 20 15:03:32.400432 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:03:32.400359 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-fbms8" 
podStartSLOduration=66.823178197 podStartE2EDuration="1m9.400343539s" podCreationTimestamp="2026-04-20 15:02:23 +0000 UTC" firstStartedPulling="2026-04-20 15:03:29.14249986 +0000 UTC m=+66.515086542" lastFinishedPulling="2026-04-20 15:03:31.719665189 +0000 UTC m=+69.092251884" observedRunningTime="2026-04-20 15:03:32.400295262 +0000 UTC m=+69.772881970" watchObservedRunningTime="2026-04-20 15:03:32.400343539 +0000 UTC m=+69.772930235" Apr 20 15:03:59.347386 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:03:59.347340 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfea901a-e1df-46c7-b211-b94f978562b5-metrics-tls\") pod \"dns-default-xp9tj\" (UID: \"bfea901a-e1df-46c7-b211-b94f978562b5\") " pod="openshift-dns/dns-default-xp9tj" Apr 20 15:03:59.347825 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:03:59.347498 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 15:03:59.347825 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:03:59.347566 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfea901a-e1df-46c7-b211-b94f978562b5-metrics-tls podName:bfea901a-e1df-46c7-b211-b94f978562b5 nodeName:}" failed. No retries permitted until 2026-04-20 15:05:03.347549897 +0000 UTC m=+160.720136584 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bfea901a-e1df-46c7-b211-b94f978562b5-metrics-tls") pod "dns-default-xp9tj" (UID: "bfea901a-e1df-46c7-b211-b94f978562b5") : secret "dns-default-metrics-tls" not found Apr 20 15:03:59.448122 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:03:59.448086 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfa73a34-d39a-4a89-b936-de5c6399f787-cert\") pod \"ingress-canary-gcg7h\" (UID: \"cfa73a34-d39a-4a89-b936-de5c6399f787\") " pod="openshift-ingress-canary/ingress-canary-gcg7h" Apr 20 15:03:59.448337 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:03:59.448212 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 15:03:59.448337 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:03:59.448297 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfa73a34-d39a-4a89-b936-de5c6399f787-cert podName:cfa73a34-d39a-4a89-b936-de5c6399f787 nodeName:}" failed. No retries permitted until 2026-04-20 15:05:03.448265139 +0000 UTC m=+160.820851820 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cfa73a34-d39a-4a89-b936-de5c6399f787-cert") pod "ingress-canary-gcg7h" (UID: "cfa73a34-d39a-4a89-b936-de5c6399f787") : secret "canary-serving-cert" not found Apr 20 15:04:03.388403 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:03.388364 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-fbms8" Apr 20 15:04:32.876923 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:32.876879 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee584f46-b9aa-46b2-a060-01c6f4e256e9-metrics-certs\") pod \"network-metrics-daemon-lnqrj\" (UID: \"ee584f46-b9aa-46b2-a060-01c6f4e256e9\") " pod="openshift-multus/network-metrics-daemon-lnqrj" Apr 20 15:04:32.877437 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:04:32.877005 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 15:04:32.877437 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:04:32.877076 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee584f46-b9aa-46b2-a060-01c6f4e256e9-metrics-certs podName:ee584f46-b9aa-46b2-a060-01c6f4e256e9 nodeName:}" failed. No retries permitted until 2026-04-20 15:06:34.87706018 +0000 UTC m=+252.249646869 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee584f46-b9aa-46b2-a060-01c6f4e256e9-metrics-certs") pod "network-metrics-daemon-lnqrj" (UID: "ee584f46-b9aa-46b2-a060-01c6f4e256e9") : secret "metrics-daemon-secret" not found Apr 20 15:04:52.143102 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.143069 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-j22cl"] Apr 20 15:04:52.146870 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.146814 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-pbfc9"] Apr 20 15:04:52.147040 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.146949 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j22cl" Apr 20 15:04:52.148976 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.148955 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-pbfc9" Apr 20 15:04:52.149554 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.149537 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 20 15:04:52.150899 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.150883 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 15:04:52.150978 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.150897 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 15:04:52.150978 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.150949 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-tmk6r\"" Apr 20 15:04:52.151098 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.151022 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 20 15:04:52.151236 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.151219 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 20 15:04:52.151325 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.151250 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 20 15:04:52.151872 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.151857 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 20 15:04:52.151942 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.151862 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 20 15:04:52.151942 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.151890 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-ljmr4\"" Apr 20 15:04:52.157497 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.157474 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 20 15:04:52.157805 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.157788 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-j22cl"] Apr 20 15:04:52.162694 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.162672 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-pbfc9"] Apr 20 15:04:52.301989 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.301956 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/af67f37d-b332-44b7-8678-28aa45d26ed9-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-j22cl\" (UID: \"af67f37d-b332-44b7-8678-28aa45d26ed9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j22cl" Apr 20 15:04:52.302173 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.302026 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab96425d-f444-4af7-9052-1bddacccef53-serving-cert\") pod \"console-operator-9d4b6777b-pbfc9\" (UID: \"ab96425d-f444-4af7-9052-1bddacccef53\") " pod="openshift-console-operator/console-operator-9d4b6777b-pbfc9" Apr 20 15:04:52.302173 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.302052 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ab96425d-f444-4af7-9052-1bddacccef53-trusted-ca\") pod \"console-operator-9d4b6777b-pbfc9\" (UID: \"ab96425d-f444-4af7-9052-1bddacccef53\") " pod="openshift-console-operator/console-operator-9d4b6777b-pbfc9" Apr 20 15:04:52.302173 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.302081 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j27mx\" (UniqueName: \"kubernetes.io/projected/af67f37d-b332-44b7-8678-28aa45d26ed9-kube-api-access-j27mx\") pod \"cluster-monitoring-operator-75587bd455-j22cl\" (UID: \"af67f37d-b332-44b7-8678-28aa45d26ed9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j22cl" Apr 20 15:04:52.302335 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.302179 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v8b6\" (UniqueName: \"kubernetes.io/projected/ab96425d-f444-4af7-9052-1bddacccef53-kube-api-access-4v8b6\") pod \"console-operator-9d4b6777b-pbfc9\" (UID: \"ab96425d-f444-4af7-9052-1bddacccef53\") " pod="openshift-console-operator/console-operator-9d4b6777b-pbfc9" Apr 20 15:04:52.302335 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.302246 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/af67f37d-b332-44b7-8678-28aa45d26ed9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-j22cl\" (UID: \"af67f37d-b332-44b7-8678-28aa45d26ed9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j22cl" Apr 20 15:04:52.302335 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.302296 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ab96425d-f444-4af7-9052-1bddacccef53-config\") pod \"console-operator-9d4b6777b-pbfc9\" (UID: \"ab96425d-f444-4af7-9052-1bddacccef53\") " pod="openshift-console-operator/console-operator-9d4b6777b-pbfc9" Apr 20 15:04:52.403386 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.403263 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4v8b6\" (UniqueName: \"kubernetes.io/projected/ab96425d-f444-4af7-9052-1bddacccef53-kube-api-access-4v8b6\") pod \"console-operator-9d4b6777b-pbfc9\" (UID: \"ab96425d-f444-4af7-9052-1bddacccef53\") " pod="openshift-console-operator/console-operator-9d4b6777b-pbfc9" Apr 20 15:04:52.403386 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.403337 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/af67f37d-b332-44b7-8678-28aa45d26ed9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-j22cl\" (UID: \"af67f37d-b332-44b7-8678-28aa45d26ed9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j22cl" Apr 20 15:04:52.403386 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.403360 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab96425d-f444-4af7-9052-1bddacccef53-config\") pod \"console-operator-9d4b6777b-pbfc9\" (UID: \"ab96425d-f444-4af7-9052-1bddacccef53\") " pod="openshift-console-operator/console-operator-9d4b6777b-pbfc9" Apr 20 15:04:52.403386 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.403383 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/af67f37d-b332-44b7-8678-28aa45d26ed9-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-j22cl\" (UID: \"af67f37d-b332-44b7-8678-28aa45d26ed9\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j22cl" Apr 20 15:04:52.403704 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:04:52.403469 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 15:04:52.403704 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.403540 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab96425d-f444-4af7-9052-1bddacccef53-serving-cert\") pod \"console-operator-9d4b6777b-pbfc9\" (UID: \"ab96425d-f444-4af7-9052-1bddacccef53\") " pod="openshift-console-operator/console-operator-9d4b6777b-pbfc9" Apr 20 15:04:52.403704 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.403567 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ab96425d-f444-4af7-9052-1bddacccef53-trusted-ca\") pod \"console-operator-9d4b6777b-pbfc9\" (UID: \"ab96425d-f444-4af7-9052-1bddacccef53\") " pod="openshift-console-operator/console-operator-9d4b6777b-pbfc9" Apr 20 15:04:52.403704 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.403603 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j27mx\" (UniqueName: \"kubernetes.io/projected/af67f37d-b332-44b7-8678-28aa45d26ed9-kube-api-access-j27mx\") pod \"cluster-monitoring-operator-75587bd455-j22cl\" (UID: \"af67f37d-b332-44b7-8678-28aa45d26ed9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j22cl" Apr 20 15:04:52.403704 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:04:52.403638 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af67f37d-b332-44b7-8678-28aa45d26ed9-cluster-monitoring-operator-tls podName:af67f37d-b332-44b7-8678-28aa45d26ed9 nodeName:}" failed. 
No retries permitted until 2026-04-20 15:04:52.903598973 +0000 UTC m=+150.276185658 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/af67f37d-b332-44b7-8678-28aa45d26ed9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-j22cl" (UID: "af67f37d-b332-44b7-8678-28aa45d26ed9") : secret "cluster-monitoring-operator-tls" not found Apr 20 15:04:52.404088 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.404059 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/af67f37d-b332-44b7-8678-28aa45d26ed9-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-j22cl\" (UID: \"af67f37d-b332-44b7-8678-28aa45d26ed9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j22cl" Apr 20 15:04:52.404172 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.404102 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab96425d-f444-4af7-9052-1bddacccef53-config\") pod \"console-operator-9d4b6777b-pbfc9\" (UID: \"ab96425d-f444-4af7-9052-1bddacccef53\") " pod="openshift-console-operator/console-operator-9d4b6777b-pbfc9" Apr 20 15:04:52.404426 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.404407 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ab96425d-f444-4af7-9052-1bddacccef53-trusted-ca\") pod \"console-operator-9d4b6777b-pbfc9\" (UID: \"ab96425d-f444-4af7-9052-1bddacccef53\") " pod="openshift-console-operator/console-operator-9d4b6777b-pbfc9" Apr 20 15:04:52.405829 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.405810 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab96425d-f444-4af7-9052-1bddacccef53-serving-cert\") pod 
\"console-operator-9d4b6777b-pbfc9\" (UID: \"ab96425d-f444-4af7-9052-1bddacccef53\") " pod="openshift-console-operator/console-operator-9d4b6777b-pbfc9" Apr 20 15:04:52.412367 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.412343 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v8b6\" (UniqueName: \"kubernetes.io/projected/ab96425d-f444-4af7-9052-1bddacccef53-kube-api-access-4v8b6\") pod \"console-operator-9d4b6777b-pbfc9\" (UID: \"ab96425d-f444-4af7-9052-1bddacccef53\") " pod="openshift-console-operator/console-operator-9d4b6777b-pbfc9" Apr 20 15:04:52.412508 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.412484 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j27mx\" (UniqueName: \"kubernetes.io/projected/af67f37d-b332-44b7-8678-28aa45d26ed9-kube-api-access-j27mx\") pod \"cluster-monitoring-operator-75587bd455-j22cl\" (UID: \"af67f37d-b332-44b7-8678-28aa45d26ed9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j22cl" Apr 20 15:04:52.464063 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.464030 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-pbfc9" Apr 20 15:04:52.574173 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.574141 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-pbfc9"] Apr 20 15:04:52.577976 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:04:52.577945 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab96425d_f444_4af7_9052_1bddacccef53.slice/crio-863302daf6ec04e77398299190c0c86bc2af19dfc064882e71ed7ba5508060f8 WatchSource:0}: Error finding container 863302daf6ec04e77398299190c0c86bc2af19dfc064882e71ed7ba5508060f8: Status 404 returned error can't find the container with id 863302daf6ec04e77398299190c0c86bc2af19dfc064882e71ed7ba5508060f8 Apr 20 15:04:52.905927 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:52.905889 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/af67f37d-b332-44b7-8678-28aa45d26ed9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-j22cl\" (UID: \"af67f37d-b332-44b7-8678-28aa45d26ed9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j22cl" Apr 20 15:04:52.906115 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:04:52.906053 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 15:04:52.906156 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:04:52.906128 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af67f37d-b332-44b7-8678-28aa45d26ed9-cluster-monitoring-operator-tls podName:af67f37d-b332-44b7-8678-28aa45d26ed9 nodeName:}" failed. No retries permitted until 2026-04-20 15:04:53.90611154 +0000 UTC m=+151.278698228 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/af67f37d-b332-44b7-8678-28aa45d26ed9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-j22cl" (UID: "af67f37d-b332-44b7-8678-28aa45d26ed9") : secret "cluster-monitoring-operator-tls" not found Apr 20 15:04:53.535312 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:53.535254 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-pbfc9" event={"ID":"ab96425d-f444-4af7-9052-1bddacccef53","Type":"ContainerStarted","Data":"863302daf6ec04e77398299190c0c86bc2af19dfc064882e71ed7ba5508060f8"} Apr 20 15:04:53.720794 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:53.720758 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-779c949c75-b2sq6"] Apr 20 15:04:53.722567 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:53.722551 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-779c949c75-b2sq6" Apr 20 15:04:53.725342 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:53.725312 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 15:04:53.725764 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:53.725540 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 15:04:53.725764 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:53.725601 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-nfkdq\"" Apr 20 15:04:53.725764 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:53.725638 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 15:04:53.729897 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:53.729878 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 15:04:53.735042 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:53.735018 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-779c949c75-b2sq6"] Apr 20 15:04:53.812116 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:53.812030 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-installation-pull-secrets\") pod \"image-registry-779c949c75-b2sq6\" (UID: \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\") " pod="openshift-image-registry/image-registry-779c949c75-b2sq6" Apr 20 15:04:53.812116 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:53.812094 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-ca-trust-extracted\") pod \"image-registry-779c949c75-b2sq6\" (UID: \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\") " pod="openshift-image-registry/image-registry-779c949c75-b2sq6" Apr 20 15:04:53.812360 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:53.812125 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-registry-tls\") pod \"image-registry-779c949c75-b2sq6\" (UID: \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\") " pod="openshift-image-registry/image-registry-779c949c75-b2sq6" Apr 20 15:04:53.812360 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:53.812148 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-bound-sa-token\") pod \"image-registry-779c949c75-b2sq6\" (UID: \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\") " pod="openshift-image-registry/image-registry-779c949c75-b2sq6" Apr 20 15:04:53.812360 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:53.812210 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdf4g\" (UniqueName: \"kubernetes.io/projected/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-kube-api-access-zdf4g\") pod \"image-registry-779c949c75-b2sq6\" (UID: \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\") " pod="openshift-image-registry/image-registry-779c949c75-b2sq6" Apr 20 15:04:53.812360 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:53.812237 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-registry-certificates\") pod \"image-registry-779c949c75-b2sq6\" (UID: \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\") " pod="openshift-image-registry/image-registry-779c949c75-b2sq6" Apr 20 15:04:53.812360 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:53.812335 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-image-registry-private-configuration\") pod \"image-registry-779c949c75-b2sq6\" (UID: \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\") " pod="openshift-image-registry/image-registry-779c949c75-b2sq6" Apr 20 15:04:53.812618 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:53.812362 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-trusted-ca\") pod \"image-registry-779c949c75-b2sq6\" (UID: \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\") " pod="openshift-image-registry/image-registry-779c949c75-b2sq6" Apr 20 15:04:53.913428 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:53.913384 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-image-registry-private-configuration\") pod \"image-registry-779c949c75-b2sq6\" (UID: \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\") " pod="openshift-image-registry/image-registry-779c949c75-b2sq6" Apr 20 15:04:53.913428 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:53.913423 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-trusted-ca\") pod \"image-registry-779c949c75-b2sq6\" (UID: \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\") " 
pod="openshift-image-registry/image-registry-779c949c75-b2sq6" Apr 20 15:04:53.913632 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:53.913448 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-installation-pull-secrets\") pod \"image-registry-779c949c75-b2sq6\" (UID: \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\") " pod="openshift-image-registry/image-registry-779c949c75-b2sq6" Apr 20 15:04:53.913632 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:53.913486 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/af67f37d-b332-44b7-8678-28aa45d26ed9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-j22cl\" (UID: \"af67f37d-b332-44b7-8678-28aa45d26ed9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j22cl" Apr 20 15:04:53.913632 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:04:53.913579 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 15:04:53.913632 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:53.913603 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-ca-trust-extracted\") pod \"image-registry-779c949c75-b2sq6\" (UID: \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\") " pod="openshift-image-registry/image-registry-779c949c75-b2sq6" Apr 20 15:04:53.913784 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:04:53.913654 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af67f37d-b332-44b7-8678-28aa45d26ed9-cluster-monitoring-operator-tls podName:af67f37d-b332-44b7-8678-28aa45d26ed9 nodeName:}" failed. 
No retries permitted until 2026-04-20 15:04:55.913634964 +0000 UTC m=+153.286221645 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/af67f37d-b332-44b7-8678-28aa45d26ed9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-j22cl" (UID: "af67f37d-b332-44b7-8678-28aa45d26ed9") : secret "cluster-monitoring-operator-tls" not found Apr 20 15:04:53.913784 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:53.913684 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-registry-tls\") pod \"image-registry-779c949c75-b2sq6\" (UID: \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\") " pod="openshift-image-registry/image-registry-779c949c75-b2sq6" Apr 20 15:04:53.913784 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:53.913721 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-bound-sa-token\") pod \"image-registry-779c949c75-b2sq6\" (UID: \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\") " pod="openshift-image-registry/image-registry-779c949c75-b2sq6" Apr 20 15:04:53.913784 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:53.913759 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zdf4g\" (UniqueName: \"kubernetes.io/projected/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-kube-api-access-zdf4g\") pod \"image-registry-779c949c75-b2sq6\" (UID: \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\") " pod="openshift-image-registry/image-registry-779c949c75-b2sq6" Apr 20 15:04:53.913989 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:53.913787 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-registry-certificates\") pod \"image-registry-779c949c75-b2sq6\" (UID: \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\") " pod="openshift-image-registry/image-registry-779c949c75-b2sq6" Apr 20 15:04:53.913989 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:04:53.913843 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 15:04:53.913989 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:04:53.913864 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-779c949c75-b2sq6: secret "image-registry-tls" not found Apr 20 15:04:53.913989 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:04:53.913926 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-registry-tls podName:991dcfd6-74f7-47ef-9682-ac84ec6e81f2 nodeName:}" failed. No retries permitted until 2026-04-20 15:04:54.413908 +0000 UTC m=+151.786494685 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-registry-tls") pod "image-registry-779c949c75-b2sq6" (UID: "991dcfd6-74f7-47ef-9682-ac84ec6e81f2") : secret "image-registry-tls" not found Apr 20 15:04:53.914293 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:53.914000 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-ca-trust-extracted\") pod \"image-registry-779c949c75-b2sq6\" (UID: \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\") " pod="openshift-image-registry/image-registry-779c949c75-b2sq6" Apr 20 15:04:53.914385 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:53.914365 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-registry-certificates\") pod \"image-registry-779c949c75-b2sq6\" (UID: \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\") " pod="openshift-image-registry/image-registry-779c949c75-b2sq6" Apr 20 15:04:53.914528 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:53.914513 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-trusted-ca\") pod \"image-registry-779c949c75-b2sq6\" (UID: \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\") " pod="openshift-image-registry/image-registry-779c949c75-b2sq6" Apr 20 15:04:53.915964 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:53.915941 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-installation-pull-secrets\") pod \"image-registry-779c949c75-b2sq6\" (UID: \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\") " pod="openshift-image-registry/image-registry-779c949c75-b2sq6" Apr 20 
15:04:53.916071 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:53.915969 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-image-registry-private-configuration\") pod \"image-registry-779c949c75-b2sq6\" (UID: \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\") " pod="openshift-image-registry/image-registry-779c949c75-b2sq6" Apr 20 15:04:53.921937 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:53.921911 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-bound-sa-token\") pod \"image-registry-779c949c75-b2sq6\" (UID: \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\") " pod="openshift-image-registry/image-registry-779c949c75-b2sq6" Apr 20 15:04:53.922061 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:53.922046 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdf4g\" (UniqueName: \"kubernetes.io/projected/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-kube-api-access-zdf4g\") pod \"image-registry-779c949c75-b2sq6\" (UID: \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\") " pod="openshift-image-registry/image-registry-779c949c75-b2sq6" Apr 20 15:04:54.417387 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:54.417340 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-registry-tls\") pod \"image-registry-779c949c75-b2sq6\" (UID: \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\") " pod="openshift-image-registry/image-registry-779c949c75-b2sq6" Apr 20 15:04:54.417579 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:04:54.417500 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 15:04:54.417579 ip-10-0-133-198 
kubenswrapper[2575]: E0420 15:04:54.417522 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-779c949c75-b2sq6: secret "image-registry-tls" not found Apr 20 15:04:54.417670 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:04:54.417588 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-registry-tls podName:991dcfd6-74f7-47ef-9682-ac84ec6e81f2 nodeName:}" failed. No retries permitted until 2026-04-20 15:04:55.417571457 +0000 UTC m=+152.790158139 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-registry-tls") pod "image-registry-779c949c75-b2sq6" (UID: "991dcfd6-74f7-47ef-9682-ac84ec6e81f2") : secret "image-registry-tls" not found Apr 20 15:04:55.425965 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:55.425934 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-registry-tls\") pod \"image-registry-779c949c75-b2sq6\" (UID: \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\") " pod="openshift-image-registry/image-registry-779c949c75-b2sq6" Apr 20 15:04:55.426376 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:04:55.426058 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 15:04:55.426376 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:04:55.426069 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-779c949c75-b2sq6: secret "image-registry-tls" not found Apr 20 15:04:55.426376 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:04:55.426117 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-registry-tls podName:991dcfd6-74f7-47ef-9682-ac84ec6e81f2 nodeName:}" failed. No retries permitted until 2026-04-20 15:04:57.426103307 +0000 UTC m=+154.798689990 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-registry-tls") pod "image-registry-779c949c75-b2sq6" (UID: "991dcfd6-74f7-47ef-9682-ac84ec6e81f2") : secret "image-registry-tls" not found Apr 20 15:04:55.540515 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:55.540487 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pbfc9_ab96425d-f444-4af7-9052-1bddacccef53/console-operator/0.log" Apr 20 15:04:55.540622 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:55.540524 2575 generic.go:358] "Generic (PLEG): container finished" podID="ab96425d-f444-4af7-9052-1bddacccef53" containerID="a448d8e7b8b461caa0d966edbfffef56eaf8e83e58ea0dd81959f756b11cc491" exitCode=255 Apr 20 15:04:55.540622 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:55.540556 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-pbfc9" event={"ID":"ab96425d-f444-4af7-9052-1bddacccef53","Type":"ContainerDied","Data":"a448d8e7b8b461caa0d966edbfffef56eaf8e83e58ea0dd81959f756b11cc491"} Apr 20 15:04:55.540806 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:55.540792 2575 scope.go:117] "RemoveContainer" containerID="a448d8e7b8b461caa0d966edbfffef56eaf8e83e58ea0dd81959f756b11cc491" Apr 20 15:04:55.930087 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:55.930043 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/af67f37d-b332-44b7-8678-28aa45d26ed9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-j22cl\" (UID: 
\"af67f37d-b332-44b7-8678-28aa45d26ed9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j22cl" Apr 20 15:04:55.930245 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:04:55.930194 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 15:04:55.930333 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:04:55.930267 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af67f37d-b332-44b7-8678-28aa45d26ed9-cluster-monitoring-operator-tls podName:af67f37d-b332-44b7-8678-28aa45d26ed9 nodeName:}" failed. No retries permitted until 2026-04-20 15:04:59.930249726 +0000 UTC m=+157.302836416 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/af67f37d-b332-44b7-8678-28aa45d26ed9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-j22cl" (UID: "af67f37d-b332-44b7-8678-28aa45d26ed9") : secret "cluster-monitoring-operator-tls" not found Apr 20 15:04:56.079516 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:56.079480 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8489c"] Apr 20 15:04:56.081179 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:56.081163 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8489c" Apr 20 15:04:56.083464 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:56.083436 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 20 15:04:56.083601 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:56.083438 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 20 15:04:56.083601 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:56.083485 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-rjncf\"" Apr 20 15:04:56.083755 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:56.083741 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 20 15:04:56.084391 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:56.084375 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 20 15:04:56.091756 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:56.091731 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8489c"] Apr 20 15:04:56.234195 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:56.234094 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6a53b42-a8ae-4454-b200-47881e42577a-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-8489c\" (UID: \"d6a53b42-a8ae-4454-b200-47881e42577a\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8489c" Apr 20 15:04:56.234372 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:56.234226 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqqz6\" (UniqueName: \"kubernetes.io/projected/d6a53b42-a8ae-4454-b200-47881e42577a-kube-api-access-jqqz6\") pod \"kube-storage-version-migrator-operator-6769c5d45-8489c\" (UID: \"d6a53b42-a8ae-4454-b200-47881e42577a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8489c" Apr 20 15:04:56.234372 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:56.234337 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6a53b42-a8ae-4454-b200-47881e42577a-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-8489c\" (UID: \"d6a53b42-a8ae-4454-b200-47881e42577a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8489c" Apr 20 15:04:56.335638 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:56.335589 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jqqz6\" (UniqueName: \"kubernetes.io/projected/d6a53b42-a8ae-4454-b200-47881e42577a-kube-api-access-jqqz6\") pod \"kube-storage-version-migrator-operator-6769c5d45-8489c\" (UID: \"d6a53b42-a8ae-4454-b200-47881e42577a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8489c" Apr 20 15:04:56.335832 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:56.335655 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6a53b42-a8ae-4454-b200-47881e42577a-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-8489c\" 
(UID: \"d6a53b42-a8ae-4454-b200-47881e42577a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8489c" Apr 20 15:04:56.335832 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:56.335694 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6a53b42-a8ae-4454-b200-47881e42577a-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-8489c\" (UID: \"d6a53b42-a8ae-4454-b200-47881e42577a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8489c" Apr 20 15:04:56.336168 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:56.336151 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6a53b42-a8ae-4454-b200-47881e42577a-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-8489c\" (UID: \"d6a53b42-a8ae-4454-b200-47881e42577a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8489c" Apr 20 15:04:56.337988 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:56.337967 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6a53b42-a8ae-4454-b200-47881e42577a-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-8489c\" (UID: \"d6a53b42-a8ae-4454-b200-47881e42577a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8489c" Apr 20 15:04:56.343661 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:56.343633 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqqz6\" (UniqueName: \"kubernetes.io/projected/d6a53b42-a8ae-4454-b200-47881e42577a-kube-api-access-jqqz6\") pod \"kube-storage-version-migrator-operator-6769c5d45-8489c\" (UID: \"d6a53b42-a8ae-4454-b200-47881e42577a\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8489c" Apr 20 15:04:56.389948 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:56.389899 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8489c" Apr 20 15:04:56.531699 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:56.531550 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8489c"] Apr 20 15:04:56.535109 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:04:56.535080 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6a53b42_a8ae_4454_b200_47881e42577a.slice/crio-e6c33cc00f97730b6989fffa7bd731b143b8718fea8a9b8e29007a301dd71402 WatchSource:0}: Error finding container e6c33cc00f97730b6989fffa7bd731b143b8718fea8a9b8e29007a301dd71402: Status 404 returned error can't find the container with id e6c33cc00f97730b6989fffa7bd731b143b8718fea8a9b8e29007a301dd71402 Apr 20 15:04:56.547531 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:56.547506 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pbfc9_ab96425d-f444-4af7-9052-1bddacccef53/console-operator/1.log" Apr 20 15:04:56.547855 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:56.547841 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pbfc9_ab96425d-f444-4af7-9052-1bddacccef53/console-operator/0.log" Apr 20 15:04:56.547927 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:56.547876 2575 generic.go:358] "Generic (PLEG): container finished" podID="ab96425d-f444-4af7-9052-1bddacccef53" containerID="8702cf283dbde2d293aae31f4d850cd955380cb9a11fb0aaeebfe6a66203c402" exitCode=255 Apr 
20 15:04:56.547982 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:56.547935 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-pbfc9" event={"ID":"ab96425d-f444-4af7-9052-1bddacccef53","Type":"ContainerDied","Data":"8702cf283dbde2d293aae31f4d850cd955380cb9a11fb0aaeebfe6a66203c402"} Apr 20 15:04:56.547982 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:56.547969 2575 scope.go:117] "RemoveContainer" containerID="a448d8e7b8b461caa0d966edbfffef56eaf8e83e58ea0dd81959f756b11cc491" Apr 20 15:04:56.548287 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:56.548235 2575 scope.go:117] "RemoveContainer" containerID="8702cf283dbde2d293aae31f4d850cd955380cb9a11fb0aaeebfe6a66203c402" Apr 20 15:04:56.548674 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:04:56.548451 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-pbfc9_openshift-console-operator(ab96425d-f444-4af7-9052-1bddacccef53)\"" pod="openshift-console-operator/console-operator-9d4b6777b-pbfc9" podUID="ab96425d-f444-4af7-9052-1bddacccef53" Apr 20 15:04:56.549060 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:56.549012 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8489c" event={"ID":"d6a53b42-a8ae-4454-b200-47881e42577a","Type":"ContainerStarted","Data":"e6c33cc00f97730b6989fffa7bd731b143b8718fea8a9b8e29007a301dd71402"} Apr 20 15:04:57.444791 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:57.444749 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-registry-tls\") pod \"image-registry-779c949c75-b2sq6\" (UID: \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\") 
" pod="openshift-image-registry/image-registry-779c949c75-b2sq6" Apr 20 15:04:57.445058 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:04:57.444899 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 15:04:57.445058 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:04:57.444920 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-779c949c75-b2sq6: secret "image-registry-tls" not found Apr 20 15:04:57.445058 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:04:57.444980 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-registry-tls podName:991dcfd6-74f7-47ef-9682-ac84ec6e81f2 nodeName:}" failed. No retries permitted until 2026-04-20 15:05:01.444962718 +0000 UTC m=+158.817549400 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-registry-tls") pod "image-registry-779c949c75-b2sq6" (UID: "991dcfd6-74f7-47ef-9682-ac84ec6e81f2") : secret "image-registry-tls" not found Apr 20 15:04:57.553150 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:57.553122 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pbfc9_ab96425d-f444-4af7-9052-1bddacccef53/console-operator/1.log" Apr 20 15:04:57.553598 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:57.553539 2575 scope.go:117] "RemoveContainer" containerID="8702cf283dbde2d293aae31f4d850cd955380cb9a11fb0aaeebfe6a66203c402" Apr 20 15:04:57.553774 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:04:57.553742 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator 
pod=console-operator-9d4b6777b-pbfc9_openshift-console-operator(ab96425d-f444-4af7-9052-1bddacccef53)\"" pod="openshift-console-operator/console-operator-9d4b6777b-pbfc9" podUID="ab96425d-f444-4af7-9052-1bddacccef53" Apr 20 15:04:58.348845 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:58.348818 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-wfw5f_2e79beec-c5db-41c7-a60e-c759696b1d60/dns-node-resolver/0.log" Apr 20 15:04:58.519342 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:04:58.519298 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-xp9tj" podUID="bfea901a-e1df-46c7-b211-b94f978562b5" Apr 20 15:04:58.535581 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:04:58.535546 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-gcg7h" podUID="cfa73a34-d39a-4a89-b936-de5c6399f787" Apr 20 15:04:58.555060 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:58.555038 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xp9tj" Apr 20 15:04:59.148865 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:59.148836 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-flrxv_86824264-d16e-4d82-854b-f1f5bc86483c/node-ca/0.log" Apr 20 15:04:59.514011 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:59.513928 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-qpn6s"] Apr 20 15:04:59.516129 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:59.516108 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qpn6s" Apr 20 15:04:59.518728 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:59.518704 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-k8lr5\"" Apr 20 15:04:59.527165 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:59.527134 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-qpn6s"] Apr 20 15:04:59.664307 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:59.664251 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tfdx\" (UniqueName: \"kubernetes.io/projected/aadef030-ec6b-407e-b2b2-b9d3a0070c4e-kube-api-access-7tfdx\") pod \"network-check-source-8894fc9bd-qpn6s\" (UID: \"aadef030-ec6b-407e-b2b2-b9d3a0070c4e\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qpn6s" Apr 20 15:04:59.765855 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:59.765751 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7tfdx\" (UniqueName: \"kubernetes.io/projected/aadef030-ec6b-407e-b2b2-b9d3a0070c4e-kube-api-access-7tfdx\") pod \"network-check-source-8894fc9bd-qpn6s\" (UID: \"aadef030-ec6b-407e-b2b2-b9d3a0070c4e\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qpn6s" Apr 20 15:04:59.778729 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:59.778694 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tfdx\" (UniqueName: \"kubernetes.io/projected/aadef030-ec6b-407e-b2b2-b9d3a0070c4e-kube-api-access-7tfdx\") pod \"network-check-source-8894fc9bd-qpn6s\" (UID: \"aadef030-ec6b-407e-b2b2-b9d3a0070c4e\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qpn6s" Apr 20 15:04:59.826468 ip-10-0-133-198 kubenswrapper[2575]: I0420 
15:04:59.826426 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qpn6s" Apr 20 15:04:59.957328 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:59.957292 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-qpn6s"] Apr 20 15:04:59.960771 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:04:59.960734 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaadef030_ec6b_407e_b2b2_b9d3a0070c4e.slice/crio-a347884ae601297191d25b3f9ce758165e7ea58e990ecf150e9c883909eabd5a WatchSource:0}: Error finding container a347884ae601297191d25b3f9ce758165e7ea58e990ecf150e9c883909eabd5a: Status 404 returned error can't find the container with id a347884ae601297191d25b3f9ce758165e7ea58e990ecf150e9c883909eabd5a Apr 20 15:04:59.967219 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:04:59.967192 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/af67f37d-b332-44b7-8678-28aa45d26ed9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-j22cl\" (UID: \"af67f37d-b332-44b7-8678-28aa45d26ed9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j22cl" Apr 20 15:04:59.967400 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:04:59.967371 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 15:04:59.967475 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:04:59.967449 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af67f37d-b332-44b7-8678-28aa45d26ed9-cluster-monitoring-operator-tls podName:af67f37d-b332-44b7-8678-28aa45d26ed9 nodeName:}" failed. 
No retries permitted until 2026-04-20 15:05:07.967426159 +0000 UTC m=+165.340012857 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/af67f37d-b332-44b7-8678-28aa45d26ed9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-j22cl" (UID: "af67f37d-b332-44b7-8678-28aa45d26ed9") : secret "cluster-monitoring-operator-tls" not found Apr 20 15:05:00.198876 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:05:00.198832 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-lnqrj" podUID="ee584f46-b9aa-46b2-a060-01c6f4e256e9" Apr 20 15:05:00.560668 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:00.560624 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8489c" event={"ID":"d6a53b42-a8ae-4454-b200-47881e42577a","Type":"ContainerStarted","Data":"17211fdb1002e351d870e1ba336c4c311d814e13ea6cd79f212fd337f7753ae1"} Apr 20 15:05:00.561851 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:00.561827 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qpn6s" event={"ID":"aadef030-ec6b-407e-b2b2-b9d3a0070c4e","Type":"ContainerStarted","Data":"203402ee88126235a578d75c256462fe5af79ed6b452f634eadc057f88cc6e49"} Apr 20 15:05:00.561851 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:00.561854 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qpn6s" event={"ID":"aadef030-ec6b-407e-b2b2-b9d3a0070c4e","Type":"ContainerStarted","Data":"a347884ae601297191d25b3f9ce758165e7ea58e990ecf150e9c883909eabd5a"} Apr 20 15:05:00.580608 ip-10-0-133-198 kubenswrapper[2575]: I0420 
15:05:00.580557 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8489c" podStartSLOduration=1.419870606 podStartE2EDuration="4.580543092s" podCreationTimestamp="2026-04-20 15:04:56 +0000 UTC" firstStartedPulling="2026-04-20 15:04:56.536956697 +0000 UTC m=+153.909543380" lastFinishedPulling="2026-04-20 15:04:59.697629181 +0000 UTC m=+157.070215866" observedRunningTime="2026-04-20 15:05:00.580097377 +0000 UTC m=+157.952684107" watchObservedRunningTime="2026-04-20 15:05:00.580543092 +0000 UTC m=+157.953129803" Apr 20 15:05:00.595801 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:00.595742 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qpn6s" podStartSLOduration=1.5957220859999999 podStartE2EDuration="1.595722086s" podCreationTimestamp="2026-04-20 15:04:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:05:00.594423181 +0000 UTC m=+157.967009922" watchObservedRunningTime="2026-04-20 15:05:00.595722086 +0000 UTC m=+157.968308791" Apr 20 15:05:01.477574 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:01.477531 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-registry-tls\") pod \"image-registry-779c949c75-b2sq6\" (UID: \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\") " pod="openshift-image-registry/image-registry-779c949c75-b2sq6" Apr 20 15:05:01.478001 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:05:01.477653 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 15:05:01.478001 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:05:01.477666 2575 projected.go:194] 
Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-779c949c75-b2sq6: secret "image-registry-tls" not found Apr 20 15:05:01.478001 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:05:01.477717 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-registry-tls podName:991dcfd6-74f7-47ef-9682-ac84ec6e81f2 nodeName:}" failed. No retries permitted until 2026-04-20 15:05:09.477702585 +0000 UTC m=+166.850289268 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-registry-tls") pod "image-registry-779c949c75-b2sq6" (UID: "991dcfd6-74f7-47ef-9682-ac84ec6e81f2") : secret "image-registry-tls" not found Apr 20 15:05:02.464288 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:02.464226 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-pbfc9" Apr 20 15:05:02.464480 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:02.464308 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-pbfc9" Apr 20 15:05:02.464662 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:02.464650 2575 scope.go:117] "RemoveContainer" containerID="8702cf283dbde2d293aae31f4d850cd955380cb9a11fb0aaeebfe6a66203c402" Apr 20 15:05:02.464823 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:05:02.464808 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-pbfc9_openshift-console-operator(ab96425d-f444-4af7-9052-1bddacccef53)\"" pod="openshift-console-operator/console-operator-9d4b6777b-pbfc9" podUID="ab96425d-f444-4af7-9052-1bddacccef53" Apr 20 15:05:03.391228 
ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:03.391171 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfea901a-e1df-46c7-b211-b94f978562b5-metrics-tls\") pod \"dns-default-xp9tj\" (UID: \"bfea901a-e1df-46c7-b211-b94f978562b5\") " pod="openshift-dns/dns-default-xp9tj" Apr 20 15:05:03.391755 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:05:03.391362 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 15:05:03.391755 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:05:03.391465 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfea901a-e1df-46c7-b211-b94f978562b5-metrics-tls podName:bfea901a-e1df-46c7-b211-b94f978562b5 nodeName:}" failed. No retries permitted until 2026-04-20 15:07:05.391443027 +0000 UTC m=+282.764029726 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bfea901a-e1df-46c7-b211-b94f978562b5-metrics-tls") pod "dns-default-xp9tj" (UID: "bfea901a-e1df-46c7-b211-b94f978562b5") : secret "dns-default-metrics-tls" not found Apr 20 15:05:03.491592 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:03.491558 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfa73a34-d39a-4a89-b936-de5c6399f787-cert\") pod \"ingress-canary-gcg7h\" (UID: \"cfa73a34-d39a-4a89-b936-de5c6399f787\") " pod="openshift-ingress-canary/ingress-canary-gcg7h" Apr 20 15:05:03.491787 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:05:03.491679 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 15:05:03.491787 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:05:03.491729 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/cfa73a34-d39a-4a89-b936-de5c6399f787-cert podName:cfa73a34-d39a-4a89-b936-de5c6399f787 nodeName:}" failed. No retries permitted until 2026-04-20 15:07:05.491715006 +0000 UTC m=+282.864301689 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cfa73a34-d39a-4a89-b936-de5c6399f787-cert") pod "ingress-canary-gcg7h" (UID: "cfa73a34-d39a-4a89-b936-de5c6399f787") : secret "canary-serving-cert" not found Apr 20 15:05:05.527553 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:05.527515 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-tz8cm"] Apr 20 15:05:05.531490 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:05.530360 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-tz8cm" Apr 20 15:05:05.533238 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:05.533217 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-xw5kl\"" Apr 20 15:05:05.533550 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:05.533532 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 20 15:05:05.534359 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:05.534341 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 20 15:05:05.534458 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:05.534419 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 20 15:05:05.534458 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:05.534428 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 20 15:05:05.539556 ip-10-0-133-198 
kubenswrapper[2575]: I0420 15:05:05.539527 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-tz8cm"] Apr 20 15:05:05.709537 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:05.709494 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7fc6fb0d-5c61-4892-acd6-f81181109b07-signing-cabundle\") pod \"service-ca-865cb79987-tz8cm\" (UID: \"7fc6fb0d-5c61-4892-acd6-f81181109b07\") " pod="openshift-service-ca/service-ca-865cb79987-tz8cm" Apr 20 15:05:05.709689 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:05.709611 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7fc6fb0d-5c61-4892-acd6-f81181109b07-signing-key\") pod \"service-ca-865cb79987-tz8cm\" (UID: \"7fc6fb0d-5c61-4892-acd6-f81181109b07\") " pod="openshift-service-ca/service-ca-865cb79987-tz8cm" Apr 20 15:05:05.709689 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:05.709635 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnvqp\" (UniqueName: \"kubernetes.io/projected/7fc6fb0d-5c61-4892-acd6-f81181109b07-kube-api-access-tnvqp\") pod \"service-ca-865cb79987-tz8cm\" (UID: \"7fc6fb0d-5c61-4892-acd6-f81181109b07\") " pod="openshift-service-ca/service-ca-865cb79987-tz8cm" Apr 20 15:05:05.810299 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:05.810215 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7fc6fb0d-5c61-4892-acd6-f81181109b07-signing-key\") pod \"service-ca-865cb79987-tz8cm\" (UID: \"7fc6fb0d-5c61-4892-acd6-f81181109b07\") " pod="openshift-service-ca/service-ca-865cb79987-tz8cm" Apr 20 15:05:05.810299 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:05.810251 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tnvqp\" (UniqueName: \"kubernetes.io/projected/7fc6fb0d-5c61-4892-acd6-f81181109b07-kube-api-access-tnvqp\") pod \"service-ca-865cb79987-tz8cm\" (UID: \"7fc6fb0d-5c61-4892-acd6-f81181109b07\") " pod="openshift-service-ca/service-ca-865cb79987-tz8cm" Apr 20 15:05:05.810445 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:05.810394 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7fc6fb0d-5c61-4892-acd6-f81181109b07-signing-cabundle\") pod \"service-ca-865cb79987-tz8cm\" (UID: \"7fc6fb0d-5c61-4892-acd6-f81181109b07\") " pod="openshift-service-ca/service-ca-865cb79987-tz8cm" Apr 20 15:05:05.811009 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:05.810989 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7fc6fb0d-5c61-4892-acd6-f81181109b07-signing-cabundle\") pod \"service-ca-865cb79987-tz8cm\" (UID: \"7fc6fb0d-5c61-4892-acd6-f81181109b07\") " pod="openshift-service-ca/service-ca-865cb79987-tz8cm" Apr 20 15:05:05.812762 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:05.812738 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7fc6fb0d-5c61-4892-acd6-f81181109b07-signing-key\") pod \"service-ca-865cb79987-tz8cm\" (UID: \"7fc6fb0d-5c61-4892-acd6-f81181109b07\") " pod="openshift-service-ca/service-ca-865cb79987-tz8cm" Apr 20 15:05:05.819121 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:05.819092 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnvqp\" (UniqueName: \"kubernetes.io/projected/7fc6fb0d-5c61-4892-acd6-f81181109b07-kube-api-access-tnvqp\") pod \"service-ca-865cb79987-tz8cm\" (UID: \"7fc6fb0d-5c61-4892-acd6-f81181109b07\") " pod="openshift-service-ca/service-ca-865cb79987-tz8cm" Apr 20 15:05:05.840584 
ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:05.840548 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-tz8cm"
Apr 20 15:05:05.976441 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:05.976350 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-tz8cm"]
Apr 20 15:05:05.978929 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:05:05.978893 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fc6fb0d_5c61_4892_acd6_f81181109b07.slice/crio-f2d972219379c75ebc6383145614ccba39dbb80c3a1b2ef3fbaf82bba1eaac44 WatchSource:0}: Error finding container f2d972219379c75ebc6383145614ccba39dbb80c3a1b2ef3fbaf82bba1eaac44: Status 404 returned error can't find the container with id f2d972219379c75ebc6383145614ccba39dbb80c3a1b2ef3fbaf82bba1eaac44
Apr 20 15:05:06.575378 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:06.575284 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-tz8cm" event={"ID":"7fc6fb0d-5c61-4892-acd6-f81181109b07","Type":"ContainerStarted","Data":"f2d972219379c75ebc6383145614ccba39dbb80c3a1b2ef3fbaf82bba1eaac44"}
Apr 20 15:05:08.026944 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:08.026828 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/af67f37d-b332-44b7-8678-28aa45d26ed9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-j22cl\" (UID: \"af67f37d-b332-44b7-8678-28aa45d26ed9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j22cl"
Apr 20 15:05:08.027423 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:05:08.027018 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 15:05:08.027423 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:05:08.027104 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af67f37d-b332-44b7-8678-28aa45d26ed9-cluster-monitoring-operator-tls podName:af67f37d-b332-44b7-8678-28aa45d26ed9 nodeName:}" failed. No retries permitted until 2026-04-20 15:05:24.027081357 +0000 UTC m=+181.399668064 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/af67f37d-b332-44b7-8678-28aa45d26ed9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-j22cl" (UID: "af67f37d-b332-44b7-8678-28aa45d26ed9") : secret "cluster-monitoring-operator-tls" not found
Apr 20 15:05:08.581945 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:08.581905 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-tz8cm" event={"ID":"7fc6fb0d-5c61-4892-acd6-f81181109b07","Type":"ContainerStarted","Data":"449adc2314e28bd91a1d1cbc2afc608757bd9149adb8e15db3561c85b36af27b"}
Apr 20 15:05:08.597588 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:08.597532 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-tz8cm" podStartSLOduration=1.8369673 podStartE2EDuration="3.597514576s" podCreationTimestamp="2026-04-20 15:05:05 +0000 UTC" firstStartedPulling="2026-04-20 15:05:05.98090957 +0000 UTC m=+163.353496266" lastFinishedPulling="2026-04-20 15:05:07.741456846 +0000 UTC m=+165.114043542" observedRunningTime="2026-04-20 15:05:08.597402058 +0000 UTC m=+165.969988764" watchObservedRunningTime="2026-04-20 15:05:08.597514576 +0000 UTC m=+165.970101284"
Apr 20 15:05:09.537855 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:09.537813 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-registry-tls\") pod \"image-registry-779c949c75-b2sq6\" (UID: \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\") " pod="openshift-image-registry/image-registry-779c949c75-b2sq6"
Apr 20 15:05:09.538240 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:05:09.537975 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 15:05:09.538240 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:05:09.537991 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-779c949c75-b2sq6: secret "image-registry-tls" not found
Apr 20 15:05:09.538240 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:05:09.538050 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-registry-tls podName:991dcfd6-74f7-47ef-9682-ac84ec6e81f2 nodeName:}" failed. No retries permitted until 2026-04-20 15:05:25.538033119 +0000 UTC m=+182.910619808 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-registry-tls") pod "image-registry-779c949c75-b2sq6" (UID: "991dcfd6-74f7-47ef-9682-ac84ec6e81f2") : secret "image-registry-tls" not found
Apr 20 15:05:11.178441 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:11.178395 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gcg7h"
Apr 20 15:05:13.179418 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:13.179381 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnqrj"
Apr 20 15:05:15.181867 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:15.181841 2575 scope.go:117] "RemoveContainer" containerID="8702cf283dbde2d293aae31f4d850cd955380cb9a11fb0aaeebfe6a66203c402"
Apr 20 15:05:15.607303 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:15.607255 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pbfc9_ab96425d-f444-4af7-9052-1bddacccef53/console-operator/1.log"
Apr 20 15:05:15.607484 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:15.607359 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-pbfc9" event={"ID":"ab96425d-f444-4af7-9052-1bddacccef53","Type":"ContainerStarted","Data":"ca4bb7d01eb860e73187053fcc8c4c934c107f05c6be2591249faa9a1e983716"}
Apr 20 15:05:15.607662 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:15.607642 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-pbfc9"
Apr 20 15:05:15.624230 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:15.624177 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-pbfc9" podStartSLOduration=21.380566937 podStartE2EDuration="23.624164075s" podCreationTimestamp="2026-04-20 15:04:52 +0000 UTC" firstStartedPulling="2026-04-20 15:04:52.579737321 +0000 UTC m=+149.952324003" lastFinishedPulling="2026-04-20 15:04:54.823334443 +0000 UTC m=+152.195921141" observedRunningTime="2026-04-20 15:05:15.623011371 +0000 UTC m=+172.995598105" watchObservedRunningTime="2026-04-20 15:05:15.624164075 +0000 UTC m=+172.996750775"
Apr 20 15:05:15.997103 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:15.997022 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-pbfc9"
Apr 20 15:05:24.041482 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:24.041426 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/af67f37d-b332-44b7-8678-28aa45d26ed9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-j22cl\" (UID: \"af67f37d-b332-44b7-8678-28aa45d26ed9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j22cl"
Apr 20 15:05:24.043899 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:24.043873 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/af67f37d-b332-44b7-8678-28aa45d26ed9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-j22cl\" (UID: \"af67f37d-b332-44b7-8678-28aa45d26ed9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j22cl"
Apr 20 15:05:24.260743 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:24.260711 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-tmk6r\""
Apr 20 15:05:24.268880 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:24.268852 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j22cl"
Apr 20 15:05:24.388998 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:24.388966 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-j22cl"]
Apr 20 15:05:24.392094 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:05:24.392060 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf67f37d_b332_44b7_8678_28aa45d26ed9.slice/crio-cf082be8d07f177f98341f00348d8551653e347fb11d80a8866207bfaecb89e3 WatchSource:0}: Error finding container cf082be8d07f177f98341f00348d8551653e347fb11d80a8866207bfaecb89e3: Status 404 returned error can't find the container with id cf082be8d07f177f98341f00348d8551653e347fb11d80a8866207bfaecb89e3
Apr 20 15:05:24.633132 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:24.633097 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j22cl" event={"ID":"af67f37d-b332-44b7-8678-28aa45d26ed9","Type":"ContainerStarted","Data":"cf082be8d07f177f98341f00348d8551653e347fb11d80a8866207bfaecb89e3"}
Apr 20 15:05:25.551996 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:25.551962 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-registry-tls\") pod \"image-registry-779c949c75-b2sq6\" (UID: \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\") " pod="openshift-image-registry/image-registry-779c949c75-b2sq6"
Apr 20 15:05:25.554466 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:25.554441 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-registry-tls\") pod \"image-registry-779c949c75-b2sq6\" (UID: \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\") " pod="openshift-image-registry/image-registry-779c949c75-b2sq6"
Apr 20 15:05:25.842959 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:25.842922 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-nfkdq\""
Apr 20 15:05:25.845012 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:25.844988 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-779c949c75-b2sq6"
Apr 20 15:05:25.979297 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:25.979243 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-779c949c75-b2sq6"]
Apr 20 15:05:25.982803 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:05:25.982765 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod991dcfd6_74f7_47ef_9682_ac84ec6e81f2.slice/crio-a8a8e4a4c608fb272c1a3aab6487d6949462aafd7d946fc5ebb30569b8507ec5 WatchSource:0}: Error finding container a8a8e4a4c608fb272c1a3aab6487d6949462aafd7d946fc5ebb30569b8507ec5: Status 404 returned error can't find the container with id a8a8e4a4c608fb272c1a3aab6487d6949462aafd7d946fc5ebb30569b8507ec5
Apr 20 15:05:26.639125 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:26.639097 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-779c949c75-b2sq6" event={"ID":"991dcfd6-74f7-47ef-9682-ac84ec6e81f2","Type":"ContainerStarted","Data":"de72db6cad19d24a90ab4a7c4040cb5fccc2815548046ff6ae7c5b3450b3c7b3"}
Apr 20 15:05:26.639125 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:26.639129 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-779c949c75-b2sq6" event={"ID":"991dcfd6-74f7-47ef-9682-ac84ec6e81f2","Type":"ContainerStarted","Data":"a8a8e4a4c608fb272c1a3aab6487d6949462aafd7d946fc5ebb30569b8507ec5"}
Apr 20 15:05:26.639514 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:26.639154 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-779c949c75-b2sq6"
Apr 20 15:05:26.657693 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:26.657641 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-779c949c75-b2sq6" podStartSLOduration=33.657623177 podStartE2EDuration="33.657623177s" podCreationTimestamp="2026-04-20 15:04:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:05:26.656537086 +0000 UTC m=+184.029123803" watchObservedRunningTime="2026-04-20 15:05:26.657623177 +0000 UTC m=+184.030209882"
Apr 20 15:05:27.643684 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:27.643644 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j22cl" event={"ID":"af67f37d-b332-44b7-8678-28aa45d26ed9","Type":"ContainerStarted","Data":"318a4c9ebb8ef0ae85dddf0be52f3d243373c6f3ce76885a382b0ba5e7058f48"}
Apr 20 15:05:27.659547 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:27.659496 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j22cl" podStartSLOduration=33.457965064 podStartE2EDuration="35.659481191s" podCreationTimestamp="2026-04-20 15:04:52 +0000 UTC" firstStartedPulling="2026-04-20 15:05:24.393919597 +0000 UTC m=+181.766506282" lastFinishedPulling="2026-04-20 15:05:26.595435723 +0000 UTC m=+183.968022409" observedRunningTime="2026-04-20 15:05:27.659302223 +0000 UTC m=+185.031888928" watchObservedRunningTime="2026-04-20 15:05:27.659481191 +0000 UTC m=+185.032067895"
Apr 20 15:05:28.198659 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.198629 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-6jz9t"]
Apr 20 15:05:28.200717 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.200690 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6jz9t"
Apr 20 15:05:28.203150 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.203124 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 20 15:05:28.204237 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.204204 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 20 15:05:28.204387 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.204252 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-8mlqf\""
Apr 20 15:05:28.205744 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.205725 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 20 15:05:28.205820 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.205779 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 20 15:05:28.219540 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.219512 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6jz9t"]
Apr 20 15:05:28.242885 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.242845 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-779c949c75-b2sq6"]
Apr 20 15:05:28.273870 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.273827 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/486a1f57-c24b-4698-b65e-1c79387c2c19-crio-socket\") pod \"insights-runtime-extractor-6jz9t\" (UID: \"486a1f57-c24b-4698-b65e-1c79387c2c19\") " pod="openshift-insights/insights-runtime-extractor-6jz9t"
Apr 20 15:05:28.273870 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.273865 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/486a1f57-c24b-4698-b65e-1c79387c2c19-data-volume\") pod \"insights-runtime-extractor-6jz9t\" (UID: \"486a1f57-c24b-4698-b65e-1c79387c2c19\") " pod="openshift-insights/insights-runtime-extractor-6jz9t"
Apr 20 15:05:28.274081 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.273886 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/486a1f57-c24b-4698-b65e-1c79387c2c19-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6jz9t\" (UID: \"486a1f57-c24b-4698-b65e-1c79387c2c19\") " pod="openshift-insights/insights-runtime-extractor-6jz9t"
Apr 20 15:05:28.274081 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.273953 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dc5c\" (UniqueName: \"kubernetes.io/projected/486a1f57-c24b-4698-b65e-1c79387c2c19-kube-api-access-8dc5c\") pod \"insights-runtime-extractor-6jz9t\" (UID: \"486a1f57-c24b-4698-b65e-1c79387c2c19\") " pod="openshift-insights/insights-runtime-extractor-6jz9t"
Apr 20 15:05:28.274081 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.273983 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/486a1f57-c24b-4698-b65e-1c79387c2c19-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6jz9t\" (UID: \"486a1f57-c24b-4698-b65e-1c79387c2c19\") " pod="openshift-insights/insights-runtime-extractor-6jz9t"
Apr 20 15:05:28.303631 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.303601 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-797c758f48-6r26r"]
Apr 20 15:05:28.305503 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.305484 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-797c758f48-6r26r"
Apr 20 15:05:28.310158 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.310118 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 20 15:05:28.310385 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.310361 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 20 15:05:28.310483 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.310362 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 20 15:05:28.310483 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.310438 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 20 15:05:28.310869 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.310745 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-6cr2p\""
Apr 20 15:05:28.310869 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.310759 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 20 15:05:28.310869 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.310818 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 20 15:05:28.310869 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.310836 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 20 15:05:28.322202 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.322171 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-797c758f48-6r26r"]
Apr 20 15:05:28.375318 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.375266 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/72600b94-c8a8-4eef-900d-a1f3ec8450e8-console-serving-cert\") pod \"console-797c758f48-6r26r\" (UID: \"72600b94-c8a8-4eef-900d-a1f3ec8450e8\") " pod="openshift-console/console-797c758f48-6r26r"
Apr 20 15:05:28.375482 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.375327 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/72600b94-c8a8-4eef-900d-a1f3ec8450e8-console-config\") pod \"console-797c758f48-6r26r\" (UID: \"72600b94-c8a8-4eef-900d-a1f3ec8450e8\") " pod="openshift-console/console-797c758f48-6r26r"
Apr 20 15:05:28.375482 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.375385 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8dc5c\" (UniqueName: \"kubernetes.io/projected/486a1f57-c24b-4698-b65e-1c79387c2c19-kube-api-access-8dc5c\") pod \"insights-runtime-extractor-6jz9t\" (UID: \"486a1f57-c24b-4698-b65e-1c79387c2c19\") " pod="openshift-insights/insights-runtime-extractor-6jz9t"
Apr 20 15:05:28.375482 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.375457 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/486a1f57-c24b-4698-b65e-1c79387c2c19-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6jz9t\" (UID: \"486a1f57-c24b-4698-b65e-1c79387c2c19\") " pod="openshift-insights/insights-runtime-extractor-6jz9t"
Apr 20 15:05:28.375655 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.375526 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/72600b94-c8a8-4eef-900d-a1f3ec8450e8-oauth-serving-cert\") pod \"console-797c758f48-6r26r\" (UID: \"72600b94-c8a8-4eef-900d-a1f3ec8450e8\") " pod="openshift-console/console-797c758f48-6r26r"
Apr 20 15:05:28.375708 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.375650 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/72600b94-c8a8-4eef-900d-a1f3ec8450e8-console-oauth-config\") pod \"console-797c758f48-6r26r\" (UID: \"72600b94-c8a8-4eef-900d-a1f3ec8450e8\") " pod="openshift-console/console-797c758f48-6r26r"
Apr 20 15:05:28.375708 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.375702 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/486a1f57-c24b-4698-b65e-1c79387c2c19-crio-socket\") pod \"insights-runtime-extractor-6jz9t\" (UID: \"486a1f57-c24b-4698-b65e-1c79387c2c19\") " pod="openshift-insights/insights-runtime-extractor-6jz9t"
Apr 20 15:05:28.375795 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.375723 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/486a1f57-c24b-4698-b65e-1c79387c2c19-data-volume\") pod \"insights-runtime-extractor-6jz9t\" (UID: \"486a1f57-c24b-4698-b65e-1c79387c2c19\") " pod="openshift-insights/insights-runtime-extractor-6jz9t"
Apr 20 15:05:28.375795 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.375748 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/486a1f57-c24b-4698-b65e-1c79387c2c19-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6jz9t\" (UID: \"486a1f57-c24b-4698-b65e-1c79387c2c19\") " pod="openshift-insights/insights-runtime-extractor-6jz9t"
Apr 20 15:05:28.375795 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.375783 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkvw6\" (UniqueName: \"kubernetes.io/projected/72600b94-c8a8-4eef-900d-a1f3ec8450e8-kube-api-access-tkvw6\") pod \"console-797c758f48-6r26r\" (UID: \"72600b94-c8a8-4eef-900d-a1f3ec8450e8\") " pod="openshift-console/console-797c758f48-6r26r"
Apr 20 15:05:28.375795 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.375788 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/486a1f57-c24b-4698-b65e-1c79387c2c19-crio-socket\") pod \"insights-runtime-extractor-6jz9t\" (UID: \"486a1f57-c24b-4698-b65e-1c79387c2c19\") " pod="openshift-insights/insights-runtime-extractor-6jz9t"
Apr 20 15:05:28.375986 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.375847 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/72600b94-c8a8-4eef-900d-a1f3ec8450e8-service-ca\") pod \"console-797c758f48-6r26r\" (UID: \"72600b94-c8a8-4eef-900d-a1f3ec8450e8\") " pod="openshift-console/console-797c758f48-6r26r"
Apr 20 15:05:28.376175 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.376154 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/486a1f57-c24b-4698-b65e-1c79387c2c19-data-volume\") pod \"insights-runtime-extractor-6jz9t\" (UID: \"486a1f57-c24b-4698-b65e-1c79387c2c19\") " pod="openshift-insights/insights-runtime-extractor-6jz9t"
Apr 20 15:05:28.376446 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.376426 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/486a1f57-c24b-4698-b65e-1c79387c2c19-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6jz9t\" (UID: \"486a1f57-c24b-4698-b65e-1c79387c2c19\") " pod="openshift-insights/insights-runtime-extractor-6jz9t"
Apr 20 15:05:28.378053 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.378036 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/486a1f57-c24b-4698-b65e-1c79387c2c19-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6jz9t\" (UID: \"486a1f57-c24b-4698-b65e-1c79387c2c19\") " pod="openshift-insights/insights-runtime-extractor-6jz9t"
Apr 20 15:05:28.400965 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.400928 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dc5c\" (UniqueName: \"kubernetes.io/projected/486a1f57-c24b-4698-b65e-1c79387c2c19-kube-api-access-8dc5c\") pod \"insights-runtime-extractor-6jz9t\" (UID: \"486a1f57-c24b-4698-b65e-1c79387c2c19\") " pod="openshift-insights/insights-runtime-extractor-6jz9t"
Apr 20 15:05:28.476455 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.476357 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/72600b94-c8a8-4eef-900d-a1f3ec8450e8-oauth-serving-cert\") pod \"console-797c758f48-6r26r\" (UID: \"72600b94-c8a8-4eef-900d-a1f3ec8450e8\") " pod="openshift-console/console-797c758f48-6r26r"
Apr 20 15:05:28.476455 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.476405 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/72600b94-c8a8-4eef-900d-a1f3ec8450e8-console-oauth-config\") pod \"console-797c758f48-6r26r\" (UID: \"72600b94-c8a8-4eef-900d-a1f3ec8450e8\") " pod="openshift-console/console-797c758f48-6r26r"
Apr 20 15:05:28.476455 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.476440 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tkvw6\" (UniqueName: \"kubernetes.io/projected/72600b94-c8a8-4eef-900d-a1f3ec8450e8-kube-api-access-tkvw6\") pod \"console-797c758f48-6r26r\" (UID: \"72600b94-c8a8-4eef-900d-a1f3ec8450e8\") " pod="openshift-console/console-797c758f48-6r26r"
Apr 20 15:05:28.476733 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.476478 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/72600b94-c8a8-4eef-900d-a1f3ec8450e8-service-ca\") pod \"console-797c758f48-6r26r\" (UID: \"72600b94-c8a8-4eef-900d-a1f3ec8450e8\") " pod="openshift-console/console-797c758f48-6r26r"
Apr 20 15:05:28.476733 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.476512 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/72600b94-c8a8-4eef-900d-a1f3ec8450e8-console-serving-cert\") pod \"console-797c758f48-6r26r\" (UID: \"72600b94-c8a8-4eef-900d-a1f3ec8450e8\") " pod="openshift-console/console-797c758f48-6r26r"
Apr 20 15:05:28.476733 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.476534 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/72600b94-c8a8-4eef-900d-a1f3ec8450e8-console-config\") pod \"console-797c758f48-6r26r\" (UID: \"72600b94-c8a8-4eef-900d-a1f3ec8450e8\") " pod="openshift-console/console-797c758f48-6r26r"
Apr 20 15:05:28.477249 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.477218 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/72600b94-c8a8-4eef-900d-a1f3ec8450e8-oauth-serving-cert\") pod \"console-797c758f48-6r26r\" (UID: \"72600b94-c8a8-4eef-900d-a1f3ec8450e8\") " pod="openshift-console/console-797c758f48-6r26r"
Apr 20 15:05:28.477380 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.477323 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/72600b94-c8a8-4eef-900d-a1f3ec8450e8-console-config\") pod \"console-797c758f48-6r26r\" (UID: \"72600b94-c8a8-4eef-900d-a1f3ec8450e8\") " pod="openshift-console/console-797c758f48-6r26r"
Apr 20 15:05:28.477380 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.477357 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/72600b94-c8a8-4eef-900d-a1f3ec8450e8-service-ca\") pod \"console-797c758f48-6r26r\" (UID: \"72600b94-c8a8-4eef-900d-a1f3ec8450e8\") " pod="openshift-console/console-797c758f48-6r26r"
Apr 20 15:05:28.479154 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.479135 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/72600b94-c8a8-4eef-900d-a1f3ec8450e8-console-oauth-config\") pod \"console-797c758f48-6r26r\" (UID: \"72600b94-c8a8-4eef-900d-a1f3ec8450e8\") " pod="openshift-console/console-797c758f48-6r26r"
Apr 20 15:05:28.479237 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.479220 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/72600b94-c8a8-4eef-900d-a1f3ec8450e8-console-serving-cert\") pod \"console-797c758f48-6r26r\" (UID: \"72600b94-c8a8-4eef-900d-a1f3ec8450e8\") " pod="openshift-console/console-797c758f48-6r26r"
Apr 20 15:05:28.486980 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.486952 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkvw6\" (UniqueName: \"kubernetes.io/projected/72600b94-c8a8-4eef-900d-a1f3ec8450e8-kube-api-access-tkvw6\") pod \"console-797c758f48-6r26r\" (UID: \"72600b94-c8a8-4eef-900d-a1f3ec8450e8\") " pod="openshift-console/console-797c758f48-6r26r"
Apr 20 15:05:28.510206 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.510162 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6jz9t"
Apr 20 15:05:28.615178 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.615143 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-797c758f48-6r26r"
Apr 20 15:05:28.640294 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.640237 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6jz9t"]
Apr 20 15:05:28.643893 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:05:28.643862 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod486a1f57_c24b_4698_b65e_1c79387c2c19.slice/crio-e563db47bd63f065fe053eb779ab6eb9a6ae17f35b4d30c3f0044153c2fc9795 WatchSource:0}: Error finding container e563db47bd63f065fe053eb779ab6eb9a6ae17f35b4d30c3f0044153c2fc9795: Status 404 returned error can't find the container with id e563db47bd63f065fe053eb779ab6eb9a6ae17f35b4d30c3f0044153c2fc9795
Apr 20 15:05:28.740858 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:28.740829 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-797c758f48-6r26r"]
Apr 20 15:05:28.744335 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:05:28.744256 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72600b94_c8a8_4eef_900d_a1f3ec8450e8.slice/crio-2e68f59dfa6931b01373e1d83a2c8d3b0cd40b44fc4247dfd52adc245a3f3c89 WatchSource:0}: Error finding container 2e68f59dfa6931b01373e1d83a2c8d3b0cd40b44fc4247dfd52adc245a3f3c89: Status 404 returned error can't find the container with id 2e68f59dfa6931b01373e1d83a2c8d3b0cd40b44fc4247dfd52adc245a3f3c89
Apr 20 15:05:29.652592 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:29.652551 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-797c758f48-6r26r" event={"ID":"72600b94-c8a8-4eef-900d-a1f3ec8450e8","Type":"ContainerStarted","Data":"2e68f59dfa6931b01373e1d83a2c8d3b0cd40b44fc4247dfd52adc245a3f3c89"}
Apr 20 15:05:29.654235 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:29.654207 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6jz9t" event={"ID":"486a1f57-c24b-4698-b65e-1c79387c2c19","Type":"ContainerStarted","Data":"d561c84819bbc180a8f38068ca832e3f91b47366bf6d9dd5176b17a184fe5174"}
Apr 20 15:05:29.654368 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:29.654240 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6jz9t" event={"ID":"486a1f57-c24b-4698-b65e-1c79387c2c19","Type":"ContainerStarted","Data":"d207ad72392798cd42e0083fe44c98e045b1ae9b88e8f631f23f8bbea4c72c58"}
Apr 20 15:05:29.654368 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:29.654250 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6jz9t" event={"ID":"486a1f57-c24b-4698-b65e-1c79387c2c19","Type":"ContainerStarted","Data":"e563db47bd63f065fe053eb779ab6eb9a6ae17f35b4d30c3f0044153c2fc9795"}
Apr 20 15:05:32.664564 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:32.664531 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-797c758f48-6r26r" event={"ID":"72600b94-c8a8-4eef-900d-a1f3ec8450e8","Type":"ContainerStarted","Data":"5dea7b32d014db592c778f837cc9b921f8cd873ff16b4d237e29d8996804f2b0"}
Apr 20 15:05:32.666489 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:32.666457 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6jz9t" event={"ID":"486a1f57-c24b-4698-b65e-1c79387c2c19","Type":"ContainerStarted","Data":"f01e5f3b86898b06e96c0df31fa2675e0da94ce4b43ec7f7261adb11ccbfa1dc"}
Apr 20 15:05:32.683102 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:32.683052 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-797c758f48-6r26r" podStartSLOduration=1.224150267 podStartE2EDuration="4.683035613s" podCreationTimestamp="2026-04-20 15:05:28 +0000 UTC" firstStartedPulling="2026-04-20 15:05:28.746214888 +0000 UTC m=+186.118801579" lastFinishedPulling="2026-04-20 15:05:32.205100243 +0000 UTC m=+189.577686925" observedRunningTime="2026-04-20 15:05:32.681936502 +0000 UTC m=+190.054523207" watchObservedRunningTime="2026-04-20 15:05:32.683035613 +0000 UTC m=+190.055622295"
Apr 20 15:05:32.699095 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:32.699042 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-6jz9t" podStartSLOduration=1.238744369 podStartE2EDuration="4.699020031s" podCreationTimestamp="2026-04-20 15:05:28 +0000 UTC" firstStartedPulling="2026-04-20 15:05:28.744471184 +0000 UTC m=+186.117057866" lastFinishedPulling="2026-04-20 15:05:32.204746844 +0000 UTC m=+189.577333528" observedRunningTime="2026-04-20 15:05:32.69893167 +0000 UTC m=+190.071518375" watchObservedRunningTime="2026-04-20 15:05:32.699020031 +0000 UTC m=+190.071606735"
Apr 20 15:05:35.680965 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:35.680928 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-68ff96dbdb-n96jh"]
Apr 20 15:05:35.682726 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:35.682708 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68ff96dbdb-n96jh"
Apr 20 15:05:35.691154 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:35.691127 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 20 15:05:35.694829 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:35.694802 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68ff96dbdb-n96jh"]
Apr 20 15:05:35.734448 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:35.734408 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2125979b-7f4a-4e85-9570-b04ab627c4aa-oauth-serving-cert\") pod \"console-68ff96dbdb-n96jh\" (UID: \"2125979b-7f4a-4e85-9570-b04ab627c4aa\") " pod="openshift-console/console-68ff96dbdb-n96jh"
Apr 20 15:05:35.734448 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:35.734460 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsm4f\" (UniqueName: \"kubernetes.io/projected/2125979b-7f4a-4e85-9570-b04ab627c4aa-kube-api-access-tsm4f\") pod \"console-68ff96dbdb-n96jh\" (UID: \"2125979b-7f4a-4e85-9570-b04ab627c4aa\") " pod="openshift-console/console-68ff96dbdb-n96jh"
Apr 20 15:05:35.734706 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:35.734530 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2125979b-7f4a-4e85-9570-b04ab627c4aa-console-config\") pod \"console-68ff96dbdb-n96jh\" (UID: \"2125979b-7f4a-4e85-9570-b04ab627c4aa\") " pod="openshift-console/console-68ff96dbdb-n96jh"
Apr 20 15:05:35.734706 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:35.734578 2575
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2125979b-7f4a-4e85-9570-b04ab627c4aa-service-ca\") pod \"console-68ff96dbdb-n96jh\" (UID: \"2125979b-7f4a-4e85-9570-b04ab627c4aa\") " pod="openshift-console/console-68ff96dbdb-n96jh" Apr 20 15:05:35.734706 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:35.734612 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2125979b-7f4a-4e85-9570-b04ab627c4aa-console-oauth-config\") pod \"console-68ff96dbdb-n96jh\" (UID: \"2125979b-7f4a-4e85-9570-b04ab627c4aa\") " pod="openshift-console/console-68ff96dbdb-n96jh" Apr 20 15:05:35.734706 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:35.734688 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2125979b-7f4a-4e85-9570-b04ab627c4aa-console-serving-cert\") pod \"console-68ff96dbdb-n96jh\" (UID: \"2125979b-7f4a-4e85-9570-b04ab627c4aa\") " pod="openshift-console/console-68ff96dbdb-n96jh" Apr 20 15:05:35.734850 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:35.734727 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2125979b-7f4a-4e85-9570-b04ab627c4aa-trusted-ca-bundle\") pod \"console-68ff96dbdb-n96jh\" (UID: \"2125979b-7f4a-4e85-9570-b04ab627c4aa\") " pod="openshift-console/console-68ff96dbdb-n96jh" Apr 20 15:05:35.836056 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:35.836020 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2125979b-7f4a-4e85-9570-b04ab627c4aa-oauth-serving-cert\") pod \"console-68ff96dbdb-n96jh\" (UID: 
\"2125979b-7f4a-4e85-9570-b04ab627c4aa\") " pod="openshift-console/console-68ff96dbdb-n96jh" Apr 20 15:05:35.836245 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:35.836076 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tsm4f\" (UniqueName: \"kubernetes.io/projected/2125979b-7f4a-4e85-9570-b04ab627c4aa-kube-api-access-tsm4f\") pod \"console-68ff96dbdb-n96jh\" (UID: \"2125979b-7f4a-4e85-9570-b04ab627c4aa\") " pod="openshift-console/console-68ff96dbdb-n96jh" Apr 20 15:05:35.836245 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:35.836113 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2125979b-7f4a-4e85-9570-b04ab627c4aa-console-config\") pod \"console-68ff96dbdb-n96jh\" (UID: \"2125979b-7f4a-4e85-9570-b04ab627c4aa\") " pod="openshift-console/console-68ff96dbdb-n96jh" Apr 20 15:05:35.836245 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:35.836138 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2125979b-7f4a-4e85-9570-b04ab627c4aa-service-ca\") pod \"console-68ff96dbdb-n96jh\" (UID: \"2125979b-7f4a-4e85-9570-b04ab627c4aa\") " pod="openshift-console/console-68ff96dbdb-n96jh" Apr 20 15:05:35.836424 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:35.836292 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2125979b-7f4a-4e85-9570-b04ab627c4aa-console-oauth-config\") pod \"console-68ff96dbdb-n96jh\" (UID: \"2125979b-7f4a-4e85-9570-b04ab627c4aa\") " pod="openshift-console/console-68ff96dbdb-n96jh" Apr 20 15:05:35.836424 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:35.836363 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2125979b-7f4a-4e85-9570-b04ab627c4aa-console-serving-cert\") pod \"console-68ff96dbdb-n96jh\" (UID: \"2125979b-7f4a-4e85-9570-b04ab627c4aa\") " pod="openshift-console/console-68ff96dbdb-n96jh" Apr 20 15:05:35.836424 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:35.836411 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2125979b-7f4a-4e85-9570-b04ab627c4aa-trusted-ca-bundle\") pod \"console-68ff96dbdb-n96jh\" (UID: \"2125979b-7f4a-4e85-9570-b04ab627c4aa\") " pod="openshift-console/console-68ff96dbdb-n96jh" Apr 20 15:05:35.836814 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:35.836787 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2125979b-7f4a-4e85-9570-b04ab627c4aa-console-config\") pod \"console-68ff96dbdb-n96jh\" (UID: \"2125979b-7f4a-4e85-9570-b04ab627c4aa\") " pod="openshift-console/console-68ff96dbdb-n96jh" Apr 20 15:05:35.836928 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:35.836853 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2125979b-7f4a-4e85-9570-b04ab627c4aa-oauth-serving-cert\") pod \"console-68ff96dbdb-n96jh\" (UID: \"2125979b-7f4a-4e85-9570-b04ab627c4aa\") " pod="openshift-console/console-68ff96dbdb-n96jh" Apr 20 15:05:35.836928 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:35.836893 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2125979b-7f4a-4e85-9570-b04ab627c4aa-service-ca\") pod \"console-68ff96dbdb-n96jh\" (UID: \"2125979b-7f4a-4e85-9570-b04ab627c4aa\") " pod="openshift-console/console-68ff96dbdb-n96jh" Apr 20 15:05:35.837151 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:35.837131 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2125979b-7f4a-4e85-9570-b04ab627c4aa-trusted-ca-bundle\") pod \"console-68ff96dbdb-n96jh\" (UID: \"2125979b-7f4a-4e85-9570-b04ab627c4aa\") " pod="openshift-console/console-68ff96dbdb-n96jh" Apr 20 15:05:35.838755 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:35.838731 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2125979b-7f4a-4e85-9570-b04ab627c4aa-console-oauth-config\") pod \"console-68ff96dbdb-n96jh\" (UID: \"2125979b-7f4a-4e85-9570-b04ab627c4aa\") " pod="openshift-console/console-68ff96dbdb-n96jh" Apr 20 15:05:35.838893 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:35.838872 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2125979b-7f4a-4e85-9570-b04ab627c4aa-console-serving-cert\") pod \"console-68ff96dbdb-n96jh\" (UID: \"2125979b-7f4a-4e85-9570-b04ab627c4aa\") " pod="openshift-console/console-68ff96dbdb-n96jh" Apr 20 15:05:35.843678 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:35.843652 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsm4f\" (UniqueName: \"kubernetes.io/projected/2125979b-7f4a-4e85-9570-b04ab627c4aa-kube-api-access-tsm4f\") pod \"console-68ff96dbdb-n96jh\" (UID: \"2125979b-7f4a-4e85-9570-b04ab627c4aa\") " pod="openshift-console/console-68ff96dbdb-n96jh" Apr 20 15:05:35.991292 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:35.991169 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-68ff96dbdb-n96jh" Apr 20 15:05:36.110380 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:36.110349 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68ff96dbdb-n96jh"] Apr 20 15:05:36.113561 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:05:36.113535 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2125979b_7f4a_4e85_9570_b04ab627c4aa.slice/crio-b5f72d2f58ed118e54e0d3f698547e202ce97d840b9722265cb795ff94b4bff8 WatchSource:0}: Error finding container b5f72d2f58ed118e54e0d3f698547e202ce97d840b9722265cb795ff94b4bff8: Status 404 returned error can't find the container with id b5f72d2f58ed118e54e0d3f698547e202ce97d840b9722265cb795ff94b4bff8 Apr 20 15:05:36.682374 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:36.682335 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68ff96dbdb-n96jh" event={"ID":"2125979b-7f4a-4e85-9570-b04ab627c4aa","Type":"ContainerStarted","Data":"d01d7912cf708fd8cf40ca99277b294bac26dc877cfd8fafac97a165ba0c959b"} Apr 20 15:05:36.682374 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:36.682375 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68ff96dbdb-n96jh" event={"ID":"2125979b-7f4a-4e85-9570-b04ab627c4aa","Type":"ContainerStarted","Data":"b5f72d2f58ed118e54e0d3f698547e202ce97d840b9722265cb795ff94b4bff8"} Apr 20 15:05:36.700750 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:36.700695 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-68ff96dbdb-n96jh" podStartSLOduration=1.7006792530000001 podStartE2EDuration="1.700679253s" podCreationTimestamp="2026-04-20 15:05:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:05:36.699914125 +0000 UTC 
m=+194.072500830" watchObservedRunningTime="2026-04-20 15:05:36.700679253 +0000 UTC m=+194.073265956" Apr 20 15:05:37.495006 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.494968 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-jwgdc"] Apr 20 15:05:37.497405 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.497377 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jwgdc" Apr 20 15:05:37.500406 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.500378 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 20 15:05:37.501528 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.501502 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 20 15:05:37.501644 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.501504 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-5cg6j\"" Apr 20 15:05:37.501644 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.501504 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 15:05:37.510501 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.510475 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-jwgdc"] Apr 20 15:05:37.514158 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.514133 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-7rp8b"] Apr 20 15:05:37.516925 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.516902 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-7rp8b" Apr 20 15:05:37.520238 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.520182 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-6mt8b\"" Apr 20 15:05:37.520454 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.520419 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 20 15:05:37.520795 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.520624 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 20 15:05:37.520795 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.520787 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 20 15:05:37.528622 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.528595 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-gbvlq"] Apr 20 15:05:37.531119 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.531095 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-gbvlq" Apr 20 15:05:37.531320 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.531266 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-7rp8b"] Apr 20 15:05:37.534412 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.534390 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 15:05:37.534921 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.534903 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 15:05:37.535414 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.535394 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-lk8g4\"" Apr 20 15:05:37.535525 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.535441 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 15:05:37.549715 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.549672 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b2e813cc-e62f-44d2-a6cb-ede8248e09ac-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-jwgdc\" (UID: \"b2e813cc-e62f-44d2-a6cb-ede8248e09ac\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jwgdc" Apr 20 15:05:37.549886 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.549727 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/b2e813cc-e62f-44d2-a6cb-ede8248e09ac-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-jwgdc\" (UID: \"b2e813cc-e62f-44d2-a6cb-ede8248e09ac\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jwgdc" Apr 20 15:05:37.549886 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.549814 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/0525c669-b74e-43d2-a613-4a95d5c51bdd-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-7rp8b\" (UID: \"0525c669-b74e-43d2-a613-4a95d5c51bdd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7rp8b" Apr 20 15:05:37.549886 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.549849 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hslch\" (UniqueName: \"kubernetes.io/projected/0525c669-b74e-43d2-a613-4a95d5c51bdd-kube-api-access-hslch\") pod \"kube-state-metrics-69db897b98-7rp8b\" (UID: \"0525c669-b74e-43d2-a613-4a95d5c51bdd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7rp8b" Apr 20 15:05:37.549886 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.549877 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bbnl\" (UniqueName: \"kubernetes.io/projected/b2e813cc-e62f-44d2-a6cb-ede8248e09ac-kube-api-access-7bbnl\") pod \"openshift-state-metrics-9d44df66c-jwgdc\" (UID: \"b2e813cc-e62f-44d2-a6cb-ede8248e09ac\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jwgdc" Apr 20 15:05:37.550116 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.549980 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/0525c669-b74e-43d2-a613-4a95d5c51bdd-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-7rp8b\" (UID: \"0525c669-b74e-43d2-a613-4a95d5c51bdd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7rp8b" Apr 20 15:05:37.550116 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.550041 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b2e813cc-e62f-44d2-a6cb-ede8248e09ac-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-jwgdc\" (UID: \"b2e813cc-e62f-44d2-a6cb-ede8248e09ac\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jwgdc" Apr 20 15:05:37.550116 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.550073 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/0525c669-b74e-43d2-a613-4a95d5c51bdd-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-7rp8b\" (UID: \"0525c669-b74e-43d2-a613-4a95d5c51bdd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7rp8b" Apr 20 15:05:37.550116 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.550097 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0525c669-b74e-43d2-a613-4a95d5c51bdd-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-7rp8b\" (UID: \"0525c669-b74e-43d2-a613-4a95d5c51bdd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7rp8b" Apr 20 15:05:37.550357 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.550126 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0525c669-b74e-43d2-a613-4a95d5c51bdd-metrics-client-ca\") pod 
\"kube-state-metrics-69db897b98-7rp8b\" (UID: \"0525c669-b74e-43d2-a613-4a95d5c51bdd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7rp8b" Apr 20 15:05:37.651194 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.651137 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0525c669-b74e-43d2-a613-4a95d5c51bdd-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-7rp8b\" (UID: \"0525c669-b74e-43d2-a613-4a95d5c51bdd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7rp8b" Apr 20 15:05:37.651194 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.651200 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cb4d0256-b738-4dd1-bf9b-d24347291878-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gbvlq\" (UID: \"cb4d0256-b738-4dd1-bf9b-d24347291878\") " pod="openshift-monitoring/node-exporter-gbvlq" Apr 20 15:05:37.651482 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.651232 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6xbf\" (UniqueName: \"kubernetes.io/projected/cb4d0256-b738-4dd1-bf9b-d24347291878-kube-api-access-z6xbf\") pod \"node-exporter-gbvlq\" (UID: \"cb4d0256-b738-4dd1-bf9b-d24347291878\") " pod="openshift-monitoring/node-exporter-gbvlq" Apr 20 15:05:37.651482 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.651301 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b2e813cc-e62f-44d2-a6cb-ede8248e09ac-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-jwgdc\" (UID: \"b2e813cc-e62f-44d2-a6cb-ede8248e09ac\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jwgdc" Apr 20 15:05:37.651482 ip-10-0-133-198 
kubenswrapper[2575]: I0420 15:05:37.651327 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cb4d0256-b738-4dd1-bf9b-d24347291878-node-exporter-tls\") pod \"node-exporter-gbvlq\" (UID: \"cb4d0256-b738-4dd1-bf9b-d24347291878\") " pod="openshift-monitoring/node-exporter-gbvlq" Apr 20 15:05:37.651482 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:05:37.651335 2575 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 20 15:05:37.651482 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.651365 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/0525c669-b74e-43d2-a613-4a95d5c51bdd-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-7rp8b\" (UID: \"0525c669-b74e-43d2-a613-4a95d5c51bdd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7rp8b" Apr 20 15:05:37.651482 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.651395 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0525c669-b74e-43d2-a613-4a95d5c51bdd-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-7rp8b\" (UID: \"0525c669-b74e-43d2-a613-4a95d5c51bdd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7rp8b" Apr 20 15:05:37.651482 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:05:37.651424 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0525c669-b74e-43d2-a613-4a95d5c51bdd-kube-state-metrics-tls podName:0525c669-b74e-43d2-a613-4a95d5c51bdd nodeName:}" failed. No retries permitted until 2026-04-20 15:05:38.151402376 +0000 UTC m=+195.523989066 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/0525c669-b74e-43d2-a613-4a95d5c51bdd-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-7rp8b" (UID: "0525c669-b74e-43d2-a613-4a95d5c51bdd") : secret "kube-state-metrics-tls" not found Apr 20 15:05:37.651482 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.651464 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cb4d0256-b738-4dd1-bf9b-d24347291878-sys\") pod \"node-exporter-gbvlq\" (UID: \"cb4d0256-b738-4dd1-bf9b-d24347291878\") " pod="openshift-monitoring/node-exporter-gbvlq" Apr 20 15:05:37.651887 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.651495 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cb4d0256-b738-4dd1-bf9b-d24347291878-root\") pod \"node-exporter-gbvlq\" (UID: \"cb4d0256-b738-4dd1-bf9b-d24347291878\") " pod="openshift-monitoring/node-exporter-gbvlq" Apr 20 15:05:37.651887 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.651527 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cb4d0256-b738-4dd1-bf9b-d24347291878-node-exporter-textfile\") pod \"node-exporter-gbvlq\" (UID: \"cb4d0256-b738-4dd1-bf9b-d24347291878\") " pod="openshift-monitoring/node-exporter-gbvlq" Apr 20 15:05:37.651887 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.651557 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0525c669-b74e-43d2-a613-4a95d5c51bdd-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-7rp8b\" (UID: \"0525c669-b74e-43d2-a613-4a95d5c51bdd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7rp8b" Apr 20 15:05:37.651887 
ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.651595 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b2e813cc-e62f-44d2-a6cb-ede8248e09ac-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-jwgdc\" (UID: \"b2e813cc-e62f-44d2-a6cb-ede8248e09ac\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jwgdc" Apr 20 15:05:37.651887 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.651632 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b2e813cc-e62f-44d2-a6cb-ede8248e09ac-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-jwgdc\" (UID: \"b2e813cc-e62f-44d2-a6cb-ede8248e09ac\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jwgdc" Apr 20 15:05:37.651887 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.651667 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/cb4d0256-b738-4dd1-bf9b-d24347291878-node-exporter-accelerators-collector-config\") pod \"node-exporter-gbvlq\" (UID: \"cb4d0256-b738-4dd1-bf9b-d24347291878\") " pod="openshift-monitoring/node-exporter-gbvlq" Apr 20 15:05:37.651887 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.651700 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/cb4d0256-b738-4dd1-bf9b-d24347291878-node-exporter-wtmp\") pod \"node-exporter-gbvlq\" (UID: \"cb4d0256-b738-4dd1-bf9b-d24347291878\") " pod="openshift-monitoring/node-exporter-gbvlq" Apr 20 15:05:37.651887 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.651752 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cb4d0256-b738-4dd1-bf9b-d24347291878-metrics-client-ca\") pod \"node-exporter-gbvlq\" (UID: \"cb4d0256-b738-4dd1-bf9b-d24347291878\") " pod="openshift-monitoring/node-exporter-gbvlq" Apr 20 15:05:37.651887 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.651828 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/0525c669-b74e-43d2-a613-4a95d5c51bdd-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-7rp8b\" (UID: \"0525c669-b74e-43d2-a613-4a95d5c51bdd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7rp8b" Apr 20 15:05:37.651887 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.651869 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hslch\" (UniqueName: \"kubernetes.io/projected/0525c669-b74e-43d2-a613-4a95d5c51bdd-kube-api-access-hslch\") pod \"kube-state-metrics-69db897b98-7rp8b\" (UID: \"0525c669-b74e-43d2-a613-4a95d5c51bdd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7rp8b" Apr 20 15:05:37.652366 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.651908 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7bbnl\" (UniqueName: \"kubernetes.io/projected/b2e813cc-e62f-44d2-a6cb-ede8248e09ac-kube-api-access-7bbnl\") pod \"openshift-state-metrics-9d44df66c-jwgdc\" (UID: \"b2e813cc-e62f-44d2-a6cb-ede8248e09ac\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jwgdc" Apr 20 15:05:37.652366 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:05:37.652059 2575 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 20 15:05:37.652366 ip-10-0-133-198 kubenswrapper[2575]: 
E0420 15:05:37.652104 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2e813cc-e62f-44d2-a6cb-ede8248e09ac-openshift-state-metrics-tls podName:b2e813cc-e62f-44d2-a6cb-ede8248e09ac nodeName:}" failed. No retries permitted until 2026-04-20 15:05:38.152089682 +0000 UTC m=+195.524676372 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/b2e813cc-e62f-44d2-a6cb-ede8248e09ac-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-jwgdc" (UID: "b2e813cc-e62f-44d2-a6cb-ede8248e09ac") : secret "openshift-state-metrics-tls" not found Apr 20 15:05:37.652839 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.652819 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/0525c669-b74e-43d2-a613-4a95d5c51bdd-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-7rp8b\" (UID: \"0525c669-b74e-43d2-a613-4a95d5c51bdd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7rp8b" Apr 20 15:05:37.653572 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.653547 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0525c669-b74e-43d2-a613-4a95d5c51bdd-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-7rp8b\" (UID: \"0525c669-b74e-43d2-a613-4a95d5c51bdd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7rp8b" Apr 20 15:05:37.654407 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.654233 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/0525c669-b74e-43d2-a613-4a95d5c51bdd-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-7rp8b\" (UID: \"0525c669-b74e-43d2-a613-4a95d5c51bdd\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-7rp8b" Apr 20 15:05:37.654511 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.654434 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b2e813cc-e62f-44d2-a6cb-ede8248e09ac-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-jwgdc\" (UID: \"b2e813cc-e62f-44d2-a6cb-ede8248e09ac\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jwgdc" Apr 20 15:05:37.654878 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.654837 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b2e813cc-e62f-44d2-a6cb-ede8248e09ac-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-jwgdc\" (UID: \"b2e813cc-e62f-44d2-a6cb-ede8248e09ac\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jwgdc" Apr 20 15:05:37.655134 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.655114 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0525c669-b74e-43d2-a613-4a95d5c51bdd-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-7rp8b\" (UID: \"0525c669-b74e-43d2-a613-4a95d5c51bdd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7rp8b" Apr 20 15:05:37.661500 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.661473 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bbnl\" (UniqueName: \"kubernetes.io/projected/b2e813cc-e62f-44d2-a6cb-ede8248e09ac-kube-api-access-7bbnl\") pod \"openshift-state-metrics-9d44df66c-jwgdc\" (UID: \"b2e813cc-e62f-44d2-a6cb-ede8248e09ac\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jwgdc" Apr 20 15:05:37.662380 ip-10-0-133-198 kubenswrapper[2575]: I0420 
15:05:37.662347 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hslch\" (UniqueName: \"kubernetes.io/projected/0525c669-b74e-43d2-a613-4a95d5c51bdd-kube-api-access-hslch\") pod \"kube-state-metrics-69db897b98-7rp8b\" (UID: \"0525c669-b74e-43d2-a613-4a95d5c51bdd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7rp8b" Apr 20 15:05:37.752656 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.752559 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cb4d0256-b738-4dd1-bf9b-d24347291878-sys\") pod \"node-exporter-gbvlq\" (UID: \"cb4d0256-b738-4dd1-bf9b-d24347291878\") " pod="openshift-monitoring/node-exporter-gbvlq" Apr 20 15:05:37.752656 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.752620 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cb4d0256-b738-4dd1-bf9b-d24347291878-root\") pod \"node-exporter-gbvlq\" (UID: \"cb4d0256-b738-4dd1-bf9b-d24347291878\") " pod="openshift-monitoring/node-exporter-gbvlq" Apr 20 15:05:37.753164 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.752663 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cb4d0256-b738-4dd1-bf9b-d24347291878-node-exporter-textfile\") pod \"node-exporter-gbvlq\" (UID: \"cb4d0256-b738-4dd1-bf9b-d24347291878\") " pod="openshift-monitoring/node-exporter-gbvlq" Apr 20 15:05:37.753164 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.752707 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cb4d0256-b738-4dd1-bf9b-d24347291878-root\") pod \"node-exporter-gbvlq\" (UID: \"cb4d0256-b738-4dd1-bf9b-d24347291878\") " pod="openshift-monitoring/node-exporter-gbvlq" Apr 20 15:05:37.753164 ip-10-0-133-198 kubenswrapper[2575]: I0420 
15:05:37.752739 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/cb4d0256-b738-4dd1-bf9b-d24347291878-node-exporter-accelerators-collector-config\") pod \"node-exporter-gbvlq\" (UID: \"cb4d0256-b738-4dd1-bf9b-d24347291878\") " pod="openshift-monitoring/node-exporter-gbvlq" Apr 20 15:05:37.753164 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.752772 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/cb4d0256-b738-4dd1-bf9b-d24347291878-node-exporter-wtmp\") pod \"node-exporter-gbvlq\" (UID: \"cb4d0256-b738-4dd1-bf9b-d24347291878\") " pod="openshift-monitoring/node-exporter-gbvlq" Apr 20 15:05:37.753164 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.752892 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/cb4d0256-b738-4dd1-bf9b-d24347291878-node-exporter-wtmp\") pod \"node-exporter-gbvlq\" (UID: \"cb4d0256-b738-4dd1-bf9b-d24347291878\") " pod="openshift-monitoring/node-exporter-gbvlq" Apr 20 15:05:37.753164 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.752923 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cb4d0256-b738-4dd1-bf9b-d24347291878-metrics-client-ca\") pod \"node-exporter-gbvlq\" (UID: \"cb4d0256-b738-4dd1-bf9b-d24347291878\") " pod="openshift-monitoring/node-exporter-gbvlq" Apr 20 15:05:37.753164 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.752993 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cb4d0256-b738-4dd1-bf9b-d24347291878-node-exporter-textfile\") pod \"node-exporter-gbvlq\" (UID: \"cb4d0256-b738-4dd1-bf9b-d24347291878\") " 
pod="openshift-monitoring/node-exporter-gbvlq" Apr 20 15:05:37.753164 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.753076 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cb4d0256-b738-4dd1-bf9b-d24347291878-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gbvlq\" (UID: \"cb4d0256-b738-4dd1-bf9b-d24347291878\") " pod="openshift-monitoring/node-exporter-gbvlq" Apr 20 15:05:37.753164 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.753127 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z6xbf\" (UniqueName: \"kubernetes.io/projected/cb4d0256-b738-4dd1-bf9b-d24347291878-kube-api-access-z6xbf\") pod \"node-exporter-gbvlq\" (UID: \"cb4d0256-b738-4dd1-bf9b-d24347291878\") " pod="openshift-monitoring/node-exporter-gbvlq" Apr 20 15:05:37.753639 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.753365 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cb4d0256-b738-4dd1-bf9b-d24347291878-node-exporter-tls\") pod \"node-exporter-gbvlq\" (UID: \"cb4d0256-b738-4dd1-bf9b-d24347291878\") " pod="openshift-monitoring/node-exporter-gbvlq" Apr 20 15:05:37.753639 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.752665 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cb4d0256-b738-4dd1-bf9b-d24347291878-sys\") pod \"node-exporter-gbvlq\" (UID: \"cb4d0256-b738-4dd1-bf9b-d24347291878\") " pod="openshift-monitoring/node-exporter-gbvlq" Apr 20 15:05:37.753639 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.753365 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/cb4d0256-b738-4dd1-bf9b-d24347291878-node-exporter-accelerators-collector-config\") pod \"node-exporter-gbvlq\" (UID: \"cb4d0256-b738-4dd1-bf9b-d24347291878\") " pod="openshift-monitoring/node-exporter-gbvlq" Apr 20 15:05:37.753639 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.753490 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cb4d0256-b738-4dd1-bf9b-d24347291878-metrics-client-ca\") pod \"node-exporter-gbvlq\" (UID: \"cb4d0256-b738-4dd1-bf9b-d24347291878\") " pod="openshift-monitoring/node-exporter-gbvlq" Apr 20 15:05:37.755354 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.755334 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cb4d0256-b738-4dd1-bf9b-d24347291878-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gbvlq\" (UID: \"cb4d0256-b738-4dd1-bf9b-d24347291878\") " pod="openshift-monitoring/node-exporter-gbvlq" Apr 20 15:05:37.755613 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.755592 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cb4d0256-b738-4dd1-bf9b-d24347291878-node-exporter-tls\") pod \"node-exporter-gbvlq\" (UID: \"cb4d0256-b738-4dd1-bf9b-d24347291878\") " pod="openshift-monitoring/node-exporter-gbvlq" Apr 20 15:05:37.762244 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.762224 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6xbf\" (UniqueName: \"kubernetes.io/projected/cb4d0256-b738-4dd1-bf9b-d24347291878-kube-api-access-z6xbf\") pod \"node-exporter-gbvlq\" (UID: \"cb4d0256-b738-4dd1-bf9b-d24347291878\") " pod="openshift-monitoring/node-exporter-gbvlq" Apr 20 15:05:37.842299 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:37.842240 2575 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-monitoring/node-exporter-gbvlq" Apr 20 15:05:37.852187 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:05:37.852149 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb4d0256_b738_4dd1_bf9b_d24347291878.slice/crio-0277ec7d41e850eacefa5c8fc799325f67a19b62fb74c88502078161353c61b7 WatchSource:0}: Error finding container 0277ec7d41e850eacefa5c8fc799325f67a19b62fb74c88502078161353c61b7: Status 404 returned error can't find the container with id 0277ec7d41e850eacefa5c8fc799325f67a19b62fb74c88502078161353c61b7 Apr 20 15:05:38.156185 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.156147 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0525c669-b74e-43d2-a613-4a95d5c51bdd-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-7rp8b\" (UID: \"0525c669-b74e-43d2-a613-4a95d5c51bdd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7rp8b" Apr 20 15:05:38.156385 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.156232 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b2e813cc-e62f-44d2-a6cb-ede8248e09ac-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-jwgdc\" (UID: \"b2e813cc-e62f-44d2-a6cb-ede8248e09ac\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jwgdc" Apr 20 15:05:38.158679 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.158646 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b2e813cc-e62f-44d2-a6cb-ede8248e09ac-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-jwgdc\" (UID: \"b2e813cc-e62f-44d2-a6cb-ede8248e09ac\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jwgdc" Apr 20 15:05:38.158821 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.158736 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0525c669-b74e-43d2-a613-4a95d5c51bdd-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-7rp8b\" (UID: \"0525c669-b74e-43d2-a613-4a95d5c51bdd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7rp8b" Apr 20 15:05:38.409665 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.409570 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jwgdc" Apr 20 15:05:38.428523 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.428490 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-7rp8b" Apr 20 15:05:38.592688 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.592653 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 15:05:38.597400 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.596383 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:38.599648 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.599596 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 20 15:05:38.599870 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.599842 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 20 15:05:38.600791 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.600163 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 20 15:05:38.600791 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.600651 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 20 15:05:38.603527 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.601189 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 20 15:05:38.603527 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.601427 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-xj8z4\"" Apr 20 15:05:38.603527 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.601639 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 20 15:05:38.603527 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.601851 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 20 15:05:38.603527 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.602025 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 20 15:05:38.603527 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.602298 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 20 15:05:38.616602 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.615417 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-797c758f48-6r26r" Apr 20 15:05:38.616602 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.616400 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-797c758f48-6r26r" Apr 20 15:05:38.618818 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.618133 2575 patch_prober.go:28] interesting pod/console-797c758f48-6r26r container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.133.0.15:8443/health\": dial tcp 10.133.0.15:8443: connect: connection refused" start-of-body= Apr 20 15:05:38.618818 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.618193 2575 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-797c758f48-6r26r" podUID="72600b94-c8a8-4eef-900d-a1f3ec8450e8" containerName="console" probeResult="failure" output="Get \"https://10.133.0.15:8443/health\": dial tcp 10.133.0.15:8443: connect: connection refused" Apr 20 15:05:38.620534 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.619985 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 15:05:38.662147 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.660811 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f34f657e-9dd8-45c3-8621-d8e62ad289fc-secret-alertmanager-kube-rbac-proxy-metric\") pod 
\"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:38.662147 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.660913 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f34f657e-9dd8-45c3-8621-d8e62ad289fc-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:38.662147 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.660955 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f34f657e-9dd8-45c3-8621-d8e62ad289fc-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:38.662147 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.661003 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f34f657e-9dd8-45c3-8621-d8e62ad289fc-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:38.662147 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.661028 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f34f657e-9dd8-45c3-8621-d8e62ad289fc-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:38.662147 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.661055 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn956\" (UniqueName: \"kubernetes.io/projected/f34f657e-9dd8-45c3-8621-d8e62ad289fc-kube-api-access-jn956\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:38.662147 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.661093 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f34f657e-9dd8-45c3-8621-d8e62ad289fc-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:38.662147 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.661115 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f34f657e-9dd8-45c3-8621-d8e62ad289fc-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:38.662147 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.661156 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f34f657e-9dd8-45c3-8621-d8e62ad289fc-config-out\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:38.662147 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.661183 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f34f657e-9dd8-45c3-8621-d8e62ad289fc-config-volume\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:38.662147 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.661258 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f34f657e-9dd8-45c3-8621-d8e62ad289fc-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:38.662147 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.661311 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f34f657e-9dd8-45c3-8621-d8e62ad289fc-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:38.662147 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.661391 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f34f657e-9dd8-45c3-8621-d8e62ad289fc-web-config\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:38.690207 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.690170 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gbvlq" event={"ID":"cb4d0256-b738-4dd1-bf9b-d24347291878","Type":"ContainerStarted","Data":"0277ec7d41e850eacefa5c8fc799325f67a19b62fb74c88502078161353c61b7"} Apr 20 15:05:38.730928 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.730863 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-7rp8b"] Apr 20 15:05:38.736626 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:05:38.736573 2575 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0525c669_b74e_43d2_a613_4a95d5c51bdd.slice/crio-7f37cac180ce1dc5219d785ef2a068cc0610faab486e93479b8466dab77ca5e7 WatchSource:0}: Error finding container 7f37cac180ce1dc5219d785ef2a068cc0610faab486e93479b8466dab77ca5e7: Status 404 returned error can't find the container with id 7f37cac180ce1dc5219d785ef2a068cc0610faab486e93479b8466dab77ca5e7 Apr 20 15:05:38.755420 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.755386 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-jwgdc"] Apr 20 15:05:38.760218 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:05:38.760177 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2e813cc_e62f_44d2_a6cb_ede8248e09ac.slice/crio-a11553f7969245533a3caea68aba6942d3a12e8d4e6520d073017fc7759c3df5 WatchSource:0}: Error finding container a11553f7969245533a3caea68aba6942d3a12e8d4e6520d073017fc7759c3df5: Status 404 returned error can't find the container with id a11553f7969245533a3caea68aba6942d3a12e8d4e6520d073017fc7759c3df5 Apr 20 15:05:38.761926 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.761886 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f34f657e-9dd8-45c3-8621-d8e62ad289fc-web-config\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:38.762041 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.761949 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f34f657e-9dd8-45c3-8621-d8e62ad289fc-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: 
\"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:38.762141 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.762052 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f34f657e-9dd8-45c3-8621-d8e62ad289fc-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:38.762141 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.762091 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f34f657e-9dd8-45c3-8621-d8e62ad289fc-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:38.762717 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.762266 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f34f657e-9dd8-45c3-8621-d8e62ad289fc-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:38.762717 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.762356 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f34f657e-9dd8-45c3-8621-d8e62ad289fc-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:38.762717 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.762386 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jn956\" (UniqueName: 
\"kubernetes.io/projected/f34f657e-9dd8-45c3-8621-d8e62ad289fc-kube-api-access-jn956\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:38.762717 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:05:38.762502 2575 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 20 15:05:38.762717 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:05:38.762575 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f34f657e-9dd8-45c3-8621-d8e62ad289fc-secret-alertmanager-main-tls podName:f34f657e-9dd8-45c3-8621-d8e62ad289fc nodeName:}" failed. No retries permitted until 2026-04-20 15:05:39.262555038 +0000 UTC m=+196.635141727 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/f34f657e-9dd8-45c3-8621-d8e62ad289fc-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "f34f657e-9dd8-45c3-8621-d8e62ad289fc") : secret "alertmanager-main-tls" not found Apr 20 15:05:38.763260 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.763120 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f34f657e-9dd8-45c3-8621-d8e62ad289fc-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:38.763260 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.763158 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f34f657e-9dd8-45c3-8621-d8e62ad289fc-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:38.763260 ip-10-0-133-198 
kubenswrapper[2575]: I0420 15:05:38.763211 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f34f657e-9dd8-45c3-8621-d8e62ad289fc-config-out\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:38.763260 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.763243 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f34f657e-9dd8-45c3-8621-d8e62ad289fc-config-volume\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:38.763501 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.763291 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f34f657e-9dd8-45c3-8621-d8e62ad289fc-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:38.763501 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.763304 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f34f657e-9dd8-45c3-8621-d8e62ad289fc-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:38.763501 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.763333 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f34f657e-9dd8-45c3-8621-d8e62ad289fc-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:38.763652 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.763579 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f34f657e-9dd8-45c3-8621-d8e62ad289fc-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:38.764098 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:05:38.764081 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f34f657e-9dd8-45c3-8621-d8e62ad289fc-alertmanager-trusted-ca-bundle podName:f34f657e-9dd8-45c3-8621-d8e62ad289fc nodeName:}" failed. No retries permitted until 2026-04-20 15:05:39.264061434 +0000 UTC m=+196.636648121 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/f34f657e-9dd8-45c3-8621-d8e62ad289fc-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "f34f657e-9dd8-45c3-8621-d8e62ad289fc") : configmap references non-existent config key: ca-bundle.crt Apr 20 15:05:38.766314 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.766291 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f34f657e-9dd8-45c3-8621-d8e62ad289fc-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:38.767176 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.767107 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f34f657e-9dd8-45c3-8621-d8e62ad289fc-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:38.767387 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.767364 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f34f657e-9dd8-45c3-8621-d8e62ad289fc-config-volume\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:38.767941 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.767918 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f34f657e-9dd8-45c3-8621-d8e62ad289fc-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:38.768224 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.768185 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f34f657e-9dd8-45c3-8621-d8e62ad289fc-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:38.768580 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.768504 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f34f657e-9dd8-45c3-8621-d8e62ad289fc-config-out\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:38.768654 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.768615 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f34f657e-9dd8-45c3-8621-d8e62ad289fc-secret-alertmanager-kube-rbac-proxy-metric\") pod 
\"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:38.769534 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.769511 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f34f657e-9dd8-45c3-8621-d8e62ad289fc-web-config\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:38.771903 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:38.771868 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn956\" (UniqueName: \"kubernetes.io/projected/f34f657e-9dd8-45c3-8621-d8e62ad289fc-kube-api-access-jn956\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:39.268767 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.268670 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f34f657e-9dd8-45c3-8621-d8e62ad289fc-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:39.268950 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.268854 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f34f657e-9dd8-45c3-8621-d8e62ad289fc-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:39.269244 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:05:39.269115 2575 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 20 15:05:39.269244 
ip-10-0-133-198 kubenswrapper[2575]: E0420 15:05:39.269205 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f34f657e-9dd8-45c3-8621-d8e62ad289fc-secret-alertmanager-main-tls podName:f34f657e-9dd8-45c3-8621-d8e62ad289fc nodeName:}" failed. No retries permitted until 2026-04-20 15:05:40.269181127 +0000 UTC m=+197.641767809 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/f34f657e-9dd8-45c3-8621-d8e62ad289fc-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "f34f657e-9dd8-45c3-8621-d8e62ad289fc") : secret "alertmanager-main-tls" not found Apr 20 15:05:39.270048 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.270024 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f34f657e-9dd8-45c3-8621-d8e62ad289fc-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:39.580602 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.580569 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb"] Apr 20 15:05:39.583290 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.583255 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb" Apr 20 15:05:39.586063 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.586039 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 20 15:05:39.586189 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.586166 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 20 15:05:39.586247 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.586210 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 20 15:05:39.586343 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.586328 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-qv76r\"" Apr 20 15:05:39.586411 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.586395 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-3e7m07lsm9t7j\"" Apr 20 15:05:39.586474 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.586435 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 20 15:05:39.586525 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.586470 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 20 15:05:39.597021 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.596993 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb"] Apr 20 15:05:39.672518 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.672334 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e9beb5ce-32fe-4694-9470-0cd472f5523d-secret-grpc-tls\") pod \"thanos-querier-7bd788d4ff-jmqxb\" (UID: \"e9beb5ce-32fe-4694-9470-0cd472f5523d\") " pod="openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb" Apr 20 15:05:39.672518 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.672402 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/e9beb5ce-32fe-4694-9470-0cd472f5523d-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7bd788d4ff-jmqxb\" (UID: \"e9beb5ce-32fe-4694-9470-0cd472f5523d\") " pod="openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb" Apr 20 15:05:39.672518 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.672478 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/e9beb5ce-32fe-4694-9470-0cd472f5523d-secret-thanos-querier-tls\") pod \"thanos-querier-7bd788d4ff-jmqxb\" (UID: \"e9beb5ce-32fe-4694-9470-0cd472f5523d\") " pod="openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb" Apr 20 15:05:39.672518 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.672523 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/e9beb5ce-32fe-4694-9470-0cd472f5523d-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7bd788d4ff-jmqxb\" (UID: \"e9beb5ce-32fe-4694-9470-0cd472f5523d\") " pod="openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb" Apr 20 15:05:39.672857 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.672598 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-psq6c\" (UniqueName: \"kubernetes.io/projected/e9beb5ce-32fe-4694-9470-0cd472f5523d-kube-api-access-psq6c\") pod \"thanos-querier-7bd788d4ff-jmqxb\" (UID: \"e9beb5ce-32fe-4694-9470-0cd472f5523d\") " pod="openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb" Apr 20 15:05:39.672857 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.672696 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e9beb5ce-32fe-4694-9470-0cd472f5523d-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7bd788d4ff-jmqxb\" (UID: \"e9beb5ce-32fe-4694-9470-0cd472f5523d\") " pod="openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb" Apr 20 15:05:39.672857 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.672742 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e9beb5ce-32fe-4694-9470-0cd472f5523d-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7bd788d4ff-jmqxb\" (UID: \"e9beb5ce-32fe-4694-9470-0cd472f5523d\") " pod="openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb" Apr 20 15:05:39.672857 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.672787 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e9beb5ce-32fe-4694-9470-0cd472f5523d-metrics-client-ca\") pod \"thanos-querier-7bd788d4ff-jmqxb\" (UID: \"e9beb5ce-32fe-4694-9470-0cd472f5523d\") " pod="openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb" Apr 20 15:05:39.694251 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.694218 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jwgdc" 
event={"ID":"b2e813cc-e62f-44d2-a6cb-ede8248e09ac","Type":"ContainerStarted","Data":"679d6f637d6e5f0df0724990b676a310e371c194bbb4defcad1d361529ec450b"} Apr 20 15:05:39.694251 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.694257 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jwgdc" event={"ID":"b2e813cc-e62f-44d2-a6cb-ede8248e09ac","Type":"ContainerStarted","Data":"126be4555636dbf223cbd3be33ee09b1dd07010fc573c1e604584611328223d8"} Apr 20 15:05:39.694472 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.694267 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jwgdc" event={"ID":"b2e813cc-e62f-44d2-a6cb-ede8248e09ac","Type":"ContainerStarted","Data":"a11553f7969245533a3caea68aba6942d3a12e8d4e6520d073017fc7759c3df5"} Apr 20 15:05:39.695806 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.695781 2575 generic.go:358] "Generic (PLEG): container finished" podID="cb4d0256-b738-4dd1-bf9b-d24347291878" containerID="14f7566377a8e051ee596a621e7a5dfe8a6f37e3ee3e22666927d6f409250e3e" exitCode=0 Apr 20 15:05:39.695922 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.695810 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gbvlq" event={"ID":"cb4d0256-b738-4dd1-bf9b-d24347291878","Type":"ContainerDied","Data":"14f7566377a8e051ee596a621e7a5dfe8a6f37e3ee3e22666927d6f409250e3e"} Apr 20 15:05:39.696993 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.696967 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-7rp8b" event={"ID":"0525c669-b74e-43d2-a613-4a95d5c51bdd","Type":"ContainerStarted","Data":"7f37cac180ce1dc5219d785ef2a068cc0610faab486e93479b8466dab77ca5e7"} Apr 20 15:05:39.773924 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.773872 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-psq6c\" (UniqueName: \"kubernetes.io/projected/e9beb5ce-32fe-4694-9470-0cd472f5523d-kube-api-access-psq6c\") pod \"thanos-querier-7bd788d4ff-jmqxb\" (UID: \"e9beb5ce-32fe-4694-9470-0cd472f5523d\") " pod="openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb" Apr 20 15:05:39.774416 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.773941 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e9beb5ce-32fe-4694-9470-0cd472f5523d-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7bd788d4ff-jmqxb\" (UID: \"e9beb5ce-32fe-4694-9470-0cd472f5523d\") " pod="openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb" Apr 20 15:05:39.774416 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.773973 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e9beb5ce-32fe-4694-9470-0cd472f5523d-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7bd788d4ff-jmqxb\" (UID: \"e9beb5ce-32fe-4694-9470-0cd472f5523d\") " pod="openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb" Apr 20 15:05:39.774416 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.774031 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e9beb5ce-32fe-4694-9470-0cd472f5523d-metrics-client-ca\") pod \"thanos-querier-7bd788d4ff-jmqxb\" (UID: \"e9beb5ce-32fe-4694-9470-0cd472f5523d\") " pod="openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb" Apr 20 15:05:39.774416 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.774114 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e9beb5ce-32fe-4694-9470-0cd472f5523d-secret-grpc-tls\") pod \"thanos-querier-7bd788d4ff-jmqxb\" (UID: 
\"e9beb5ce-32fe-4694-9470-0cd472f5523d\") " pod="openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb" Apr 20 15:05:39.774416 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.774333 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/e9beb5ce-32fe-4694-9470-0cd472f5523d-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7bd788d4ff-jmqxb\" (UID: \"e9beb5ce-32fe-4694-9470-0cd472f5523d\") " pod="openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb" Apr 20 15:05:39.774416 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.774405 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/e9beb5ce-32fe-4694-9470-0cd472f5523d-secret-thanos-querier-tls\") pod \"thanos-querier-7bd788d4ff-jmqxb\" (UID: \"e9beb5ce-32fe-4694-9470-0cd472f5523d\") " pod="openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb" Apr 20 15:05:39.774696 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.774446 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/e9beb5ce-32fe-4694-9470-0cd472f5523d-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7bd788d4ff-jmqxb\" (UID: \"e9beb5ce-32fe-4694-9470-0cd472f5523d\") " pod="openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb" Apr 20 15:05:39.774875 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.774854 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e9beb5ce-32fe-4694-9470-0cd472f5523d-metrics-client-ca\") pod \"thanos-querier-7bd788d4ff-jmqxb\" (UID: \"e9beb5ce-32fe-4694-9470-0cd472f5523d\") " pod="openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb" Apr 20 15:05:39.777296 ip-10-0-133-198 kubenswrapper[2575]: I0420 
15:05:39.777252 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e9beb5ce-32fe-4694-9470-0cd472f5523d-secret-grpc-tls\") pod \"thanos-querier-7bd788d4ff-jmqxb\" (UID: \"e9beb5ce-32fe-4694-9470-0cd472f5523d\") " pod="openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb" Apr 20 15:05:39.777422 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.777348 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/e9beb5ce-32fe-4694-9470-0cd472f5523d-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7bd788d4ff-jmqxb\" (UID: \"e9beb5ce-32fe-4694-9470-0cd472f5523d\") " pod="openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb" Apr 20 15:05:39.777805 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.777781 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/e9beb5ce-32fe-4694-9470-0cd472f5523d-secret-thanos-querier-tls\") pod \"thanos-querier-7bd788d4ff-jmqxb\" (UID: \"e9beb5ce-32fe-4694-9470-0cd472f5523d\") " pod="openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb" Apr 20 15:05:39.778045 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.778022 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/e9beb5ce-32fe-4694-9470-0cd472f5523d-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7bd788d4ff-jmqxb\" (UID: \"e9beb5ce-32fe-4694-9470-0cd472f5523d\") " pod="openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb" Apr 20 15:05:39.778146 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.778125 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/e9beb5ce-32fe-4694-9470-0cd472f5523d-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7bd788d4ff-jmqxb\" (UID: \"e9beb5ce-32fe-4694-9470-0cd472f5523d\") " pod="openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb" Apr 20 15:05:39.778210 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.778130 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e9beb5ce-32fe-4694-9470-0cd472f5523d-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7bd788d4ff-jmqxb\" (UID: \"e9beb5ce-32fe-4694-9470-0cd472f5523d\") " pod="openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb" Apr 20 15:05:39.781971 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.781947 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-psq6c\" (UniqueName: \"kubernetes.io/projected/e9beb5ce-32fe-4694-9470-0cd472f5523d-kube-api-access-psq6c\") pod \"thanos-querier-7bd788d4ff-jmqxb\" (UID: \"e9beb5ce-32fe-4694-9470-0cd472f5523d\") " pod="openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb" Apr 20 15:05:39.893089 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:39.893052 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb" Apr 20 15:05:40.050840 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:40.050806 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb"] Apr 20 15:05:40.146942 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:05:40.146845 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9beb5ce_32fe_4694_9470_0cd472f5523d.slice/crio-960e46f71406a3dcfa7e5d0d9126aa764ae9d61cb4e73231334d75cbd4fa0a17 WatchSource:0}: Error finding container 960e46f71406a3dcfa7e5d0d9126aa764ae9d61cb4e73231334d75cbd4fa0a17: Status 404 returned error can't find the container with id 960e46f71406a3dcfa7e5d0d9126aa764ae9d61cb4e73231334d75cbd4fa0a17 Apr 20 15:05:40.279163 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:40.278935 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f34f657e-9dd8-45c3-8621-d8e62ad289fc-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:40.283170 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:40.282367 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f34f657e-9dd8-45c3-8621-d8e62ad289fc-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f34f657e-9dd8-45c3-8621-d8e62ad289fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:40.414706 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:40.414613 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 15:05:40.563737 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:40.563701 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 15:05:40.567103 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:05:40.567069 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf34f657e_9dd8_45c3_8621_d8e62ad289fc.slice/crio-9b4867f6615a89c2aadfd5c315ef5b28f2b70743cbe238dfa31a8fd96f267f96 WatchSource:0}: Error finding container 9b4867f6615a89c2aadfd5c315ef5b28f2b70743cbe238dfa31a8fd96f267f96: Status 404 returned error can't find the container with id 9b4867f6615a89c2aadfd5c315ef5b28f2b70743cbe238dfa31a8fd96f267f96 Apr 20 15:05:40.706344 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:40.706242 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jwgdc" event={"ID":"b2e813cc-e62f-44d2-a6cb-ede8248e09ac","Type":"ContainerStarted","Data":"e697c0b80e626cc56b6b023a580f8515bce70f9b9fb0d87be62d0a2608c8eeee"} Apr 20 15:05:40.708166 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:40.708132 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gbvlq" event={"ID":"cb4d0256-b738-4dd1-bf9b-d24347291878","Type":"ContainerStarted","Data":"ba74bcc0ec2ffb1c9b2bbf8956e65ba4867f4d1be29ba91b9a4d7e4dc0ce1e28"} Apr 20 15:05:40.708166 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:40.708164 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gbvlq" event={"ID":"cb4d0256-b738-4dd1-bf9b-d24347291878","Type":"ContainerStarted","Data":"98588b5dcd623c123d0259ff2179e0f4081c5f03dc332eaa5dc08e30a809224e"} Apr 20 15:05:40.710016 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:40.709990 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/kube-state-metrics-69db897b98-7rp8b" event={"ID":"0525c669-b74e-43d2-a613-4a95d5c51bdd","Type":"ContainerStarted","Data":"78fb1e9c2f5dff3ee6560e8544cb08a97c9d9eb8b7db387f46c7a10f99663a46"} Apr 20 15:05:40.710132 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:40.710023 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-7rp8b" event={"ID":"0525c669-b74e-43d2-a613-4a95d5c51bdd","Type":"ContainerStarted","Data":"d8c8cfa46cf1f466f1f5b805a6ebbd7f75cd222df024b4eaeee3d4513f0e5a72"} Apr 20 15:05:40.710132 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:40.710035 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-7rp8b" event={"ID":"0525c669-b74e-43d2-a613-4a95d5c51bdd","Type":"ContainerStarted","Data":"478c79c9c947955cc05ea233c5708b2a9160bec5d0425046e80cde3261980cc0"} Apr 20 15:05:40.711002 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:40.710982 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f34f657e-9dd8-45c3-8621-d8e62ad289fc","Type":"ContainerStarted","Data":"9b4867f6615a89c2aadfd5c315ef5b28f2b70743cbe238dfa31a8fd96f267f96"} Apr 20 15:05:40.711972 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:40.711954 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb" event={"ID":"e9beb5ce-32fe-4694-9470-0cd472f5523d","Type":"ContainerStarted","Data":"960e46f71406a3dcfa7e5d0d9126aa764ae9d61cb4e73231334d75cbd4fa0a17"} Apr 20 15:05:40.724869 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:40.724827 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jwgdc" podStartSLOduration=2.464703261 podStartE2EDuration="3.724815832s" podCreationTimestamp="2026-04-20 15:05:37 +0000 UTC" firstStartedPulling="2026-04-20 15:05:38.91920657 
+0000 UTC m=+196.291793252" lastFinishedPulling="2026-04-20 15:05:40.179319141 +0000 UTC m=+197.551905823" observedRunningTime="2026-04-20 15:05:40.72455817 +0000 UTC m=+198.097144874" watchObservedRunningTime="2026-04-20 15:05:40.724815832 +0000 UTC m=+198.097402533" Apr 20 15:05:40.747460 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:40.747412 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-7rp8b" podStartSLOduration=2.567191014 podStartE2EDuration="3.747397372s" podCreationTimestamp="2026-04-20 15:05:37 +0000 UTC" firstStartedPulling="2026-04-20 15:05:38.739110735 +0000 UTC m=+196.111697420" lastFinishedPulling="2026-04-20 15:05:39.919317088 +0000 UTC m=+197.291903778" observedRunningTime="2026-04-20 15:05:40.745896803 +0000 UTC m=+198.118483518" watchObservedRunningTime="2026-04-20 15:05:40.747397372 +0000 UTC m=+198.119984075" Apr 20 15:05:40.776929 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:40.776866 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-gbvlq" podStartSLOduration=3.028606967 podStartE2EDuration="3.77684635s" podCreationTimestamp="2026-04-20 15:05:37 +0000 UTC" firstStartedPulling="2026-04-20 15:05:37.853697104 +0000 UTC m=+195.226283801" lastFinishedPulling="2026-04-20 15:05:38.601936484 +0000 UTC m=+195.974523184" observedRunningTime="2026-04-20 15:05:40.775414057 +0000 UTC m=+198.148000763" watchObservedRunningTime="2026-04-20 15:05:40.77684635 +0000 UTC m=+198.149433055" Apr 20 15:05:42.719171 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.719131 2575 generic.go:358] "Generic (PLEG): container finished" podID="f34f657e-9dd8-45c3-8621-d8e62ad289fc" containerID="966b1ca107eb1b6639ef7ce158aac894c1dc35cfa6e00fe2cad527616e518041" exitCode=0 Apr 20 15:05:42.719652 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.719223 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f34f657e-9dd8-45c3-8621-d8e62ad289fc","Type":"ContainerDied","Data":"966b1ca107eb1b6639ef7ce158aac894c1dc35cfa6e00fe2cad527616e518041"} Apr 20 15:05:42.721325 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.721301 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb" event={"ID":"e9beb5ce-32fe-4694-9470-0cd472f5523d","Type":"ContainerStarted","Data":"14be3a161461b56c7bb7e84f5eeed1e8ade631ddd6823bb2e4e154c63e4ed8b9"} Apr 20 15:05:42.721418 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.721336 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb" event={"ID":"e9beb5ce-32fe-4694-9470-0cd472f5523d","Type":"ContainerStarted","Data":"7cdcf7e2e22426778bd877912b73cbcda0775e7478f2a1b2538bfd0afe8ee3ca"} Apr 20 15:05:42.721418 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.721350 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb" event={"ID":"e9beb5ce-32fe-4694-9470-0cd472f5523d","Type":"ContainerStarted","Data":"8a4d9b131675b6a37c1a7f8c23d0d872426f62de57afd1cb16a94f2fe470e91b"} Apr 20 15:05:42.732498 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.732469 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-7c58c6bbd9-4gmg2"] Apr 20 15:05:42.735113 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.735089 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-7c58c6bbd9-4gmg2"
Apr 20 15:05:42.738246 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.738157 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 20 15:05:42.738246 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.738177 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 20 15:05:42.738246 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.738235 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-6m68v\""
Apr 20 15:05:42.738502 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.738156 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 20 15:05:42.738502 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.738181 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 20 15:05:42.738502 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.738367 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 20 15:05:42.742737 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.742716 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 20 15:05:42.752136 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.752098 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-7c58c6bbd9-4gmg2"]
Apr 20 15:05:42.803643 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.803600 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/550f14b6-7142-4cc6-958e-fcf427dd9fe0-serving-certs-ca-bundle\") pod \"telemeter-client-7c58c6bbd9-4gmg2\" (UID: \"550f14b6-7142-4cc6-958e-fcf427dd9fe0\") " pod="openshift-monitoring/telemeter-client-7c58c6bbd9-4gmg2"
Apr 20 15:05:42.803643 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.803642 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/550f14b6-7142-4cc6-958e-fcf427dd9fe0-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7c58c6bbd9-4gmg2\" (UID: \"550f14b6-7142-4cc6-958e-fcf427dd9fe0\") " pod="openshift-monitoring/telemeter-client-7c58c6bbd9-4gmg2"
Apr 20 15:05:42.803857 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.803764 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/550f14b6-7142-4cc6-958e-fcf427dd9fe0-federate-client-tls\") pod \"telemeter-client-7c58c6bbd9-4gmg2\" (UID: \"550f14b6-7142-4cc6-958e-fcf427dd9fe0\") " pod="openshift-monitoring/telemeter-client-7c58c6bbd9-4gmg2"
Apr 20 15:05:42.803857 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.803811 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/550f14b6-7142-4cc6-958e-fcf427dd9fe0-metrics-client-ca\") pod \"telemeter-client-7c58c6bbd9-4gmg2\" (UID: \"550f14b6-7142-4cc6-958e-fcf427dd9fe0\") " pod="openshift-monitoring/telemeter-client-7c58c6bbd9-4gmg2"
Apr 20 15:05:42.803857 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.803830 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kb79\" (UniqueName: \"kubernetes.io/projected/550f14b6-7142-4cc6-958e-fcf427dd9fe0-kube-api-access-4kb79\") pod \"telemeter-client-7c58c6bbd9-4gmg2\" (UID: \"550f14b6-7142-4cc6-958e-fcf427dd9fe0\") " pod="openshift-monitoring/telemeter-client-7c58c6bbd9-4gmg2"
Apr 20 15:05:42.803977 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.803893 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/550f14b6-7142-4cc6-958e-fcf427dd9fe0-secret-telemeter-client\") pod \"telemeter-client-7c58c6bbd9-4gmg2\" (UID: \"550f14b6-7142-4cc6-958e-fcf427dd9fe0\") " pod="openshift-monitoring/telemeter-client-7c58c6bbd9-4gmg2"
Apr 20 15:05:42.803977 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.803921 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/550f14b6-7142-4cc6-958e-fcf427dd9fe0-telemeter-client-tls\") pod \"telemeter-client-7c58c6bbd9-4gmg2\" (UID: \"550f14b6-7142-4cc6-958e-fcf427dd9fe0\") " pod="openshift-monitoring/telemeter-client-7c58c6bbd9-4gmg2"
Apr 20 15:05:42.804106 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.804079 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/550f14b6-7142-4cc6-958e-fcf427dd9fe0-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7c58c6bbd9-4gmg2\" (UID: \"550f14b6-7142-4cc6-958e-fcf427dd9fe0\") " pod="openshift-monitoring/telemeter-client-7c58c6bbd9-4gmg2"
Apr 20 15:05:42.897812 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.897767 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-797c758f48-6r26r"]
Apr 20 15:05:42.904599 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.904570 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/550f14b6-7142-4cc6-958e-fcf427dd9fe0-metrics-client-ca\") pod \"telemeter-client-7c58c6bbd9-4gmg2\" (UID: \"550f14b6-7142-4cc6-958e-fcf427dd9fe0\") " pod="openshift-monitoring/telemeter-client-7c58c6bbd9-4gmg2"
Apr 20 15:05:42.904599 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.904602 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4kb79\" (UniqueName: \"kubernetes.io/projected/550f14b6-7142-4cc6-958e-fcf427dd9fe0-kube-api-access-4kb79\") pod \"telemeter-client-7c58c6bbd9-4gmg2\" (UID: \"550f14b6-7142-4cc6-958e-fcf427dd9fe0\") " pod="openshift-monitoring/telemeter-client-7c58c6bbd9-4gmg2"
Apr 20 15:05:42.904832 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.904625 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/550f14b6-7142-4cc6-958e-fcf427dd9fe0-secret-telemeter-client\") pod \"telemeter-client-7c58c6bbd9-4gmg2\" (UID: \"550f14b6-7142-4cc6-958e-fcf427dd9fe0\") " pod="openshift-monitoring/telemeter-client-7c58c6bbd9-4gmg2"
Apr 20 15:05:42.904832 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.904748 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/550f14b6-7142-4cc6-958e-fcf427dd9fe0-telemeter-client-tls\") pod \"telemeter-client-7c58c6bbd9-4gmg2\" (UID: \"550f14b6-7142-4cc6-958e-fcf427dd9fe0\") " pod="openshift-monitoring/telemeter-client-7c58c6bbd9-4gmg2"
Apr 20 15:05:42.904962 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.904861 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/550f14b6-7142-4cc6-958e-fcf427dd9fe0-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7c58c6bbd9-4gmg2\" (UID: \"550f14b6-7142-4cc6-958e-fcf427dd9fe0\") " pod="openshift-monitoring/telemeter-client-7c58c6bbd9-4gmg2"
Apr 20 15:05:42.904962 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.904938 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/550f14b6-7142-4cc6-958e-fcf427dd9fe0-serving-certs-ca-bundle\") pod \"telemeter-client-7c58c6bbd9-4gmg2\" (UID: \"550f14b6-7142-4cc6-958e-fcf427dd9fe0\") " pod="openshift-monitoring/telemeter-client-7c58c6bbd9-4gmg2"
Apr 20 15:05:42.905056 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.904973 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/550f14b6-7142-4cc6-958e-fcf427dd9fe0-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7c58c6bbd9-4gmg2\" (UID: \"550f14b6-7142-4cc6-958e-fcf427dd9fe0\") " pod="openshift-monitoring/telemeter-client-7c58c6bbd9-4gmg2"
Apr 20 15:05:42.905056 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.905043 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/550f14b6-7142-4cc6-958e-fcf427dd9fe0-federate-client-tls\") pod \"telemeter-client-7c58c6bbd9-4gmg2\" (UID: \"550f14b6-7142-4cc6-958e-fcf427dd9fe0\") " pod="openshift-monitoring/telemeter-client-7c58c6bbd9-4gmg2"
Apr 20 15:05:42.905684 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.905631 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/550f14b6-7142-4cc6-958e-fcf427dd9fe0-serving-certs-ca-bundle\") pod \"telemeter-client-7c58c6bbd9-4gmg2\" (UID: \"550f14b6-7142-4cc6-958e-fcf427dd9fe0\") " pod="openshift-monitoring/telemeter-client-7c58c6bbd9-4gmg2"
Apr 20 15:05:42.912287 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.905663 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/550f14b6-7142-4cc6-958e-fcf427dd9fe0-metrics-client-ca\") pod \"telemeter-client-7c58c6bbd9-4gmg2\" (UID: \"550f14b6-7142-4cc6-958e-fcf427dd9fe0\") " pod="openshift-monitoring/telemeter-client-7c58c6bbd9-4gmg2"
Apr 20 15:05:42.912287 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.907539 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/550f14b6-7142-4cc6-958e-fcf427dd9fe0-secret-telemeter-client\") pod \"telemeter-client-7c58c6bbd9-4gmg2\" (UID: \"550f14b6-7142-4cc6-958e-fcf427dd9fe0\") " pod="openshift-monitoring/telemeter-client-7c58c6bbd9-4gmg2"
Apr 20 15:05:42.912287 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.908011 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/550f14b6-7142-4cc6-958e-fcf427dd9fe0-telemeter-client-tls\") pod \"telemeter-client-7c58c6bbd9-4gmg2\" (UID: \"550f14b6-7142-4cc6-958e-fcf427dd9fe0\") " pod="openshift-monitoring/telemeter-client-7c58c6bbd9-4gmg2"
Apr 20 15:05:42.912287 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.908163 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/550f14b6-7142-4cc6-958e-fcf427dd9fe0-federate-client-tls\") pod \"telemeter-client-7c58c6bbd9-4gmg2\" (UID: \"550f14b6-7142-4cc6-958e-fcf427dd9fe0\") " pod="openshift-monitoring/telemeter-client-7c58c6bbd9-4gmg2"
Apr 20 15:05:42.912287 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.908308 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/550f14b6-7142-4cc6-958e-fcf427dd9fe0-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7c58c6bbd9-4gmg2\" (UID: \"550f14b6-7142-4cc6-958e-fcf427dd9fe0\") " pod="openshift-monitoring/telemeter-client-7c58c6bbd9-4gmg2"
Apr 20 15:05:42.912287 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.910341 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/550f14b6-7142-4cc6-958e-fcf427dd9fe0-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7c58c6bbd9-4gmg2\" (UID: \"550f14b6-7142-4cc6-958e-fcf427dd9fe0\") " pod="openshift-monitoring/telemeter-client-7c58c6bbd9-4gmg2"
Apr 20 15:05:42.913755 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:42.913732 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kb79\" (UniqueName: \"kubernetes.io/projected/550f14b6-7142-4cc6-958e-fcf427dd9fe0-kube-api-access-4kb79\") pod \"telemeter-client-7c58c6bbd9-4gmg2\" (UID: \"550f14b6-7142-4cc6-958e-fcf427dd9fe0\") " pod="openshift-monitoring/telemeter-client-7c58c6bbd9-4gmg2"
Apr 20 15:05:43.044655 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.044565 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-7c58c6bbd9-4gmg2"
Apr 20 15:05:43.182381 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:05:43.182339 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod550f14b6_7142_4cc6_958e_fcf427dd9fe0.slice/crio-55061ee75b45e0bf66efb09774d9789eb643a72658d0533094dae56e2a6faa50 WatchSource:0}: Error finding container 55061ee75b45e0bf66efb09774d9789eb643a72658d0533094dae56e2a6faa50: Status 404 returned error can't find the container with id 55061ee75b45e0bf66efb09774d9789eb643a72658d0533094dae56e2a6faa50
Apr 20 15:05:43.182663 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.182641 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-7c58c6bbd9-4gmg2"]
Apr 20 15:05:43.719603 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.719507 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 15:05:43.723351 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.723325 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.727072 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.727005 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 20 15:05:43.727530 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.727351 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 20 15:05:43.727530 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.727462 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 20 15:05:43.728020 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.727994 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 20 15:05:43.728297 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.728256 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 20 15:05:43.728644 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.728625 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 20 15:05:43.728718 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.728699 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 20 15:05:43.728852 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.728838 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 20 15:05:43.729834 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.728626 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-mhrx5\""
Apr 20 15:05:43.729834 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.729266 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-6r1hnid0a2hl0\""
Apr 20 15:05:43.729834 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.729428 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 20 15:05:43.729834 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.729507 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 20 15:05:43.729834 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.729555 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 20 15:05:43.731474 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.731348 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 20 15:05:43.732289 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.732224 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb" event={"ID":"e9beb5ce-32fe-4694-9470-0cd472f5523d","Type":"ContainerStarted","Data":"b0555594bd2972584c4a8b4567e17891f3a242e510282a1615333380acfebe02"}
Apr 20 15:05:43.732289 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.732260 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb" event={"ID":"e9beb5ce-32fe-4694-9470-0cd472f5523d","Type":"ContainerStarted","Data":"215a61a50b00401011fff2b797bdf837a00b8333e9d408b4fd498c741559ed47"}
Apr 20 15:05:43.732427 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.732300 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb" event={"ID":"e9beb5ce-32fe-4694-9470-0cd472f5523d","Type":"ContainerStarted","Data":"4d11d2537d05d14e090635bfe3d90264f19862ba3883995b62b0ae2f964695be"}
Apr 20 15:05:43.733723 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.733353 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb"
Apr 20 15:05:43.733723 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.733372 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 20 15:05:43.735543 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.735516 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7c58c6bbd9-4gmg2" event={"ID":"550f14b6-7142-4cc6-958e-fcf427dd9fe0","Type":"ContainerStarted","Data":"55061ee75b45e0bf66efb09774d9789eb643a72658d0533094dae56e2a6faa50"}
Apr 20 15:05:43.736716 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.736696 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 15:05:43.786606 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.786543 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb" podStartSLOduration=1.46548866 podStartE2EDuration="4.786522088s" podCreationTimestamp="2026-04-20 15:05:39 +0000 UTC" firstStartedPulling="2026-04-20 15:05:40.149194339 +0000 UTC m=+197.521781033" lastFinishedPulling="2026-04-20 15:05:43.470227764 +0000 UTC m=+200.842814461" observedRunningTime="2026-04-20 15:05:43.786044685 +0000 UTC m=+201.158631396" watchObservedRunningTime="2026-04-20 15:05:43.786522088 +0000 UTC m=+201.159108790"
Apr 20 15:05:43.815351 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.815307 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e66edc20-fc2f-47ec-b46e-a12689d026bb-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.815351 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.815348 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e66edc20-fc2f-47ec-b46e-a12689d026bb-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.815558 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.815418 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e66edc20-fc2f-47ec-b46e-a12689d026bb-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.815558 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.815457 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e66edc20-fc2f-47ec-b46e-a12689d026bb-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.815558 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.815508 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e66edc20-fc2f-47ec-b46e-a12689d026bb-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.815791 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.815600 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e66edc20-fc2f-47ec-b46e-a12689d026bb-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.815791 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.815657 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e66edc20-fc2f-47ec-b46e-a12689d026bb-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.815791 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.815707 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e66edc20-fc2f-47ec-b46e-a12689d026bb-config\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.815791 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.815728 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e66edc20-fc2f-47ec-b46e-a12689d026bb-web-config\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.815791 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.815750 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xg66\" (UniqueName: \"kubernetes.io/projected/e66edc20-fc2f-47ec-b46e-a12689d026bb-kube-api-access-9xg66\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.816022 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.815827 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e66edc20-fc2f-47ec-b46e-a12689d026bb-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.816022 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.815908 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e66edc20-fc2f-47ec-b46e-a12689d026bb-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.816022 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.815935 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e66edc20-fc2f-47ec-b46e-a12689d026bb-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.816022 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.815980 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e66edc20-fc2f-47ec-b46e-a12689d026bb-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.816022 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.816016 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e66edc20-fc2f-47ec-b46e-a12689d026bb-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.816229 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.816071 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e66edc20-fc2f-47ec-b46e-a12689d026bb-config-out\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.816437 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.816409 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e66edc20-fc2f-47ec-b46e-a12689d026bb-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.816535 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.816508 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e66edc20-fc2f-47ec-b46e-a12689d026bb-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.917730 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.917680 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e66edc20-fc2f-47ec-b46e-a12689d026bb-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.917920 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.917746 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e66edc20-fc2f-47ec-b46e-a12689d026bb-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.917920 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.917782 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e66edc20-fc2f-47ec-b46e-a12689d026bb-config\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.917920 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.917810 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e66edc20-fc2f-47ec-b46e-a12689d026bb-web-config\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.917920 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.917834 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9xg66\" (UniqueName: \"kubernetes.io/projected/e66edc20-fc2f-47ec-b46e-a12689d026bb-kube-api-access-9xg66\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.917920 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.917878 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e66edc20-fc2f-47ec-b46e-a12689d026bb-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.918099 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.917928 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e66edc20-fc2f-47ec-b46e-a12689d026bb-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.918099 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.917952 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e66edc20-fc2f-47ec-b46e-a12689d026bb-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.918099 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.917977 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e66edc20-fc2f-47ec-b46e-a12689d026bb-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.918099 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.918020 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e66edc20-fc2f-47ec-b46e-a12689d026bb-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.918099 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.918048 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e66edc20-fc2f-47ec-b46e-a12689d026bb-config-out\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.918099 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.918089 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e66edc20-fc2f-47ec-b46e-a12689d026bb-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.918470 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.918155 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e66edc20-fc2f-47ec-b46e-a12689d026bb-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.918470 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.918202 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e66edc20-fc2f-47ec-b46e-a12689d026bb-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.918470 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.918208 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e66edc20-fc2f-47ec-b46e-a12689d026bb-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.918470 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.918229 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e66edc20-fc2f-47ec-b46e-a12689d026bb-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.918470 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.918257 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e66edc20-fc2f-47ec-b46e-a12689d026bb-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.918470 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.918310 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e66edc20-fc2f-47ec-b46e-a12689d026bb-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.918470 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.918336 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e66edc20-fc2f-47ec-b46e-a12689d026bb-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.921171 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.918968 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e66edc20-fc2f-47ec-b46e-a12689d026bb-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.921171 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.918999 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e66edc20-fc2f-47ec-b46e-a12689d026bb-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.921171 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.920220 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e66edc20-fc2f-47ec-b46e-a12689d026bb-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.921756 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.921713 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e66edc20-fc2f-47ec-b46e-a12689d026bb-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.921857 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.921801 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e66edc20-fc2f-47ec-b46e-a12689d026bb-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.922148 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.922124 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e66edc20-fc2f-47ec-b46e-a12689d026bb-config\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.923289 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.923243 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e66edc20-fc2f-47ec-b46e-a12689d026bb-web-config\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.927517 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.923873 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e66edc20-fc2f-47ec-b46e-a12689d026bb-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.927517 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.923990 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e66edc20-fc2f-47ec-b46e-a12689d026bb-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.927517 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.926430 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e66edc20-fc2f-47ec-b46e-a12689d026bb-config-out\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:05:43.927517 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.927439 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e66edc20-fc2f-47ec-b46e-a12689d026bb-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID:
\"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 15:05:43.928978 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.928308 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e66edc20-fc2f-47ec-b46e-a12689d026bb-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 15:05:43.928978 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.928386 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e66edc20-fc2f-47ec-b46e-a12689d026bb-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 15:05:43.928978 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.928415 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e66edc20-fc2f-47ec-b46e-a12689d026bb-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 15:05:43.928978 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.928729 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e66edc20-fc2f-47ec-b46e-a12689d026bb-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 15:05:43.928978 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.928909 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/e66edc20-fc2f-47ec-b46e-a12689d026bb-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 15:05:43.931752 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:43.931565 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xg66\" (UniqueName: \"kubernetes.io/projected/e66edc20-fc2f-47ec-b46e-a12689d026bb-kube-api-access-9xg66\") pod \"prometheus-k8s-0\" (UID: \"e66edc20-fc2f-47ec-b46e-a12689d026bb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 15:05:44.037576 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:44.037492 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 15:05:44.516189 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:44.516164 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 15:05:44.518376 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:05:44.518336 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode66edc20_fc2f_47ec_b46e_a12689d026bb.slice/crio-0e4673c54605a3bca9cf7f4d0c4d5985b0c54732396e37eecc1c91bba3522897 WatchSource:0}: Error finding container 0e4673c54605a3bca9cf7f4d0c4d5985b0c54732396e37eecc1c91bba3522897: Status 404 returned error can't find the container with id 0e4673c54605a3bca9cf7f4d0c4d5985b0c54732396e37eecc1c91bba3522897 Apr 20 15:05:44.745387 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:44.745348 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f34f657e-9dd8-45c3-8621-d8e62ad289fc","Type":"ContainerStarted","Data":"7832a2facb1390c0450d48fd47d4d1524e509e0b72c299de5b178bc3d01c4c23"} Apr 20 15:05:44.745387 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:44.745392 2575 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f34f657e-9dd8-45c3-8621-d8e62ad289fc","Type":"ContainerStarted","Data":"266a8c7b35c5fd0d6eac88f3eae0b93481baf6bec1dd5b1f5a301d0960d0cbe8"} Apr 20 15:05:44.745881 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:44.745407 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f34f657e-9dd8-45c3-8621-d8e62ad289fc","Type":"ContainerStarted","Data":"59a9f1a8bf6eeb012df701fa9dac5ff46865058f7d24f9fb0cd2689b34cc8ed1"} Apr 20 15:05:44.745881 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:44.745421 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f34f657e-9dd8-45c3-8621-d8e62ad289fc","Type":"ContainerStarted","Data":"e1074a5227f0f070a961c0aae380ac023658db23cf23305f80d985bcf9352fa3"} Apr 20 15:05:44.745881 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:44.745433 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f34f657e-9dd8-45c3-8621-d8e62ad289fc","Type":"ContainerStarted","Data":"7f99e11e48210171169d74fd37f03a724b044e8871d7fe545886e2023c5f840b"} Apr 20 15:05:44.749451 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:44.747550 2575 generic.go:358] "Generic (PLEG): container finished" podID="e66edc20-fc2f-47ec-b46e-a12689d026bb" containerID="2650ed1c6427b0010dda9859e4e0d200308db87d3bcbd4b45a622c52d9fc3b16" exitCode=0 Apr 20 15:05:44.749451 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:44.747926 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e66edc20-fc2f-47ec-b46e-a12689d026bb","Type":"ContainerDied","Data":"2650ed1c6427b0010dda9859e4e0d200308db87d3bcbd4b45a622c52d9fc3b16"} Apr 20 15:05:44.749451 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:44.748559 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e66edc20-fc2f-47ec-b46e-a12689d026bb","Type":"ContainerStarted","Data":"0e4673c54605a3bca9cf7f4d0c4d5985b0c54732396e37eecc1c91bba3522897"} Apr 20 15:05:45.753464 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:45.753331 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7c58c6bbd9-4gmg2" event={"ID":"550f14b6-7142-4cc6-958e-fcf427dd9fe0","Type":"ContainerStarted","Data":"8d6178ffa63cb7c0d94186ae43969e66824f3d03d2d93bfa298390d19ea53110"} Apr 20 15:05:45.753464 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:45.753378 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7c58c6bbd9-4gmg2" event={"ID":"550f14b6-7142-4cc6-958e-fcf427dd9fe0","Type":"ContainerStarted","Data":"52dbadcc217f4e7b2a029886a66a42af7cf16f9a8f7a4910d80c9630ebea3620"} Apr 20 15:05:45.753464 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:45.753392 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7c58c6bbd9-4gmg2" event={"ID":"550f14b6-7142-4cc6-958e-fcf427dd9fe0","Type":"ContainerStarted","Data":"64b1263541a32791e7ef43eeb50878ad4e6138fe2b041009bbe072df630398ed"} Apr 20 15:05:45.756862 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:45.756832 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f34f657e-9dd8-45c3-8621-d8e62ad289fc","Type":"ContainerStarted","Data":"c84c9d5238bfcc13d5a8b1b630b611fc3bedfae31840019904f291495b92ec46"} Apr 20 15:05:45.776446 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:45.776387 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-7c58c6bbd9-4gmg2" podStartSLOduration=1.641223208 podStartE2EDuration="3.776372579s" podCreationTimestamp="2026-04-20 15:05:42 +0000 UTC" firstStartedPulling="2026-04-20 15:05:43.184714936 +0000 UTC m=+200.557301619" 
lastFinishedPulling="2026-04-20 15:05:45.319864308 +0000 UTC m=+202.692450990" observedRunningTime="2026-04-20 15:05:45.774075179 +0000 UTC m=+203.146661897" watchObservedRunningTime="2026-04-20 15:05:45.776372579 +0000 UTC m=+203.148959282" Apr 20 15:05:45.802870 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:45.802795 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=3.9880262760000003 podStartE2EDuration="7.802775685s" podCreationTimestamp="2026-04-20 15:05:38 +0000 UTC" firstStartedPulling="2026-04-20 15:05:40.569094717 +0000 UTC m=+197.941681414" lastFinishedPulling="2026-04-20 15:05:44.383844141 +0000 UTC m=+201.756430823" observedRunningTime="2026-04-20 15:05:45.799072956 +0000 UTC m=+203.171659662" watchObservedRunningTime="2026-04-20 15:05:45.802775685 +0000 UTC m=+203.175362392" Apr 20 15:05:45.992232 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:45.992187 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-68ff96dbdb-n96jh" Apr 20 15:05:45.992232 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:45.992231 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-68ff96dbdb-n96jh" Apr 20 15:05:45.993497 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:45.993471 2575 patch_prober.go:28] interesting pod/console-68ff96dbdb-n96jh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.133.0.16:8443/health\": dial tcp 10.133.0.16:8443: connect: connection refused" start-of-body= Apr 20 15:05:45.993616 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:45.993516 2575 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-68ff96dbdb-n96jh" podUID="2125979b-7f4a-4e85-9570-b04ab627c4aa" containerName="console" probeResult="failure" output="Get \"https://10.133.0.16:8443/health\": dial tcp 
10.133.0.16:8443: connect: connection refused" Apr 20 15:05:46.317131 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:46.317083 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68ff96dbdb-n96jh"] Apr 20 15:05:48.656574 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:48.656536 2575 patch_prober.go:28] interesting pod/image-registry-779c949c75-b2sq6 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 20 15:05:48.657015 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:48.656593 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-779c949c75-b2sq6" podUID="991dcfd6-74f7-47ef-9682-ac84ec6e81f2" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 15:05:48.770574 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:48.770542 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e66edc20-fc2f-47ec-b46e-a12689d026bb","Type":"ContainerStarted","Data":"2afe18a71d976e81092f260e15277e4f146cd7014d1ef2793398c28ab09fa715"} Apr 20 15:05:48.770713 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:48.770582 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e66edc20-fc2f-47ec-b46e-a12689d026bb","Type":"ContainerStarted","Data":"27b8f8a6df40070c8a900c081627b43fe7a9f281e3bcb6dd6d88aaaa62eca4ae"} Apr 20 15:05:48.770713 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:48.770594 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"e66edc20-fc2f-47ec-b46e-a12689d026bb","Type":"ContainerStarted","Data":"1fd6e40c196c96f6e0ca95b1ba9c656a931564c2db140a46b43b95e75232ca63"} Apr 20 15:05:48.770713 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:48.770602 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e66edc20-fc2f-47ec-b46e-a12689d026bb","Type":"ContainerStarted","Data":"fea589df37477d807f52c932c950912da8435ed98e28dce91e4b4306b61c7f92"} Apr 20 15:05:48.770713 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:48.770610 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e66edc20-fc2f-47ec-b46e-a12689d026bb","Type":"ContainerStarted","Data":"8231c83e9b71b35f604791a726c061d30c6fef439cc14225ca88f57b1d8aa7e5"} Apr 20 15:05:48.770713 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:48.770619 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e66edc20-fc2f-47ec-b46e-a12689d026bb","Type":"ContainerStarted","Data":"040c3c0a3bb1698de52d95c595fae728438655d213ecefaac6aebb5d14adbc6a"} Apr 20 15:05:48.803079 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:48.802957 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.270561471 podStartE2EDuration="5.802940359s" podCreationTimestamp="2026-04-20 15:05:43 +0000 UTC" firstStartedPulling="2026-04-20 15:05:44.750318453 +0000 UTC m=+202.122905142" lastFinishedPulling="2026-04-20 15:05:48.282697346 +0000 UTC m=+205.655284030" observedRunningTime="2026-04-20 15:05:48.796305291 +0000 UTC m=+206.168891993" watchObservedRunningTime="2026-04-20 15:05:48.802940359 +0000 UTC m=+206.175527064" Apr 20 15:05:49.037822 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:49.037777 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 
20 15:05:50.762736 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:50.762705 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7bd788d4ff-jmqxb" Apr 20 15:05:53.669022 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:53.668956 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-779c949c75-b2sq6" podUID="991dcfd6-74f7-47ef-9682-ac84ec6e81f2" containerName="registry" containerID="cri-o://de72db6cad19d24a90ab4a7c4040cb5fccc2815548046ff6ae7c5b3450b3c7b3" gracePeriod=30 Apr 20 15:05:53.789390 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:53.789362 2575 generic.go:358] "Generic (PLEG): container finished" podID="991dcfd6-74f7-47ef-9682-ac84ec6e81f2" containerID="de72db6cad19d24a90ab4a7c4040cb5fccc2815548046ff6ae7c5b3450b3c7b3" exitCode=0 Apr 20 15:05:53.789513 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:53.789430 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-779c949c75-b2sq6" event={"ID":"991dcfd6-74f7-47ef-9682-ac84ec6e81f2","Type":"ContainerDied","Data":"de72db6cad19d24a90ab4a7c4040cb5fccc2815548046ff6ae7c5b3450b3c7b3"} Apr 20 15:05:53.906314 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:53.906291 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-779c949c75-b2sq6" Apr 20 15:05:54.027122 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:54.027039 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-registry-certificates\") pod \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\" (UID: \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\") " Apr 20 15:05:54.027122 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:54.027089 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-image-registry-private-configuration\") pod \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\" (UID: \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\") " Apr 20 15:05:54.027122 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:54.027110 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-ca-trust-extracted\") pod \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\" (UID: \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\") " Apr 20 15:05:54.027412 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:54.027169 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-installation-pull-secrets\") pod \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\" (UID: \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\") " Apr 20 15:05:54.027412 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:54.027200 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-trusted-ca\") pod \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\" (UID: 
\"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\") " Apr 20 15:05:54.027412 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:54.027216 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-bound-sa-token\") pod \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\" (UID: \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\") " Apr 20 15:05:54.027412 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:54.027263 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-registry-tls\") pod \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\" (UID: \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\") " Apr 20 15:05:54.027412 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:54.027333 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdf4g\" (UniqueName: \"kubernetes.io/projected/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-kube-api-access-zdf4g\") pod \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\" (UID: \"991dcfd6-74f7-47ef-9682-ac84ec6e81f2\") " Apr 20 15:05:54.027778 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:54.027746 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "991dcfd6-74f7-47ef-9682-ac84ec6e81f2" (UID: "991dcfd6-74f7-47ef-9682-ac84ec6e81f2"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 15:05:54.028053 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:54.028029 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "991dcfd6-74f7-47ef-9682-ac84ec6e81f2" (UID: "991dcfd6-74f7-47ef-9682-ac84ec6e81f2"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 15:05:54.029720 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:54.029664 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "991dcfd6-74f7-47ef-9682-ac84ec6e81f2" (UID: "991dcfd6-74f7-47ef-9682-ac84ec6e81f2"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 15:05:54.029995 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:54.029968 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "991dcfd6-74f7-47ef-9682-ac84ec6e81f2" (UID: "991dcfd6-74f7-47ef-9682-ac84ec6e81f2"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:05:54.030068 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:54.029985 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "991dcfd6-74f7-47ef-9682-ac84ec6e81f2" (UID: "991dcfd6-74f7-47ef-9682-ac84ec6e81f2"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:05:54.030068 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:54.030023 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "991dcfd6-74f7-47ef-9682-ac84ec6e81f2" (UID: "991dcfd6-74f7-47ef-9682-ac84ec6e81f2"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 15:05:54.030136 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:54.030074 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-kube-api-access-zdf4g" (OuterVolumeSpecName: "kube-api-access-zdf4g") pod "991dcfd6-74f7-47ef-9682-ac84ec6e81f2" (UID: "991dcfd6-74f7-47ef-9682-ac84ec6e81f2"). InnerVolumeSpecName "kube-api-access-zdf4g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:05:54.046953 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:54.046915 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "991dcfd6-74f7-47ef-9682-ac84ec6e81f2" (UID: "991dcfd6-74f7-47ef-9682-ac84ec6e81f2"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:05:54.129162 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:54.129126 2575 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-installation-pull-secrets\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 20 15:05:54.129162 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:54.129160 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-trusted-ca\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 20 15:05:54.129343 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:54.129171 2575 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-bound-sa-token\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 20 15:05:54.129343 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:54.129181 2575 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-registry-tls\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 20 15:05:54.129343 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:54.129191 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zdf4g\" (UniqueName: \"kubernetes.io/projected/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-kube-api-access-zdf4g\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 20 15:05:54.129343 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:54.129200 2575 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-registry-certificates\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 20 15:05:54.129343 
ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:54.129210 2575 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-image-registry-private-configuration\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 20 15:05:54.129343 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:54.129220 2575 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/991dcfd6-74f7-47ef-9682-ac84ec6e81f2-ca-trust-extracted\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 20 15:05:54.793186 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:54.793154 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-779c949c75-b2sq6" event={"ID":"991dcfd6-74f7-47ef-9682-ac84ec6e81f2","Type":"ContainerDied","Data":"a8a8e4a4c608fb272c1a3aab6487d6949462aafd7d946fc5ebb30569b8507ec5"} Apr 20 15:05:54.793596 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:54.793198 2575 scope.go:117] "RemoveContainer" containerID="de72db6cad19d24a90ab4a7c4040cb5fccc2815548046ff6ae7c5b3450b3c7b3" Apr 20 15:05:54.793596 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:54.793161 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-779c949c75-b2sq6" Apr 20 15:05:54.814867 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:54.814834 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-779c949c75-b2sq6"] Apr 20 15:05:54.817103 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:54.817071 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-779c949c75-b2sq6"] Apr 20 15:05:55.182546 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:05:55.182504 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="991dcfd6-74f7-47ef-9682-ac84ec6e81f2" path="/var/lib/kubelet/pods/991dcfd6-74f7-47ef-9682-ac84ec6e81f2/volumes" Apr 20 15:06:07.921075 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:07.921031 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-797c758f48-6r26r" podUID="72600b94-c8a8-4eef-900d-a1f3ec8450e8" containerName="console" containerID="cri-o://5dea7b32d014db592c778f837cc9b921f8cd873ff16b4d237e29d8996804f2b0" gracePeriod=15 Apr 20 15:06:08.159159 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:08.159118 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-797c758f48-6r26r_72600b94-c8a8-4eef-900d-a1f3ec8450e8/console/0.log" Apr 20 15:06:08.159321 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:08.159195 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-797c758f48-6r26r" Apr 20 15:06:08.252566 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:08.252487 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/72600b94-c8a8-4eef-900d-a1f3ec8450e8-console-config\") pod \"72600b94-c8a8-4eef-900d-a1f3ec8450e8\" (UID: \"72600b94-c8a8-4eef-900d-a1f3ec8450e8\") " Apr 20 15:06:08.252566 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:08.252528 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/72600b94-c8a8-4eef-900d-a1f3ec8450e8-service-ca\") pod \"72600b94-c8a8-4eef-900d-a1f3ec8450e8\" (UID: \"72600b94-c8a8-4eef-900d-a1f3ec8450e8\") " Apr 20 15:06:08.252566 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:08.252557 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/72600b94-c8a8-4eef-900d-a1f3ec8450e8-oauth-serving-cert\") pod \"72600b94-c8a8-4eef-900d-a1f3ec8450e8\" (UID: \"72600b94-c8a8-4eef-900d-a1f3ec8450e8\") " Apr 20 15:06:08.252773 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:08.252677 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkvw6\" (UniqueName: \"kubernetes.io/projected/72600b94-c8a8-4eef-900d-a1f3ec8450e8-kube-api-access-tkvw6\") pod \"72600b94-c8a8-4eef-900d-a1f3ec8450e8\" (UID: \"72600b94-c8a8-4eef-900d-a1f3ec8450e8\") " Apr 20 15:06:08.252773 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:08.252729 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/72600b94-c8a8-4eef-900d-a1f3ec8450e8-console-oauth-config\") pod \"72600b94-c8a8-4eef-900d-a1f3ec8450e8\" (UID: \"72600b94-c8a8-4eef-900d-a1f3ec8450e8\") " Apr 20 15:06:08.252867 
ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:08.252773 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/72600b94-c8a8-4eef-900d-a1f3ec8450e8-console-serving-cert\") pod \"72600b94-c8a8-4eef-900d-a1f3ec8450e8\" (UID: \"72600b94-c8a8-4eef-900d-a1f3ec8450e8\") " Apr 20 15:06:08.252954 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:08.252929 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72600b94-c8a8-4eef-900d-a1f3ec8450e8-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "72600b94-c8a8-4eef-900d-a1f3ec8450e8" (UID: "72600b94-c8a8-4eef-900d-a1f3ec8450e8"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 15:06:08.252954 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:08.252939 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72600b94-c8a8-4eef-900d-a1f3ec8450e8-console-config" (OuterVolumeSpecName: "console-config") pod "72600b94-c8a8-4eef-900d-a1f3ec8450e8" (UID: "72600b94-c8a8-4eef-900d-a1f3ec8450e8"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 15:06:08.253071 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:08.252963 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72600b94-c8a8-4eef-900d-a1f3ec8450e8-service-ca" (OuterVolumeSpecName: "service-ca") pod "72600b94-c8a8-4eef-900d-a1f3ec8450e8" (UID: "72600b94-c8a8-4eef-900d-a1f3ec8450e8"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 15:06:08.253181 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:08.253160 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/72600b94-c8a8-4eef-900d-a1f3ec8450e8-service-ca\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 20 15:06:08.253181 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:08.253179 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/72600b94-c8a8-4eef-900d-a1f3ec8450e8-oauth-serving-cert\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 20 15:06:08.253380 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:08.253189 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/72600b94-c8a8-4eef-900d-a1f3ec8450e8-console-config\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 20 15:06:08.254955 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:08.254932 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72600b94-c8a8-4eef-900d-a1f3ec8450e8-kube-api-access-tkvw6" (OuterVolumeSpecName: "kube-api-access-tkvw6") pod "72600b94-c8a8-4eef-900d-a1f3ec8450e8" (UID: "72600b94-c8a8-4eef-900d-a1f3ec8450e8"). InnerVolumeSpecName "kube-api-access-tkvw6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:06:08.255063 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:08.254976 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72600b94-c8a8-4eef-900d-a1f3ec8450e8-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "72600b94-c8a8-4eef-900d-a1f3ec8450e8" (UID: "72600b94-c8a8-4eef-900d-a1f3ec8450e8"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 15:06:08.255063 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:08.255018 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72600b94-c8a8-4eef-900d-a1f3ec8450e8-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "72600b94-c8a8-4eef-900d-a1f3ec8450e8" (UID: "72600b94-c8a8-4eef-900d-a1f3ec8450e8"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 15:06:08.354567 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:08.354525 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tkvw6\" (UniqueName: \"kubernetes.io/projected/72600b94-c8a8-4eef-900d-a1f3ec8450e8-kube-api-access-tkvw6\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 20 15:06:08.354567 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:08.354558 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/72600b94-c8a8-4eef-900d-a1f3ec8450e8-console-oauth-config\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 20 15:06:08.354567 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:08.354568 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/72600b94-c8a8-4eef-900d-a1f3ec8450e8-console-serving-cert\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 20 15:06:08.837624 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:08.837595 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-797c758f48-6r26r_72600b94-c8a8-4eef-900d-a1f3ec8450e8/console/0.log" Apr 20 15:06:08.837830 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:08.837635 2575 generic.go:358] "Generic (PLEG): container finished" podID="72600b94-c8a8-4eef-900d-a1f3ec8450e8" containerID="5dea7b32d014db592c778f837cc9b921f8cd873ff16b4d237e29d8996804f2b0" 
exitCode=2 Apr 20 15:06:08.837830 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:08.837716 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-797c758f48-6r26r" Apr 20 15:06:08.837830 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:08.837734 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-797c758f48-6r26r" event={"ID":"72600b94-c8a8-4eef-900d-a1f3ec8450e8","Type":"ContainerDied","Data":"5dea7b32d014db592c778f837cc9b921f8cd873ff16b4d237e29d8996804f2b0"} Apr 20 15:06:08.837830 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:08.837774 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-797c758f48-6r26r" event={"ID":"72600b94-c8a8-4eef-900d-a1f3ec8450e8","Type":"ContainerDied","Data":"2e68f59dfa6931b01373e1d83a2c8d3b0cd40b44fc4247dfd52adc245a3f3c89"} Apr 20 15:06:08.837830 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:08.837790 2575 scope.go:117] "RemoveContainer" containerID="5dea7b32d014db592c778f837cc9b921f8cd873ff16b4d237e29d8996804f2b0" Apr 20 15:06:08.846599 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:08.846581 2575 scope.go:117] "RemoveContainer" containerID="5dea7b32d014db592c778f837cc9b921f8cd873ff16b4d237e29d8996804f2b0" Apr 20 15:06:08.846903 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:06:08.846882 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dea7b32d014db592c778f837cc9b921f8cd873ff16b4d237e29d8996804f2b0\": container with ID starting with 5dea7b32d014db592c778f837cc9b921f8cd873ff16b4d237e29d8996804f2b0 not found: ID does not exist" containerID="5dea7b32d014db592c778f837cc9b921f8cd873ff16b4d237e29d8996804f2b0" Apr 20 15:06:08.846952 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:08.846910 2575 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5dea7b32d014db592c778f837cc9b921f8cd873ff16b4d237e29d8996804f2b0"} err="failed to get container status \"5dea7b32d014db592c778f837cc9b921f8cd873ff16b4d237e29d8996804f2b0\": rpc error: code = NotFound desc = could not find container \"5dea7b32d014db592c778f837cc9b921f8cd873ff16b4d237e29d8996804f2b0\": container with ID starting with 5dea7b32d014db592c778f837cc9b921f8cd873ff16b4d237e29d8996804f2b0 not found: ID does not exist" Apr 20 15:06:08.858602 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:08.858551 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-797c758f48-6r26r"] Apr 20 15:06:08.863193 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:08.863159 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-797c758f48-6r26r"] Apr 20 15:06:09.182672 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:09.182593 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72600b94-c8a8-4eef-900d-a1f3ec8450e8" path="/var/lib/kubelet/pods/72600b94-c8a8-4eef-900d-a1f3ec8450e8/volumes" Apr 20 15:06:10.845497 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:10.845464 2575 generic.go:358] "Generic (PLEG): container finished" podID="d6a53b42-a8ae-4454-b200-47881e42577a" containerID="17211fdb1002e351d870e1ba336c4c311d814e13ea6cd79f212fd337f7753ae1" exitCode=0 Apr 20 15:06:10.845869 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:10.845539 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8489c" event={"ID":"d6a53b42-a8ae-4454-b200-47881e42577a","Type":"ContainerDied","Data":"17211fdb1002e351d870e1ba336c4c311d814e13ea6cd79f212fd337f7753ae1"} Apr 20 15:06:10.845915 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:10.845879 2575 scope.go:117] "RemoveContainer" containerID="17211fdb1002e351d870e1ba336c4c311d814e13ea6cd79f212fd337f7753ae1" Apr 20 15:06:11.786937 
ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:11.786889 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-68ff96dbdb-n96jh" podUID="2125979b-7f4a-4e85-9570-b04ab627c4aa" containerName="console" containerID="cri-o://d01d7912cf708fd8cf40ca99277b294bac26dc877cfd8fafac97a165ba0c959b" gracePeriod=15 Apr 20 15:06:11.850412 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:11.850383 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8489c" event={"ID":"d6a53b42-a8ae-4454-b200-47881e42577a","Type":"ContainerStarted","Data":"41ff6c16955aa47f845d371c58219497760d43361deac6c2003173da18122c9d"} Apr 20 15:06:12.023732 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:12.023709 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68ff96dbdb-n96jh_2125979b-7f4a-4e85-9570-b04ab627c4aa/console/0.log" Apr 20 15:06:12.023866 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:12.023770 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-68ff96dbdb-n96jh" Apr 20 15:06:12.087664 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:12.087631 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2125979b-7f4a-4e85-9570-b04ab627c4aa-trusted-ca-bundle\") pod \"2125979b-7f4a-4e85-9570-b04ab627c4aa\" (UID: \"2125979b-7f4a-4e85-9570-b04ab627c4aa\") " Apr 20 15:06:12.087844 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:12.087679 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsm4f\" (UniqueName: \"kubernetes.io/projected/2125979b-7f4a-4e85-9570-b04ab627c4aa-kube-api-access-tsm4f\") pod \"2125979b-7f4a-4e85-9570-b04ab627c4aa\" (UID: \"2125979b-7f4a-4e85-9570-b04ab627c4aa\") " Apr 20 15:06:12.087844 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:12.087702 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2125979b-7f4a-4e85-9570-b04ab627c4aa-console-serving-cert\") pod \"2125979b-7f4a-4e85-9570-b04ab627c4aa\" (UID: \"2125979b-7f4a-4e85-9570-b04ab627c4aa\") " Apr 20 15:06:12.087844 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:12.087771 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2125979b-7f4a-4e85-9570-b04ab627c4aa-oauth-serving-cert\") pod \"2125979b-7f4a-4e85-9570-b04ab627c4aa\" (UID: \"2125979b-7f4a-4e85-9570-b04ab627c4aa\") " Apr 20 15:06:12.087844 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:12.087830 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2125979b-7f4a-4e85-9570-b04ab627c4aa-console-oauth-config\") pod \"2125979b-7f4a-4e85-9570-b04ab627c4aa\" (UID: \"2125979b-7f4a-4e85-9570-b04ab627c4aa\") " Apr 20 
15:06:12.088012 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:12.087889 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2125979b-7f4a-4e85-9570-b04ab627c4aa-service-ca\") pod \"2125979b-7f4a-4e85-9570-b04ab627c4aa\" (UID: \"2125979b-7f4a-4e85-9570-b04ab627c4aa\") " Apr 20 15:06:12.088012 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:12.087914 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2125979b-7f4a-4e85-9570-b04ab627c4aa-console-config\") pod \"2125979b-7f4a-4e85-9570-b04ab627c4aa\" (UID: \"2125979b-7f4a-4e85-9570-b04ab627c4aa\") " Apr 20 15:06:12.088219 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:12.088191 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2125979b-7f4a-4e85-9570-b04ab627c4aa-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "2125979b-7f4a-4e85-9570-b04ab627c4aa" (UID: "2125979b-7f4a-4e85-9570-b04ab627c4aa"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 15:06:12.088219 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:12.088206 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2125979b-7f4a-4e85-9570-b04ab627c4aa-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "2125979b-7f4a-4e85-9570-b04ab627c4aa" (UID: "2125979b-7f4a-4e85-9570-b04ab627c4aa"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 15:06:12.088505 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:12.088480 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2125979b-7f4a-4e85-9570-b04ab627c4aa-console-config" (OuterVolumeSpecName: "console-config") pod "2125979b-7f4a-4e85-9570-b04ab627c4aa" (UID: "2125979b-7f4a-4e85-9570-b04ab627c4aa"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 15:06:12.088574 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:12.088526 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2125979b-7f4a-4e85-9570-b04ab627c4aa-service-ca" (OuterVolumeSpecName: "service-ca") pod "2125979b-7f4a-4e85-9570-b04ab627c4aa" (UID: "2125979b-7f4a-4e85-9570-b04ab627c4aa"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 15:06:12.089993 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:12.089966 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2125979b-7f4a-4e85-9570-b04ab627c4aa-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "2125979b-7f4a-4e85-9570-b04ab627c4aa" (UID: "2125979b-7f4a-4e85-9570-b04ab627c4aa"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 15:06:12.090097 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:12.090021 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2125979b-7f4a-4e85-9570-b04ab627c4aa-kube-api-access-tsm4f" (OuterVolumeSpecName: "kube-api-access-tsm4f") pod "2125979b-7f4a-4e85-9570-b04ab627c4aa" (UID: "2125979b-7f4a-4e85-9570-b04ab627c4aa"). InnerVolumeSpecName "kube-api-access-tsm4f". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:06:12.090097 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:12.090075 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2125979b-7f4a-4e85-9570-b04ab627c4aa-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "2125979b-7f4a-4e85-9570-b04ab627c4aa" (UID: "2125979b-7f4a-4e85-9570-b04ab627c4aa"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 15:06:12.189476 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:12.189422 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2125979b-7f4a-4e85-9570-b04ab627c4aa-service-ca\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 20 15:06:12.189476 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:12.189468 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2125979b-7f4a-4e85-9570-b04ab627c4aa-console-config\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 20 15:06:12.189476 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:12.189481 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2125979b-7f4a-4e85-9570-b04ab627c4aa-trusted-ca-bundle\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 20 15:06:12.189723 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:12.189497 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tsm4f\" (UniqueName: \"kubernetes.io/projected/2125979b-7f4a-4e85-9570-b04ab627c4aa-kube-api-access-tsm4f\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 20 15:06:12.189723 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:12.189510 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2125979b-7f4a-4e85-9570-b04ab627c4aa-console-serving-cert\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 20 15:06:12.189723 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:12.189523 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2125979b-7f4a-4e85-9570-b04ab627c4aa-oauth-serving-cert\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 20 15:06:12.189723 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:12.189536 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2125979b-7f4a-4e85-9570-b04ab627c4aa-console-oauth-config\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 20 15:06:12.854512 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:12.854487 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68ff96dbdb-n96jh_2125979b-7f4a-4e85-9570-b04ab627c4aa/console/0.log" Apr 20 15:06:12.854903 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:12.854523 2575 generic.go:358] "Generic (PLEG): container finished" podID="2125979b-7f4a-4e85-9570-b04ab627c4aa" containerID="d01d7912cf708fd8cf40ca99277b294bac26dc877cfd8fafac97a165ba0c959b" exitCode=2 Apr 20 15:06:12.854903 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:12.854555 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68ff96dbdb-n96jh" event={"ID":"2125979b-7f4a-4e85-9570-b04ab627c4aa","Type":"ContainerDied","Data":"d01d7912cf708fd8cf40ca99277b294bac26dc877cfd8fafac97a165ba0c959b"} Apr 20 15:06:12.854903 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:12.854598 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-68ff96dbdb-n96jh" Apr 20 15:06:12.854903 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:12.854604 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68ff96dbdb-n96jh" event={"ID":"2125979b-7f4a-4e85-9570-b04ab627c4aa","Type":"ContainerDied","Data":"b5f72d2f58ed118e54e0d3f698547e202ce97d840b9722265cb795ff94b4bff8"} Apr 20 15:06:12.854903 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:12.854624 2575 scope.go:117] "RemoveContainer" containerID="d01d7912cf708fd8cf40ca99277b294bac26dc877cfd8fafac97a165ba0c959b" Apr 20 15:06:12.863517 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:12.863496 2575 scope.go:117] "RemoveContainer" containerID="d01d7912cf708fd8cf40ca99277b294bac26dc877cfd8fafac97a165ba0c959b" Apr 20 15:06:12.863796 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:06:12.863774 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d01d7912cf708fd8cf40ca99277b294bac26dc877cfd8fafac97a165ba0c959b\": container with ID starting with d01d7912cf708fd8cf40ca99277b294bac26dc877cfd8fafac97a165ba0c959b not found: ID does not exist" containerID="d01d7912cf708fd8cf40ca99277b294bac26dc877cfd8fafac97a165ba0c959b" Apr 20 15:06:12.863855 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:12.863809 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d01d7912cf708fd8cf40ca99277b294bac26dc877cfd8fafac97a165ba0c959b"} err="failed to get container status \"d01d7912cf708fd8cf40ca99277b294bac26dc877cfd8fafac97a165ba0c959b\": rpc error: code = NotFound desc = could not find container \"d01d7912cf708fd8cf40ca99277b294bac26dc877cfd8fafac97a165ba0c959b\": container with ID starting with d01d7912cf708fd8cf40ca99277b294bac26dc877cfd8fafac97a165ba0c959b not found: ID does not exist" Apr 20 15:06:12.875447 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:12.875411 2575 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68ff96dbdb-n96jh"] Apr 20 15:06:12.878091 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:12.878066 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-68ff96dbdb-n96jh"] Apr 20 15:06:13.183403 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:13.183323 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2125979b-7f4a-4e85-9570-b04ab627c4aa" path="/var/lib/kubelet/pods/2125979b-7f4a-4e85-9570-b04ab627c4aa/volumes" Apr 20 15:06:18.985823 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:18.985781 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f34f657e-9dd8-45c3-8621-d8e62ad289fc/init-config-reloader/0.log" Apr 20 15:06:19.183803 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:19.183776 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f34f657e-9dd8-45c3-8621-d8e62ad289fc/alertmanager/0.log" Apr 20 15:06:19.385683 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:19.385657 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f34f657e-9dd8-45c3-8621-d8e62ad289fc/config-reloader/0.log" Apr 20 15:06:19.584253 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:19.584227 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f34f657e-9dd8-45c3-8621-d8e62ad289fc/kube-rbac-proxy-web/0.log" Apr 20 15:06:19.784583 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:19.784520 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f34f657e-9dd8-45c3-8621-d8e62ad289fc/kube-rbac-proxy/0.log" Apr 20 15:06:19.984110 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:19.984080 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f34f657e-9dd8-45c3-8621-d8e62ad289fc/kube-rbac-proxy-metric/0.log" Apr 20 15:06:20.183995 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:20.183969 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f34f657e-9dd8-45c3-8621-d8e62ad289fc/prom-label-proxy/0.log" Apr 20 15:06:20.385608 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:20.385581 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-j22cl_af67f37d-b332-44b7-8678-28aa45d26ed9/cluster-monitoring-operator/0.log" Apr 20 15:06:20.583646 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:20.583620 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-7rp8b_0525c669-b74e-43d2-a613-4a95d5c51bdd/kube-state-metrics/0.log" Apr 20 15:06:20.784230 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:20.784199 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-7rp8b_0525c669-b74e-43d2-a613-4a95d5c51bdd/kube-rbac-proxy-main/0.log" Apr 20 15:06:20.983810 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:20.983734 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-7rp8b_0525c669-b74e-43d2-a613-4a95d5c51bdd/kube-rbac-proxy-self/0.log" Apr 20 15:06:22.784111 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:22.784083 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gbvlq_cb4d0256-b738-4dd1-bf9b-d24347291878/init-textfile/0.log" Apr 20 15:06:22.987193 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:22.987159 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gbvlq_cb4d0256-b738-4dd1-bf9b-d24347291878/node-exporter/0.log" Apr 20 15:06:23.185144 ip-10-0-133-198 
kubenswrapper[2575]: I0420 15:06:23.185119 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gbvlq_cb4d0256-b738-4dd1-bf9b-d24347291878/kube-rbac-proxy/0.log" Apr 20 15:06:23.383873 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:23.383843 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-jwgdc_b2e813cc-e62f-44d2-a6cb-ede8248e09ac/kube-rbac-proxy-main/0.log" Apr 20 15:06:23.584223 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:23.584170 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-jwgdc_b2e813cc-e62f-44d2-a6cb-ede8248e09ac/kube-rbac-proxy-self/0.log" Apr 20 15:06:23.783682 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:23.783590 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-jwgdc_b2e813cc-e62f-44d2-a6cb-ede8248e09ac/openshift-state-metrics/0.log" Apr 20 15:06:23.983878 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:23.983850 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e66edc20-fc2f-47ec-b46e-a12689d026bb/init-config-reloader/0.log" Apr 20 15:06:24.185574 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:24.185550 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e66edc20-fc2f-47ec-b46e-a12689d026bb/prometheus/0.log" Apr 20 15:06:24.384586 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:24.384562 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e66edc20-fc2f-47ec-b46e-a12689d026bb/config-reloader/0.log" Apr 20 15:06:24.583512 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:24.583490 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e66edc20-fc2f-47ec-b46e-a12689d026bb/thanos-sidecar/0.log" Apr 20 15:06:24.783792 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:24.783763 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e66edc20-fc2f-47ec-b46e-a12689d026bb/kube-rbac-proxy-web/0.log" Apr 20 15:06:24.986034 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:24.985960 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e66edc20-fc2f-47ec-b46e-a12689d026bb/kube-rbac-proxy/0.log" Apr 20 15:06:25.183438 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:25.183408 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e66edc20-fc2f-47ec-b46e-a12689d026bb/kube-rbac-proxy-thanos/0.log" Apr 20 15:06:25.984552 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:25.984525 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7c58c6bbd9-4gmg2_550f14b6-7142-4cc6-958e-fcf427dd9fe0/telemeter-client/0.log" Apr 20 15:06:26.183792 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:26.183763 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7c58c6bbd9-4gmg2_550f14b6-7142-4cc6-958e-fcf427dd9fe0/reload/0.log" Apr 20 15:06:26.384358 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:26.384332 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7c58c6bbd9-4gmg2_550f14b6-7142-4cc6-958e-fcf427dd9fe0/kube-rbac-proxy/0.log" Apr 20 15:06:26.584510 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:26.584480 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7bd788d4ff-jmqxb_e9beb5ce-32fe-4694-9470-0cd472f5523d/thanos-query/0.log" Apr 20 15:06:26.783729 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:26.783654 2575 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7bd788d4ff-jmqxb_e9beb5ce-32fe-4694-9470-0cd472f5523d/kube-rbac-proxy-web/0.log"
Apr 20 15:06:26.984641 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:26.984614 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7bd788d4ff-jmqxb_e9beb5ce-32fe-4694-9470-0cd472f5523d/kube-rbac-proxy/0.log"
Apr 20 15:06:27.183641 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:27.183609 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7bd788d4ff-jmqxb_e9beb5ce-32fe-4694-9470-0cd472f5523d/prom-label-proxy/0.log"
Apr 20 15:06:27.384171 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:27.384144 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7bd788d4ff-jmqxb_e9beb5ce-32fe-4694-9470-0cd472f5523d/kube-rbac-proxy-rules/0.log"
Apr 20 15:06:27.584853 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:27.584825 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7bd788d4ff-jmqxb_e9beb5ce-32fe-4694-9470-0cd472f5523d/kube-rbac-proxy-metrics/0.log"
Apr 20 15:06:27.984538 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:27.984449 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pbfc9_ab96425d-f444-4af7-9052-1bddacccef53/console-operator/1.log"
Apr 20 15:06:28.185698 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:28.185667 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pbfc9_ab96425d-f444-4af7-9052-1bddacccef53/console-operator/2.log"
Apr 20 15:06:30.584004 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:30.583976 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-wfw5f_2e79beec-c5db-41c7-a60e-c759696b1d60/dns-node-resolver/0.log"
Apr 20 15:06:31.383789 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:31.383762 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-flrxv_86824264-d16e-4d82-854b-f1f5bc86483c/node-ca/0.log"
Apr 20 15:06:34.902127 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:34.902090 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee584f46-b9aa-46b2-a060-01c6f4e256e9-metrics-certs\") pod \"network-metrics-daemon-lnqrj\" (UID: \"ee584f46-b9aa-46b2-a060-01c6f4e256e9\") " pod="openshift-multus/network-metrics-daemon-lnqrj"
Apr 20 15:06:34.904446 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:34.904416 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee584f46-b9aa-46b2-a060-01c6f4e256e9-metrics-certs\") pod \"network-metrics-daemon-lnqrj\" (UID: \"ee584f46-b9aa-46b2-a060-01c6f4e256e9\") " pod="openshift-multus/network-metrics-daemon-lnqrj"
Apr 20 15:06:35.083164 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:35.083127 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-64ff7\""
Apr 20 15:06:35.091201 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:35.091155 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lnqrj"
Apr 20 15:06:35.262051 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:35.262017 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lnqrj"]
Apr 20 15:06:35.265521 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:06:35.265490 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee584f46_b9aa_46b2_a060_01c6f4e256e9.slice/crio-1e0946eb5584faa3798bce62ce1609c60383ce72b5aab23e10c8ffe94bbc1588 WatchSource:0}: Error finding container 1e0946eb5584faa3798bce62ce1609c60383ce72b5aab23e10c8ffe94bbc1588: Status 404 returned error can't find the container with id 1e0946eb5584faa3798bce62ce1609c60383ce72b5aab23e10c8ffe94bbc1588
Apr 20 15:06:35.922323 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:35.922266 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lnqrj" event={"ID":"ee584f46-b9aa-46b2-a060-01c6f4e256e9","Type":"ContainerStarted","Data":"1e0946eb5584faa3798bce62ce1609c60383ce72b5aab23e10c8ffe94bbc1588"}
Apr 20 15:06:36.927333 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:36.927295 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lnqrj" event={"ID":"ee584f46-b9aa-46b2-a060-01c6f4e256e9","Type":"ContainerStarted","Data":"18b636f14fffb58ac7c31f0b1da30526c482eb080003e484c2d0fb467d2af159"}
Apr 20 15:06:36.927333 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:36.927335 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lnqrj" event={"ID":"ee584f46-b9aa-46b2-a060-01c6f4e256e9","Type":"ContainerStarted","Data":"e5a77f19686e6cc42e4e01e624177db2b7efca892eb61a96aa66a39e0b20c727"}
Apr 20 15:06:36.946403 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:36.946263 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-lnqrj" podStartSLOduration=252.800302797 podStartE2EDuration="4m13.946237352s" podCreationTimestamp="2026-04-20 15:02:23 +0000 UTC" firstStartedPulling="2026-04-20 15:06:35.26748892 +0000 UTC m=+252.640075601" lastFinishedPulling="2026-04-20 15:06:36.413423469 +0000 UTC m=+253.786010156" observedRunningTime="2026-04-20 15:06:36.943970198 +0000 UTC m=+254.316556905" watchObservedRunningTime="2026-04-20 15:06:36.946237352 +0000 UTC m=+254.318824058"
Apr 20 15:06:44.037990 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:44.037950 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:06:44.056974 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:44.056942 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:06:44.967230 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:06:44.967200 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 15:07:01.555660 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:07:01.555600 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-xp9tj" podUID="bfea901a-e1df-46c7-b211-b94f978562b5"
Apr 20 15:07:02.003657 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:07:02.003629 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xp9tj"
Apr 20 15:07:05.477929 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:07:05.477878 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfea901a-e1df-46c7-b211-b94f978562b5-metrics-tls\") pod \"dns-default-xp9tj\" (UID: \"bfea901a-e1df-46c7-b211-b94f978562b5\") " pod="openshift-dns/dns-default-xp9tj"
Apr 20 15:07:05.480263 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:07:05.480233 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfea901a-e1df-46c7-b211-b94f978562b5-metrics-tls\") pod \"dns-default-xp9tj\" (UID: \"bfea901a-e1df-46c7-b211-b94f978562b5\") " pod="openshift-dns/dns-default-xp9tj"
Apr 20 15:07:05.578744 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:07:05.578698 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfa73a34-d39a-4a89-b936-de5c6399f787-cert\") pod \"ingress-canary-gcg7h\" (UID: \"cfa73a34-d39a-4a89-b936-de5c6399f787\") " pod="openshift-ingress-canary/ingress-canary-gcg7h"
Apr 20 15:07:05.581190 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:07:05.581155 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfa73a34-d39a-4a89-b936-de5c6399f787-cert\") pod \"ingress-canary-gcg7h\" (UID: \"cfa73a34-d39a-4a89-b936-de5c6399f787\") " pod="openshift-ingress-canary/ingress-canary-gcg7h"
Apr 20 15:07:05.606608 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:07:05.606575 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vdrzr\""
Apr 20 15:07:05.614248 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:07:05.614220 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xp9tj"
Apr 20 15:07:05.741521 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:07:05.741336 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xp9tj"]
Apr 20 15:07:05.744164 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:07:05.744133 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfea901a_e1df_46c7_b211_b94f978562b5.slice/crio-c102bc134490dcb2c172992e0e4b3327b8c9e55bfbe9dbdb51245ac328bbbffa WatchSource:0}: Error finding container c102bc134490dcb2c172992e0e4b3327b8c9e55bfbe9dbdb51245ac328bbbffa: Status 404 returned error can't find the container with id c102bc134490dcb2c172992e0e4b3327b8c9e55bfbe9dbdb51245ac328bbbffa
Apr 20 15:07:05.781790 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:07:05.781750 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-4l2x2\""
Apr 20 15:07:05.789699 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:07:05.789675 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gcg7h"
Apr 20 15:07:05.910759 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:07:05.910666 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gcg7h"]
Apr 20 15:07:05.913312 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:07:05.913264 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfa73a34_d39a_4a89_b936_de5c6399f787.slice/crio-432c4f286b50ed7735dc9e3df01f6b8e3af59cf50dab7d562a3d33b831162190 WatchSource:0}: Error finding container 432c4f286b50ed7735dc9e3df01f6b8e3af59cf50dab7d562a3d33b831162190: Status 404 returned error can't find the container with id 432c4f286b50ed7735dc9e3df01f6b8e3af59cf50dab7d562a3d33b831162190
Apr 20 15:07:06.015398 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:07:06.015305 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gcg7h" event={"ID":"cfa73a34-d39a-4a89-b936-de5c6399f787","Type":"ContainerStarted","Data":"432c4f286b50ed7735dc9e3df01f6b8e3af59cf50dab7d562a3d33b831162190"}
Apr 20 15:07:06.016262 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:07:06.016226 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xp9tj" event={"ID":"bfea901a-e1df-46c7-b211-b94f978562b5","Type":"ContainerStarted","Data":"c102bc134490dcb2c172992e0e4b3327b8c9e55bfbe9dbdb51245ac328bbbffa"}
Apr 20 15:07:08.023995 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:07:08.023904 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xp9tj" event={"ID":"bfea901a-e1df-46c7-b211-b94f978562b5","Type":"ContainerStarted","Data":"dd346851cc041c0f2a13b97db33feab66acbdb6b73624d3139c8ef7a6ee588e2"}
Apr 20 15:07:08.023995 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:07:08.023945 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xp9tj" event={"ID":"bfea901a-e1df-46c7-b211-b94f978562b5","Type":"ContainerStarted","Data":"cdcfb519c07485adc07ded8281ac0959223a1583be47f9cdf6e0a4ab589f2487"}
Apr 20 15:07:08.023995 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:07:08.023983 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-xp9tj"
Apr 20 15:07:08.025914 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:07:08.025895 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gcg7h" event={"ID":"cfa73a34-d39a-4a89-b936-de5c6399f787","Type":"ContainerStarted","Data":"e146bbbcaf5cce7cabffa6ad11aa20402044cf034a6dcb70fb869ac94cb63336"}
Apr 20 15:07:08.042538 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:07:08.042490 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-xp9tj" podStartSLOduration=251.268711534 podStartE2EDuration="4m13.04247507s" podCreationTimestamp="2026-04-20 15:02:55 +0000 UTC" firstStartedPulling="2026-04-20 15:07:05.745997224 +0000 UTC m=+283.118583917" lastFinishedPulling="2026-04-20 15:07:07.519760768 +0000 UTC m=+284.892347453" observedRunningTime="2026-04-20 15:07:08.041182097 +0000 UTC m=+285.413768831" watchObservedRunningTime="2026-04-20 15:07:08.04247507 +0000 UTC m=+285.415061774"
Apr 20 15:07:08.055848 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:07:08.055787 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-gcg7h" podStartSLOduration=251.406315049 podStartE2EDuration="4m13.055770155s" podCreationTimestamp="2026-04-20 15:02:55 +0000 UTC" firstStartedPulling="2026-04-20 15:07:05.915151738 +0000 UTC m=+283.287738419" lastFinishedPulling="2026-04-20 15:07:07.564606834 +0000 UTC m=+284.937193525" observedRunningTime="2026-04-20 15:07:08.055524304 +0000 UTC m=+285.428111036" watchObservedRunningTime="2026-04-20 15:07:08.055770155 +0000 UTC m=+285.428356859"
Apr 20 15:07:18.032158 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:07:18.032119 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-xp9tj"
Apr 20 15:07:23.064965 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:07:23.064932 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pbfc9_ab96425d-f444-4af7-9052-1bddacccef53/console-operator/1.log"
Apr 20 15:07:23.065515 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:07:23.065486 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pbfc9_ab96425d-f444-4af7-9052-1bddacccef53/console-operator/1.log"
Apr 20 15:07:23.074037 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:07:23.074016 2575 kubelet.go:1628] "Image garbage collection succeeded"
Apr 20 15:09:32.814518 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:09:32.814481 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-kjdfb"]
Apr 20 15:09:32.814949 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:09:32.814841 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72600b94-c8a8-4eef-900d-a1f3ec8450e8" containerName="console"
Apr 20 15:09:32.814949 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:09:32.814854 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="72600b94-c8a8-4eef-900d-a1f3ec8450e8" containerName="console"
Apr 20 15:09:32.814949 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:09:32.814865 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="991dcfd6-74f7-47ef-9682-ac84ec6e81f2" containerName="registry"
Apr 20 15:09:32.814949 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:09:32.814871 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="991dcfd6-74f7-47ef-9682-ac84ec6e81f2" containerName="registry"
Apr 20 15:09:32.814949 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:09:32.814881 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2125979b-7f4a-4e85-9570-b04ab627c4aa" containerName="console"
Apr 20 15:09:32.814949 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:09:32.814887 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="2125979b-7f4a-4e85-9570-b04ab627c4aa" containerName="console"
Apr 20 15:09:32.814949 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:09:32.814934 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="2125979b-7f4a-4e85-9570-b04ab627c4aa" containerName="console"
Apr 20 15:09:32.814949 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:09:32.814943 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="72600b94-c8a8-4eef-900d-a1f3ec8450e8" containerName="console"
Apr 20 15:09:32.814949 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:09:32.814949 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="991dcfd6-74f7-47ef-9682-ac84ec6e81f2" containerName="registry"
Apr 20 15:09:32.817918 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:09:32.817903 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-kjdfb"
Apr 20 15:09:32.820368 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:09:32.820346 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 20 15:09:32.820488 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:09:32.820472 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 20 15:09:32.821407 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:09:32.821393 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-8gs6w\""
Apr 20 15:09:32.826469 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:09:32.826447 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-kjdfb"]
Apr 20 15:09:32.946750 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:09:32.946721 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f6ffaafc-bf35-4dfd-80cd-cb644d025bcd-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-kjdfb\" (UID: \"f6ffaafc-bf35-4dfd-80cd-cb644d025bcd\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-kjdfb"
Apr 20 15:09:32.946898 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:09:32.946781 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9drw\" (UniqueName: \"kubernetes.io/projected/f6ffaafc-bf35-4dfd-80cd-cb644d025bcd-kube-api-access-c9drw\") pod \"cert-manager-cainjector-8966b78d4-kjdfb\" (UID: \"f6ffaafc-bf35-4dfd-80cd-cb644d025bcd\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-kjdfb"
Apr 20 15:09:33.048157 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:09:33.048127 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9drw\" (UniqueName: \"kubernetes.io/projected/f6ffaafc-bf35-4dfd-80cd-cb644d025bcd-kube-api-access-c9drw\") pod \"cert-manager-cainjector-8966b78d4-kjdfb\" (UID: \"f6ffaafc-bf35-4dfd-80cd-cb644d025bcd\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-kjdfb"
Apr 20 15:09:33.048328 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:09:33.048200 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f6ffaafc-bf35-4dfd-80cd-cb644d025bcd-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-kjdfb\" (UID: \"f6ffaafc-bf35-4dfd-80cd-cb644d025bcd\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-kjdfb"
Apr 20 15:09:33.057094 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:09:33.057073 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9drw\" (UniqueName: \"kubernetes.io/projected/f6ffaafc-bf35-4dfd-80cd-cb644d025bcd-kube-api-access-c9drw\") pod \"cert-manager-cainjector-8966b78d4-kjdfb\" (UID: \"f6ffaafc-bf35-4dfd-80cd-cb644d025bcd\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-kjdfb"
Apr 20 15:09:33.057363 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:09:33.057347 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f6ffaafc-bf35-4dfd-80cd-cb644d025bcd-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-kjdfb\" (UID: \"f6ffaafc-bf35-4dfd-80cd-cb644d025bcd\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-kjdfb"
Apr 20 15:09:33.139368 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:09:33.139339 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-kjdfb"
Apr 20 15:09:33.259006 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:09:33.258983 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-kjdfb"]
Apr 20 15:09:33.261642 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:09:33.261619 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6ffaafc_bf35_4dfd_80cd_cb644d025bcd.slice/crio-621d98e40c4ff4d8835490a2a8a925129abe4961457c4fdfee3b11414ad2c5ef WatchSource:0}: Error finding container 621d98e40c4ff4d8835490a2a8a925129abe4961457c4fdfee3b11414ad2c5ef: Status 404 returned error can't find the container with id 621d98e40c4ff4d8835490a2a8a925129abe4961457c4fdfee3b11414ad2c5ef
Apr 20 15:09:33.263489 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:09:33.263470 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 15:09:33.466721 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:09:33.466638 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-kjdfb" event={"ID":"f6ffaafc-bf35-4dfd-80cd-cb644d025bcd","Type":"ContainerStarted","Data":"621d98e40c4ff4d8835490a2a8a925129abe4961457c4fdfee3b11414ad2c5ef"}
Apr 20 15:09:37.486934 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:09:37.486896 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-kjdfb" event={"ID":"f6ffaafc-bf35-4dfd-80cd-cb644d025bcd","Type":"ContainerStarted","Data":"b19824c30580582fa21f7de4d1567550ffdf9e4ca9c66188cc6b20350c0e7dff"}
Apr 20 15:09:37.503082 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:09:37.503028 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-kjdfb" podStartSLOduration=2.321572578 podStartE2EDuration="5.50301262s" podCreationTimestamp="2026-04-20 15:09:32 +0000 UTC" firstStartedPulling="2026-04-20 15:09:33.26365156 +0000 UTC m=+430.636238256" lastFinishedPulling="2026-04-20 15:09:36.445091602 +0000 UTC m=+433.817678298" observedRunningTime="2026-04-20 15:09:37.501712591 +0000 UTC m=+434.874299296" watchObservedRunningTime="2026-04-20 15:09:37.50301262 +0000 UTC m=+434.875599324"
Apr 20 15:10:04.008532 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:04.008493 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-gc2mf"]
Apr 20 15:10:04.014837 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:04.014809 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-gc2mf"
Apr 20 15:10:04.017568 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:04.017543 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 20 15:10:04.017568 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:04.017562 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 20 15:10:04.017758 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:04.017636 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 20 15:10:04.017854 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:04.017840 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 20 15:10:04.018239 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:04.018225 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-td6pf\""
Apr 20 15:10:04.042440 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:04.042410 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-gc2mf"]
Apr 20 15:10:04.120779 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:04.120744 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c3aa9a68-e4f8-494e-903a-b68cd8e83b71-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6cdf6786d9-gc2mf\" (UID: \"c3aa9a68-e4f8-494e-903a-b68cd8e83b71\") " pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-gc2mf"
Apr 20 15:10:04.120946 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:04.120795 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c3aa9a68-e4f8-494e-903a-b68cd8e83b71-webhook-cert\") pod \"opendatahub-operator-controller-manager-6cdf6786d9-gc2mf\" (UID: \"c3aa9a68-e4f8-494e-903a-b68cd8e83b71\") " pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-gc2mf"
Apr 20 15:10:04.120946 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:04.120851 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq2ms\" (UniqueName: \"kubernetes.io/projected/c3aa9a68-e4f8-494e-903a-b68cd8e83b71-kube-api-access-jq2ms\") pod \"opendatahub-operator-controller-manager-6cdf6786d9-gc2mf\" (UID: \"c3aa9a68-e4f8-494e-903a-b68cd8e83b71\") " pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-gc2mf"
Apr 20 15:10:04.221807 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:04.221762 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c3aa9a68-e4f8-494e-903a-b68cd8e83b71-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6cdf6786d9-gc2mf\" (UID: \"c3aa9a68-e4f8-494e-903a-b68cd8e83b71\") " pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-gc2mf"
Apr 20 15:10:04.221996 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:04.221823 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c3aa9a68-e4f8-494e-903a-b68cd8e83b71-webhook-cert\") pod \"opendatahub-operator-controller-manager-6cdf6786d9-gc2mf\" (UID: \"c3aa9a68-e4f8-494e-903a-b68cd8e83b71\") " pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-gc2mf"
Apr 20 15:10:04.221996 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:04.221854 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jq2ms\" (UniqueName: \"kubernetes.io/projected/c3aa9a68-e4f8-494e-903a-b68cd8e83b71-kube-api-access-jq2ms\") pod \"opendatahub-operator-controller-manager-6cdf6786d9-gc2mf\" (UID: \"c3aa9a68-e4f8-494e-903a-b68cd8e83b71\") " pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-gc2mf"
Apr 20 15:10:04.224384 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:04.224352 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c3aa9a68-e4f8-494e-903a-b68cd8e83b71-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6cdf6786d9-gc2mf\" (UID: \"c3aa9a68-e4f8-494e-903a-b68cd8e83b71\") " pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-gc2mf"
Apr 20 15:10:04.224501 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:04.224400 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c3aa9a68-e4f8-494e-903a-b68cd8e83b71-webhook-cert\") pod \"opendatahub-operator-controller-manager-6cdf6786d9-gc2mf\" (UID: \"c3aa9a68-e4f8-494e-903a-b68cd8e83b71\") " pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-gc2mf"
Apr 20 15:10:04.232518 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:04.232489 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq2ms\" (UniqueName: \"kubernetes.io/projected/c3aa9a68-e4f8-494e-903a-b68cd8e83b71-kube-api-access-jq2ms\") pod \"opendatahub-operator-controller-manager-6cdf6786d9-gc2mf\" (UID: \"c3aa9a68-e4f8-494e-903a-b68cd8e83b71\") " pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-gc2mf"
Apr 20 15:10:04.325375 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:04.325341 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-gc2mf"
Apr 20 15:10:04.460263 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:04.460236 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-gc2mf"]
Apr 20 15:10:04.462843 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:10:04.462818 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3aa9a68_e4f8_494e_903a_b68cd8e83b71.slice/crio-9b075f13ffadc552d72d15122332c5107dd8c229974f11516c8a34636722d86d WatchSource:0}: Error finding container 9b075f13ffadc552d72d15122332c5107dd8c229974f11516c8a34636722d86d: Status 404 returned error can't find the container with id 9b075f13ffadc552d72d15122332c5107dd8c229974f11516c8a34636722d86d
Apr 20 15:10:04.566932 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:04.566898 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-gc2mf" event={"ID":"c3aa9a68-e4f8-494e-903a-b68cd8e83b71","Type":"ContainerStarted","Data":"9b075f13ffadc552d72d15122332c5107dd8c229974f11516c8a34636722d86d"}
Apr 20 15:10:07.585859 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:07.585822 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-gc2mf" event={"ID":"c3aa9a68-e4f8-494e-903a-b68cd8e83b71","Type":"ContainerStarted","Data":"cae4fe9023f2390e29758cb075de8ecc7249c64dbcb4595adcd6ebd2f958fcf1"}
Apr 20 15:10:07.586311 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:07.585945 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-gc2mf"
Apr 20 15:10:07.612539 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:07.612483 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-gc2mf" podStartSLOduration=2.032317036 podStartE2EDuration="4.612467218s" podCreationTimestamp="2026-04-20 15:10:03 +0000 UTC" firstStartedPulling="2026-04-20 15:10:04.464422419 +0000 UTC m=+461.837009101" lastFinishedPulling="2026-04-20 15:10:07.044572602 +0000 UTC m=+464.417159283" observedRunningTime="2026-04-20 15:10:07.610518342 +0000 UTC m=+464.983105046" watchObservedRunningTime="2026-04-20 15:10:07.612467218 +0000 UTC m=+464.985053922"
Apr 20 15:10:08.743802 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:08.743752 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-6b8584f779-8g8x7"]
Apr 20 15:10:08.747352 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:08.747331 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6b8584f779-8g8x7"
Apr 20 15:10:08.749947 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:08.749919 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 20 15:10:08.751095 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:08.751057 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 20 15:10:08.751095 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:08.751073 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-gxh59\""
Apr 20 15:10:08.751297 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:08.751118 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 20 15:10:08.751297 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:08.751184 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 20 15:10:08.751297 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:08.751131 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 20 15:10:08.760054 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:08.760030 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6b8584f779-8g8x7"]
Apr 20 15:10:08.869002 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:08.868970 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/0c749bc3-dcb0-4ba1-80b3-819ef3a7f61d-metrics-cert\") pod \"lws-controller-manager-6b8584f779-8g8x7\" (UID: \"0c749bc3-dcb0-4ba1-80b3-819ef3a7f61d\") " pod="openshift-lws-operator/lws-controller-manager-6b8584f779-8g8x7"
Apr 20 15:10:08.869171 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:08.869010 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/0c749bc3-dcb0-4ba1-80b3-819ef3a7f61d-manager-config\") pod \"lws-controller-manager-6b8584f779-8g8x7\" (UID: \"0c749bc3-dcb0-4ba1-80b3-819ef3a7f61d\") " pod="openshift-lws-operator/lws-controller-manager-6b8584f779-8g8x7"
Apr 20 15:10:08.869171 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:08.869042 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46c67\" (UniqueName: \"kubernetes.io/projected/0c749bc3-dcb0-4ba1-80b3-819ef3a7f61d-kube-api-access-46c67\") pod \"lws-controller-manager-6b8584f779-8g8x7\" (UID: \"0c749bc3-dcb0-4ba1-80b3-819ef3a7f61d\") " pod="openshift-lws-operator/lws-controller-manager-6b8584f779-8g8x7"
Apr 20 15:10:08.869171 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:08.869113 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c749bc3-dcb0-4ba1-80b3-819ef3a7f61d-cert\") pod \"lws-controller-manager-6b8584f779-8g8x7\" (UID: \"0c749bc3-dcb0-4ba1-80b3-819ef3a7f61d\") " pod="openshift-lws-operator/lws-controller-manager-6b8584f779-8g8x7"
Apr 20 15:10:08.970150 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:08.970107 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/0c749bc3-dcb0-4ba1-80b3-819ef3a7f61d-metrics-cert\") pod \"lws-controller-manager-6b8584f779-8g8x7\" (UID: \"0c749bc3-dcb0-4ba1-80b3-819ef3a7f61d\") " pod="openshift-lws-operator/lws-controller-manager-6b8584f779-8g8x7"
Apr 20 15:10:08.970358 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:08.970160 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/0c749bc3-dcb0-4ba1-80b3-819ef3a7f61d-manager-config\") pod \"lws-controller-manager-6b8584f779-8g8x7\" (UID: \"0c749bc3-dcb0-4ba1-80b3-819ef3a7f61d\") " pod="openshift-lws-operator/lws-controller-manager-6b8584f779-8g8x7"
Apr 20 15:10:08.970358 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:08.970226 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-46c67\" (UniqueName: \"kubernetes.io/projected/0c749bc3-dcb0-4ba1-80b3-819ef3a7f61d-kube-api-access-46c67\") pod \"lws-controller-manager-6b8584f779-8g8x7\" (UID: \"0c749bc3-dcb0-4ba1-80b3-819ef3a7f61d\") " pod="openshift-lws-operator/lws-controller-manager-6b8584f779-8g8x7"
Apr 20 15:10:08.970358 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:08.970313 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c749bc3-dcb0-4ba1-80b3-819ef3a7f61d-cert\") pod \"lws-controller-manager-6b8584f779-8g8x7\" (UID: \"0c749bc3-dcb0-4ba1-80b3-819ef3a7f61d\") " pod="openshift-lws-operator/lws-controller-manager-6b8584f779-8g8x7"
Apr 20 15:10:08.970914 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:08.970888 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/0c749bc3-dcb0-4ba1-80b3-819ef3a7f61d-manager-config\") pod \"lws-controller-manager-6b8584f779-8g8x7\" (UID: \"0c749bc3-dcb0-4ba1-80b3-819ef3a7f61d\") " pod="openshift-lws-operator/lws-controller-manager-6b8584f779-8g8x7"
Apr 20 15:10:08.972766 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:08.972743 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/0c749bc3-dcb0-4ba1-80b3-819ef3a7f61d-metrics-cert\") pod \"lws-controller-manager-6b8584f779-8g8x7\" (UID: \"0c749bc3-dcb0-4ba1-80b3-819ef3a7f61d\")
" pod="openshift-lws-operator/lws-controller-manager-6b8584f779-8g8x7" Apr 20 15:10:08.972845 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:08.972768 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c749bc3-dcb0-4ba1-80b3-819ef3a7f61d-cert\") pod \"lws-controller-manager-6b8584f779-8g8x7\" (UID: \"0c749bc3-dcb0-4ba1-80b3-819ef3a7f61d\") " pod="openshift-lws-operator/lws-controller-manager-6b8584f779-8g8x7" Apr 20 15:10:08.978837 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:08.978813 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-46c67\" (UniqueName: \"kubernetes.io/projected/0c749bc3-dcb0-4ba1-80b3-819ef3a7f61d-kube-api-access-46c67\") pod \"lws-controller-manager-6b8584f779-8g8x7\" (UID: \"0c749bc3-dcb0-4ba1-80b3-819ef3a7f61d\") " pod="openshift-lws-operator/lws-controller-manager-6b8584f779-8g8x7" Apr 20 15:10:09.057149 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:09.057010 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6b8584f779-8g8x7" Apr 20 15:10:09.199013 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:09.198975 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6b8584f779-8g8x7"] Apr 20 15:10:09.201237 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:10:09.201208 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c749bc3_dcb0_4ba1_80b3_819ef3a7f61d.slice/crio-3eacd267ffa7a9f699e1e9f7c83a6a53b5b6dbd338784c96f1126e2a82a43fb0 WatchSource:0}: Error finding container 3eacd267ffa7a9f699e1e9f7c83a6a53b5b6dbd338784c96f1126e2a82a43fb0: Status 404 returned error can't find the container with id 3eacd267ffa7a9f699e1e9f7c83a6a53b5b6dbd338784c96f1126e2a82a43fb0 Apr 20 15:10:09.594077 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:09.594042 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6b8584f779-8g8x7" event={"ID":"0c749bc3-dcb0-4ba1-80b3-819ef3a7f61d","Type":"ContainerStarted","Data":"3eacd267ffa7a9f699e1e9f7c83a6a53b5b6dbd338784c96f1126e2a82a43fb0"} Apr 20 15:10:12.605106 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:12.605068 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6b8584f779-8g8x7" event={"ID":"0c749bc3-dcb0-4ba1-80b3-819ef3a7f61d","Type":"ContainerStarted","Data":"9bd3fd925db0f19c96b13e4054f6ffe3ee82d4719f89f544f0908f22bd0d5369"} Apr 20 15:10:12.605599 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:12.605306 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-6b8584f779-8g8x7" Apr 20 15:10:12.624301 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:12.624223 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-6b8584f779-8g8x7" podStartSLOduration=2.272436224 podStartE2EDuration="4.624208791s" podCreationTimestamp="2026-04-20 15:10:08 +0000 UTC" firstStartedPulling="2026-04-20 15:10:09.203024845 +0000 UTC m=+466.575611531" lastFinishedPulling="2026-04-20 15:10:11.55479741 +0000 UTC m=+468.927384098" observedRunningTime="2026-04-20 15:10:12.623446593 +0000 UTC m=+469.996033298" watchObservedRunningTime="2026-04-20 15:10:12.624208791 +0000 UTC m=+469.996795494" Apr 20 15:10:18.591844 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:18.591812 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6cdf6786d9-gc2mf" Apr 20 15:10:21.423037 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:21.423001 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-7fcf5d587f-zx95c"] Apr 20 15:10:21.426396 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:21.426368 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-7fcf5d587f-zx95c" Apr 20 15:10:21.429059 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:21.429032 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 20 15:10:21.430219 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:21.430192 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 20 15:10:21.430361 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:21.430225 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 20 15:10:21.430479 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:21.430464 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 20 15:10:21.430534 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:21.430502 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-8sbn6\"" Apr 20 15:10:21.439315 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:21.439255 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-7fcf5d587f-zx95c"] Apr 20 15:10:21.482042 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:21.482001 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bdc3ef4e-6631-4f08-8712-1d65faaf30c4-tls-certs\") pod \"kube-auth-proxy-7fcf5d587f-zx95c\" (UID: \"bdc3ef4e-6631-4f08-8712-1d65faaf30c4\") " pod="openshift-ingress/kube-auth-proxy-7fcf5d587f-zx95c" Apr 20 15:10:21.482042 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:21.482039 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/bdc3ef4e-6631-4f08-8712-1d65faaf30c4-tmp\") pod \"kube-auth-proxy-7fcf5d587f-zx95c\" (UID: \"bdc3ef4e-6631-4f08-8712-1d65faaf30c4\") " pod="openshift-ingress/kube-auth-proxy-7fcf5d587f-zx95c" Apr 20 15:10:21.482261 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:21.482210 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgkvx\" (UniqueName: \"kubernetes.io/projected/bdc3ef4e-6631-4f08-8712-1d65faaf30c4-kube-api-access-jgkvx\") pod \"kube-auth-proxy-7fcf5d587f-zx95c\" (UID: \"bdc3ef4e-6631-4f08-8712-1d65faaf30c4\") " pod="openshift-ingress/kube-auth-proxy-7fcf5d587f-zx95c" Apr 20 15:10:21.583660 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:21.583608 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jgkvx\" (UniqueName: \"kubernetes.io/projected/bdc3ef4e-6631-4f08-8712-1d65faaf30c4-kube-api-access-jgkvx\") pod \"kube-auth-proxy-7fcf5d587f-zx95c\" (UID: \"bdc3ef4e-6631-4f08-8712-1d65faaf30c4\") " pod="openshift-ingress/kube-auth-proxy-7fcf5d587f-zx95c" Apr 20 15:10:21.583829 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:21.583718 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bdc3ef4e-6631-4f08-8712-1d65faaf30c4-tls-certs\") pod \"kube-auth-proxy-7fcf5d587f-zx95c\" (UID: \"bdc3ef4e-6631-4f08-8712-1d65faaf30c4\") " pod="openshift-ingress/kube-auth-proxy-7fcf5d587f-zx95c" Apr 20 15:10:21.583829 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:21.583745 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bdc3ef4e-6631-4f08-8712-1d65faaf30c4-tmp\") pod \"kube-auth-proxy-7fcf5d587f-zx95c\" (UID: \"bdc3ef4e-6631-4f08-8712-1d65faaf30c4\") " pod="openshift-ingress/kube-auth-proxy-7fcf5d587f-zx95c" Apr 20 15:10:21.586249 ip-10-0-133-198 kubenswrapper[2575]: I0420 
15:10:21.586208 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bdc3ef4e-6631-4f08-8712-1d65faaf30c4-tmp\") pod \"kube-auth-proxy-7fcf5d587f-zx95c\" (UID: \"bdc3ef4e-6631-4f08-8712-1d65faaf30c4\") " pod="openshift-ingress/kube-auth-proxy-7fcf5d587f-zx95c" Apr 20 15:10:21.586442 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:21.586423 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bdc3ef4e-6631-4f08-8712-1d65faaf30c4-tls-certs\") pod \"kube-auth-proxy-7fcf5d587f-zx95c\" (UID: \"bdc3ef4e-6631-4f08-8712-1d65faaf30c4\") " pod="openshift-ingress/kube-auth-proxy-7fcf5d587f-zx95c" Apr 20 15:10:21.600626 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:21.600592 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgkvx\" (UniqueName: \"kubernetes.io/projected/bdc3ef4e-6631-4f08-8712-1d65faaf30c4-kube-api-access-jgkvx\") pod \"kube-auth-proxy-7fcf5d587f-zx95c\" (UID: \"bdc3ef4e-6631-4f08-8712-1d65faaf30c4\") " pod="openshift-ingress/kube-auth-proxy-7fcf5d587f-zx95c" Apr 20 15:10:21.737118 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:21.737025 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-7fcf5d587f-zx95c" Apr 20 15:10:21.863173 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:21.863133 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-7fcf5d587f-zx95c"] Apr 20 15:10:21.866181 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:10:21.866150 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdc3ef4e_6631_4f08_8712_1d65faaf30c4.slice/crio-e20f22b082e75163086f3c8084452ad371623580a2dc6505551e1be268f19844 WatchSource:0}: Error finding container e20f22b082e75163086f3c8084452ad371623580a2dc6505551e1be268f19844: Status 404 returned error can't find the container with id e20f22b082e75163086f3c8084452ad371623580a2dc6505551e1be268f19844 Apr 20 15:10:22.641253 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:22.641197 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-7fcf5d587f-zx95c" event={"ID":"bdc3ef4e-6631-4f08-8712-1d65faaf30c4","Type":"ContainerStarted","Data":"e20f22b082e75163086f3c8084452ad371623580a2dc6505551e1be268f19844"} Apr 20 15:10:23.610990 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:23.610954 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-6b8584f779-8g8x7" Apr 20 15:10:25.841473 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:25.841448 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 20 15:10:26.657360 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:10:26.657314 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-7fcf5d587f-zx95c" event={"ID":"bdc3ef4e-6631-4f08-8712-1d65faaf30c4","Type":"ContainerStarted","Data":"7e84d632eb8e87d629fca5f5c69f5aafb4d1940740442f1f762b6cc38287aeaf"} Apr 20 15:10:26.673617 ip-10-0-133-198 
kubenswrapper[2575]: I0420 15:10:26.673559 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-7fcf5d587f-zx95c" podStartSLOduration=1.7025137830000001 podStartE2EDuration="5.673543203s" podCreationTimestamp="2026-04-20 15:10:21 +0000 UTC" firstStartedPulling="2026-04-20 15:10:21.867980432 +0000 UTC m=+479.240567115" lastFinishedPulling="2026-04-20 15:10:25.839009853 +0000 UTC m=+483.211596535" observedRunningTime="2026-04-20 15:10:26.672376582 +0000 UTC m=+484.044963287" watchObservedRunningTime="2026-04-20 15:10:26.673543203 +0000 UTC m=+484.046129907" Apr 20 15:12:05.519473 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:05.519404 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vqltq"] Apr 20 15:12:05.521683 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:05.521664 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vqltq" Apr 20 15:12:05.525595 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:05.525569 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-cl6fg\"" Apr 20 15:12:05.525728 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:05.525632 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 15:12:05.526780 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:05.526764 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 15:12:05.534414 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:05.534384 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vqltq"] Apr 20 15:12:05.618859 ip-10-0-133-198 kubenswrapper[2575]: 
I0420 15:12:05.618826 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2lxq\" (UniqueName: \"kubernetes.io/projected/d8dbfb82-ff49-466c-8462-5c0f18ea8a1b-kube-api-access-b2lxq\") pod \"limitador-operator-controller-manager-85c4996f8c-vqltq\" (UID: \"d8dbfb82-ff49-466c-8462-5c0f18ea8a1b\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vqltq" Apr 20 15:12:05.719578 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:05.719539 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2lxq\" (UniqueName: \"kubernetes.io/projected/d8dbfb82-ff49-466c-8462-5c0f18ea8a1b-kube-api-access-b2lxq\") pod \"limitador-operator-controller-manager-85c4996f8c-vqltq\" (UID: \"d8dbfb82-ff49-466c-8462-5c0f18ea8a1b\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vqltq" Apr 20 15:12:05.728325 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:05.728296 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2lxq\" (UniqueName: \"kubernetes.io/projected/d8dbfb82-ff49-466c-8462-5c0f18ea8a1b-kube-api-access-b2lxq\") pod \"limitador-operator-controller-manager-85c4996f8c-vqltq\" (UID: \"d8dbfb82-ff49-466c-8462-5c0f18ea8a1b\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vqltq" Apr 20 15:12:05.832120 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:05.832087 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vqltq" Apr 20 15:12:05.951143 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:05.951113 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vqltq"] Apr 20 15:12:05.953751 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:12:05.953721 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8dbfb82_ff49_466c_8462_5c0f18ea8a1b.slice/crio-168d04832049200b136c91af0aea896d7a09729c28368d6fba590414ad427d0e WatchSource:0}: Error finding container 168d04832049200b136c91af0aea896d7a09729c28368d6fba590414ad427d0e: Status 404 returned error can't find the container with id 168d04832049200b136c91af0aea896d7a09729c28368d6fba590414ad427d0e Apr 20 15:12:06.001335 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:06.001297 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vqltq" event={"ID":"d8dbfb82-ff49-466c-8462-5c0f18ea8a1b","Type":"ContainerStarted","Data":"168d04832049200b136c91af0aea896d7a09729c28368d6fba590414ad427d0e"} Apr 20 15:12:09.012946 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:09.012854 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vqltq" event={"ID":"d8dbfb82-ff49-466c-8462-5c0f18ea8a1b","Type":"ContainerStarted","Data":"c25af830c9d0889014e68981b04425adf2884145342b75ae4a7418790d20cbb8"} Apr 20 15:12:09.013345 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:09.012992 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vqltq" Apr 20 15:12:09.028173 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:09.028120 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vqltq" podStartSLOduration=1.307239112 podStartE2EDuration="4.028106274s" podCreationTimestamp="2026-04-20 15:12:05 +0000 UTC" firstStartedPulling="2026-04-20 15:12:05.955562683 +0000 UTC m=+583.328149368" lastFinishedPulling="2026-04-20 15:12:08.676429846 +0000 UTC m=+586.049016530" observedRunningTime="2026-04-20 15:12:09.027589281 +0000 UTC m=+586.400175987" watchObservedRunningTime="2026-04-20 15:12:09.028106274 +0000 UTC m=+586.400692981" Apr 20 15:12:18.974240 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:18.974205 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vqltq"] Apr 20 15:12:18.974692 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:18.974452 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vqltq" podUID="d8dbfb82-ff49-466c-8462-5c0f18ea8a1b" containerName="manager" containerID="cri-o://c25af830c9d0889014e68981b04425adf2884145342b75ae4a7418790d20cbb8" gracePeriod=2 Apr 20 15:12:18.976428 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:18.976403 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vqltq" Apr 20 15:12:18.994138 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:18.994113 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vqltq"] Apr 20 15:12:19.022388 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:19.022361 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j6s5f"] Apr 20 15:12:19.022747 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:19.022734 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="d8dbfb82-ff49-466c-8462-5c0f18ea8a1b" containerName="manager" Apr 20 15:12:19.022802 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:19.022749 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8dbfb82-ff49-466c-8462-5c0f18ea8a1b" containerName="manager" Apr 20 15:12:19.022835 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:19.022812 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d8dbfb82-ff49-466c-8462-5c0f18ea8a1b" containerName="manager" Apr 20 15:12:19.024676 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:19.024659 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j6s5f" Apr 20 15:12:19.034006 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:19.033980 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd748\" (UniqueName: \"kubernetes.io/projected/110c0249-fda7-4bfa-bfd5-a1e2c2a5d5c3-kube-api-access-gd748\") pod \"limitador-operator-controller-manager-85c4996f8c-j6s5f\" (UID: \"110c0249-fda7-4bfa-bfd5-a1e2c2a5d5c3\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j6s5f" Apr 20 15:12:19.037666 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:19.037640 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j6s5f"] Apr 20 15:12:19.056209 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:19.056170 2575 status_manager.go:895] "Failed to get status for pod" podUID="d8dbfb82-ff49-466c-8462-5c0f18ea8a1b" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vqltq" err="pods \"limitador-operator-controller-manager-85c4996f8c-vqltq\" is forbidden: User \"system:node:ip-10-0-133-198.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-198.ec2.internal' and this 
object" Apr 20 15:12:19.135377 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:19.135337 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gd748\" (UniqueName: \"kubernetes.io/projected/110c0249-fda7-4bfa-bfd5-a1e2c2a5d5c3-kube-api-access-gd748\") pod \"limitador-operator-controller-manager-85c4996f8c-j6s5f\" (UID: \"110c0249-fda7-4bfa-bfd5-a1e2c2a5d5c3\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j6s5f" Apr 20 15:12:19.144178 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:19.144143 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd748\" (UniqueName: \"kubernetes.io/projected/110c0249-fda7-4bfa-bfd5-a1e2c2a5d5c3-kube-api-access-gd748\") pod \"limitador-operator-controller-manager-85c4996f8c-j6s5f\" (UID: \"110c0249-fda7-4bfa-bfd5-a1e2c2a5d5c3\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j6s5f" Apr 20 15:12:19.203609 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:19.203584 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vqltq" Apr 20 15:12:19.236541 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:19.236464 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2lxq\" (UniqueName: \"kubernetes.io/projected/d8dbfb82-ff49-466c-8462-5c0f18ea8a1b-kube-api-access-b2lxq\") pod \"d8dbfb82-ff49-466c-8462-5c0f18ea8a1b\" (UID: \"d8dbfb82-ff49-466c-8462-5c0f18ea8a1b\") " Apr 20 15:12:19.238614 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:19.238587 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8dbfb82-ff49-466c-8462-5c0f18ea8a1b-kube-api-access-b2lxq" (OuterVolumeSpecName: "kube-api-access-b2lxq") pod "d8dbfb82-ff49-466c-8462-5c0f18ea8a1b" (UID: "d8dbfb82-ff49-466c-8462-5c0f18ea8a1b"). 
InnerVolumeSpecName "kube-api-access-b2lxq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:12:19.337366 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:19.337333 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b2lxq\" (UniqueName: \"kubernetes.io/projected/d8dbfb82-ff49-466c-8462-5c0f18ea8a1b-kube-api-access-b2lxq\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 20 15:12:19.347208 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:19.347187 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j6s5f" Apr 20 15:12:19.468217 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:19.468197 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j6s5f"] Apr 20 15:12:19.470518 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:12:19.470492 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod110c0249_fda7_4bfa_bfd5_a1e2c2a5d5c3.slice/crio-6c581ff6e6d32372db9080aad9b4a1b8bc61303891f8b25a20f4342864612a7d WatchSource:0}: Error finding container 6c581ff6e6d32372db9080aad9b4a1b8bc61303891f8b25a20f4342864612a7d: Status 404 returned error can't find the container with id 6c581ff6e6d32372db9080aad9b4a1b8bc61303891f8b25a20f4342864612a7d Apr 20 15:12:20.050417 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:20.050380 2575 generic.go:358] "Generic (PLEG): container finished" podID="d8dbfb82-ff49-466c-8462-5c0f18ea8a1b" containerID="c25af830c9d0889014e68981b04425adf2884145342b75ae4a7418790d20cbb8" exitCode=0 Apr 20 15:12:20.050848 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:20.050431 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vqltq" Apr 20 15:12:20.050848 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:20.050464 2575 scope.go:117] "RemoveContainer" containerID="c25af830c9d0889014e68981b04425adf2884145342b75ae4a7418790d20cbb8" Apr 20 15:12:20.052153 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:20.052124 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j6s5f" event={"ID":"110c0249-fda7-4bfa-bfd5-a1e2c2a5d5c3","Type":"ContainerStarted","Data":"5e5a676a69b6b12353648c93d03520d5c9340099dc1f207a460f9559e8ad4fa0"} Apr 20 15:12:20.052298 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:20.052157 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j6s5f" event={"ID":"110c0249-fda7-4bfa-bfd5-a1e2c2a5d5c3","Type":"ContainerStarted","Data":"6c581ff6e6d32372db9080aad9b4a1b8bc61303891f8b25a20f4342864612a7d"} Apr 20 15:12:20.052362 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:20.052326 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j6s5f" Apr 20 15:12:20.059877 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:20.059857 2575 scope.go:117] "RemoveContainer" containerID="c25af830c9d0889014e68981b04425adf2884145342b75ae4a7418790d20cbb8" Apr 20 15:12:20.060115 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:12:20.060097 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c25af830c9d0889014e68981b04425adf2884145342b75ae4a7418790d20cbb8\": container with ID starting with c25af830c9d0889014e68981b04425adf2884145342b75ae4a7418790d20cbb8 not found: ID does not exist" containerID="c25af830c9d0889014e68981b04425adf2884145342b75ae4a7418790d20cbb8" Apr 20 15:12:20.060170 ip-10-0-133-198 
kubenswrapper[2575]: I0420 15:12:20.060125 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c25af830c9d0889014e68981b04425adf2884145342b75ae4a7418790d20cbb8"} err="failed to get container status \"c25af830c9d0889014e68981b04425adf2884145342b75ae4a7418790d20cbb8\": rpc error: code = NotFound desc = could not find container \"c25af830c9d0889014e68981b04425adf2884145342b75ae4a7418790d20cbb8\": container with ID starting with c25af830c9d0889014e68981b04425adf2884145342b75ae4a7418790d20cbb8 not found: ID does not exist" Apr 20 15:12:20.070129 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:20.070090 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j6s5f" podStartSLOduration=2.070080078 podStartE2EDuration="2.070080078s" podCreationTimestamp="2026-04-20 15:12:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:12:20.069211561 +0000 UTC m=+597.441798266" watchObservedRunningTime="2026-04-20 15:12:20.070080078 +0000 UTC m=+597.442666784" Apr 20 15:12:20.071160 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:20.071137 2575 status_manager.go:895] "Failed to get status for pod" podUID="d8dbfb82-ff49-466c-8462-5c0f18ea8a1b" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vqltq" err="pods \"limitador-operator-controller-manager-85c4996f8c-vqltq\" is forbidden: User \"system:node:ip-10-0-133-198.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-198.ec2.internal' and this object" Apr 20 15:12:21.182464 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:21.182429 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8dbfb82-ff49-466c-8462-5c0f18ea8a1b" 
path="/var/lib/kubelet/pods/d8dbfb82-ff49-466c-8462-5c0f18ea8a1b/volumes" Apr 20 15:12:23.092959 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:23.092928 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pbfc9_ab96425d-f444-4af7-9052-1bddacccef53/console-operator/1.log" Apr 20 15:12:23.093818 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:23.093786 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pbfc9_ab96425d-f444-4af7-9052-1bddacccef53/console-operator/1.log" Apr 20 15:12:31.065912 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:31.065879 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j6s5f" Apr 20 15:12:56.375462 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:56.375430 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-l98gk"] Apr 20 15:12:56.381520 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:56.381499 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-l98gk" Apr 20 15:12:56.384175 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:56.384149 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-49rcd\"" Apr 20 15:12:56.385756 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:56.385732 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-l98gk"] Apr 20 15:12:56.467589 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:56.467560 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rzvn\" (UniqueName: \"kubernetes.io/projected/a4702262-33ef-491b-ad3a-9cf04a0674ee-kube-api-access-2rzvn\") pod \"authorino-f99f4b5cd-l98gk\" (UID: \"a4702262-33ef-491b-ad3a-9cf04a0674ee\") " pod="kuadrant-system/authorino-f99f4b5cd-l98gk" Apr 20 15:12:56.527777 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:56.527740 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-52gvw"] Apr 20 15:12:56.530046 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:56.530030 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-52gvw" Apr 20 15:12:56.539215 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:56.539193 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-52gvw"] Apr 20 15:12:56.568039 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:56.568011 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2rzvn\" (UniqueName: \"kubernetes.io/projected/a4702262-33ef-491b-ad3a-9cf04a0674ee-kube-api-access-2rzvn\") pod \"authorino-f99f4b5cd-l98gk\" (UID: \"a4702262-33ef-491b-ad3a-9cf04a0674ee\") " pod="kuadrant-system/authorino-f99f4b5cd-l98gk" Apr 20 15:12:56.576336 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:56.576309 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rzvn\" (UniqueName: \"kubernetes.io/projected/a4702262-33ef-491b-ad3a-9cf04a0674ee-kube-api-access-2rzvn\") pod \"authorino-f99f4b5cd-l98gk\" (UID: \"a4702262-33ef-491b-ad3a-9cf04a0674ee\") " pod="kuadrant-system/authorino-f99f4b5cd-l98gk" Apr 20 15:12:56.668681 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:56.668597 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpkcs\" (UniqueName: \"kubernetes.io/projected/d2bd5487-94a0-499f-9d01-c82c49cc7513-kube-api-access-zpkcs\") pod \"authorino-7498df8756-52gvw\" (UID: \"d2bd5487-94a0-499f-9d01-c82c49cc7513\") " pod="kuadrant-system/authorino-7498df8756-52gvw" Apr 20 15:12:56.691638 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:56.691605 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-l98gk" Apr 20 15:12:56.770043 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:56.770013 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zpkcs\" (UniqueName: \"kubernetes.io/projected/d2bd5487-94a0-499f-9d01-c82c49cc7513-kube-api-access-zpkcs\") pod \"authorino-7498df8756-52gvw\" (UID: \"d2bd5487-94a0-499f-9d01-c82c49cc7513\") " pod="kuadrant-system/authorino-7498df8756-52gvw" Apr 20 15:12:56.779323 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:56.779265 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpkcs\" (UniqueName: \"kubernetes.io/projected/d2bd5487-94a0-499f-9d01-c82c49cc7513-kube-api-access-zpkcs\") pod \"authorino-7498df8756-52gvw\" (UID: \"d2bd5487-94a0-499f-9d01-c82c49cc7513\") " pod="kuadrant-system/authorino-7498df8756-52gvw" Apr 20 15:12:56.808890 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:56.808865 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-l98gk"] Apr 20 15:12:56.810698 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:12:56.810660 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4702262_33ef_491b_ad3a_9cf04a0674ee.slice/crio-d0347e4e9249bc092c3cc2733f1b8cbb6f1552d7f1f880f42b3ce5b112973cb4 WatchSource:0}: Error finding container d0347e4e9249bc092c3cc2733f1b8cbb6f1552d7f1f880f42b3ce5b112973cb4: Status 404 returned error can't find the container with id d0347e4e9249bc092c3cc2733f1b8cbb6f1552d7f1f880f42b3ce5b112973cb4 Apr 20 15:12:56.839644 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:56.839615 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-52gvw" Apr 20 15:12:56.960658 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:56.960629 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-52gvw"] Apr 20 15:12:56.962432 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:12:56.962404 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2bd5487_94a0_499f_9d01_c82c49cc7513.slice/crio-7282c545f9b3121a78dccb54f568247577fa8f1598433b9a646ea3e5a071b9c6 WatchSource:0}: Error finding container 7282c545f9b3121a78dccb54f568247577fa8f1598433b9a646ea3e5a071b9c6: Status 404 returned error can't find the container with id 7282c545f9b3121a78dccb54f568247577fa8f1598433b9a646ea3e5a071b9c6 Apr 20 15:12:57.182913 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:57.182882 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-52gvw" event={"ID":"d2bd5487-94a0-499f-9d01-c82c49cc7513","Type":"ContainerStarted","Data":"7282c545f9b3121a78dccb54f568247577fa8f1598433b9a646ea3e5a071b9c6"} Apr 20 15:12:57.183819 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:12:57.183794 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-l98gk" event={"ID":"a4702262-33ef-491b-ad3a-9cf04a0674ee","Type":"ContainerStarted","Data":"d0347e4e9249bc092c3cc2733f1b8cbb6f1552d7f1f880f42b3ce5b112973cb4"} Apr 20 15:13:00.198317 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:00.198262 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-l98gk" event={"ID":"a4702262-33ef-491b-ad3a-9cf04a0674ee","Type":"ContainerStarted","Data":"a69be12b6feb2adeab8b721dfab2807d6e13f968c4ce4e456b6a193f7a5b761d"} Apr 20 15:13:00.199542 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:00.199511 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/authorino-7498df8756-52gvw" event={"ID":"d2bd5487-94a0-499f-9d01-c82c49cc7513","Type":"ContainerStarted","Data":"ebdcfcb5f33f0cab069bee8c371585b2fe6a61f6e6007f53699149f57e2fe76f"} Apr 20 15:13:00.213642 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:00.213596 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-l98gk" podStartSLOduration=1.573033648 podStartE2EDuration="4.213577954s" podCreationTimestamp="2026-04-20 15:12:56 +0000 UTC" firstStartedPulling="2026-04-20 15:12:56.811898203 +0000 UTC m=+634.184484885" lastFinishedPulling="2026-04-20 15:12:59.452442496 +0000 UTC m=+636.825029191" observedRunningTime="2026-04-20 15:13:00.211781684 +0000 UTC m=+637.584368387" watchObservedRunningTime="2026-04-20 15:13:00.213577954 +0000 UTC m=+637.586164659" Apr 20 15:13:00.230043 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:00.229996 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-52gvw" podStartSLOduration=1.7456300649999998 podStartE2EDuration="4.229982859s" podCreationTimestamp="2026-04-20 15:12:56 +0000 UTC" firstStartedPulling="2026-04-20 15:12:56.963771385 +0000 UTC m=+634.336358069" lastFinishedPulling="2026-04-20 15:12:59.448124167 +0000 UTC m=+636.820710863" observedRunningTime="2026-04-20 15:13:00.229600113 +0000 UTC m=+637.602186824" watchObservedRunningTime="2026-04-20 15:13:00.229982859 +0000 UTC m=+637.602569565" Apr 20 15:13:00.254328 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:00.254292 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-l98gk"] Apr 20 15:13:02.206385 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:02.206347 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-l98gk" podUID="a4702262-33ef-491b-ad3a-9cf04a0674ee" containerName="authorino" 
containerID="cri-o://a69be12b6feb2adeab8b721dfab2807d6e13f968c4ce4e456b6a193f7a5b761d" gracePeriod=30 Apr 20 15:13:02.456456 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:02.456397 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-l98gk" Apr 20 15:13:02.521347 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:02.521315 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rzvn\" (UniqueName: \"kubernetes.io/projected/a4702262-33ef-491b-ad3a-9cf04a0674ee-kube-api-access-2rzvn\") pod \"a4702262-33ef-491b-ad3a-9cf04a0674ee\" (UID: \"a4702262-33ef-491b-ad3a-9cf04a0674ee\") " Apr 20 15:13:02.523415 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:02.523383 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4702262-33ef-491b-ad3a-9cf04a0674ee-kube-api-access-2rzvn" (OuterVolumeSpecName: "kube-api-access-2rzvn") pod "a4702262-33ef-491b-ad3a-9cf04a0674ee" (UID: "a4702262-33ef-491b-ad3a-9cf04a0674ee"). InnerVolumeSpecName "kube-api-access-2rzvn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:13:02.622034 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:02.621988 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2rzvn\" (UniqueName: \"kubernetes.io/projected/a4702262-33ef-491b-ad3a-9cf04a0674ee-kube-api-access-2rzvn\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 20 15:13:03.210138 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:03.210107 2575 generic.go:358] "Generic (PLEG): container finished" podID="a4702262-33ef-491b-ad3a-9cf04a0674ee" containerID="a69be12b6feb2adeab8b721dfab2807d6e13f968c4ce4e456b6a193f7a5b761d" exitCode=0 Apr 20 15:13:03.210589 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:03.210156 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-l98gk" Apr 20 15:13:03.210589 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:03.210156 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-l98gk" event={"ID":"a4702262-33ef-491b-ad3a-9cf04a0674ee","Type":"ContainerDied","Data":"a69be12b6feb2adeab8b721dfab2807d6e13f968c4ce4e456b6a193f7a5b761d"} Apr 20 15:13:03.210589 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:03.210267 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-l98gk" event={"ID":"a4702262-33ef-491b-ad3a-9cf04a0674ee","Type":"ContainerDied","Data":"d0347e4e9249bc092c3cc2733f1b8cbb6f1552d7f1f880f42b3ce5b112973cb4"} Apr 20 15:13:03.210589 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:03.210313 2575 scope.go:117] "RemoveContainer" containerID="a69be12b6feb2adeab8b721dfab2807d6e13f968c4ce4e456b6a193f7a5b761d" Apr 20 15:13:03.218691 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:03.218677 2575 scope.go:117] "RemoveContainer" containerID="a69be12b6feb2adeab8b721dfab2807d6e13f968c4ce4e456b6a193f7a5b761d" Apr 20 15:13:03.218931 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:13:03.218912 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a69be12b6feb2adeab8b721dfab2807d6e13f968c4ce4e456b6a193f7a5b761d\": container with ID starting with a69be12b6feb2adeab8b721dfab2807d6e13f968c4ce4e456b6a193f7a5b761d not found: ID does not exist" containerID="a69be12b6feb2adeab8b721dfab2807d6e13f968c4ce4e456b6a193f7a5b761d" Apr 20 15:13:03.219003 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:03.218938 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a69be12b6feb2adeab8b721dfab2807d6e13f968c4ce4e456b6a193f7a5b761d"} err="failed to get container status \"a69be12b6feb2adeab8b721dfab2807d6e13f968c4ce4e456b6a193f7a5b761d\": rpc error: code = 
NotFound desc = could not find container \"a69be12b6feb2adeab8b721dfab2807d6e13f968c4ce4e456b6a193f7a5b761d\": container with ID starting with a69be12b6feb2adeab8b721dfab2807d6e13f968c4ce4e456b6a193f7a5b761d not found: ID does not exist" Apr 20 15:13:03.225227 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:03.225207 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-l98gk"] Apr 20 15:13:03.226968 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:03.226944 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-l98gk"] Apr 20 15:13:05.182644 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:05.182604 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4702262-33ef-491b-ad3a-9cf04a0674ee" path="/var/lib/kubelet/pods/a4702262-33ef-491b-ad3a-9cf04a0674ee/volumes" Apr 20 15:13:24.022254 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:24.022220 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-dt4jp"] Apr 20 15:13:24.022739 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:24.022633 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4702262-33ef-491b-ad3a-9cf04a0674ee" containerName="authorino" Apr 20 15:13:24.022739 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:24.022647 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4702262-33ef-491b-ad3a-9cf04a0674ee" containerName="authorino" Apr 20 15:13:24.022739 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:24.022710 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="a4702262-33ef-491b-ad3a-9cf04a0674ee" containerName="authorino" Apr 20 15:13:24.024943 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:24.024928 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-dt4jp" Apr 20 15:13:24.032205 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:24.032175 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-dt4jp"] Apr 20 15:13:24.109405 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:24.109356 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs96j\" (UniqueName: \"kubernetes.io/projected/d0525599-7d65-4901-8999-d4a2ca2ea99b-kube-api-access-hs96j\") pod \"authorino-8b475cf9f-dt4jp\" (UID: \"d0525599-7d65-4901-8999-d4a2ca2ea99b\") " pod="kuadrant-system/authorino-8b475cf9f-dt4jp" Apr 20 15:13:24.210170 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:24.210130 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hs96j\" (UniqueName: \"kubernetes.io/projected/d0525599-7d65-4901-8999-d4a2ca2ea99b-kube-api-access-hs96j\") pod \"authorino-8b475cf9f-dt4jp\" (UID: \"d0525599-7d65-4901-8999-d4a2ca2ea99b\") " pod="kuadrant-system/authorino-8b475cf9f-dt4jp" Apr 20 15:13:24.219182 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:24.219155 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs96j\" (UniqueName: \"kubernetes.io/projected/d0525599-7d65-4901-8999-d4a2ca2ea99b-kube-api-access-hs96j\") pod \"authorino-8b475cf9f-dt4jp\" (UID: \"d0525599-7d65-4901-8999-d4a2ca2ea99b\") " pod="kuadrant-system/authorino-8b475cf9f-dt4jp" Apr 20 15:13:24.254198 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:24.254163 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-dt4jp"] Apr 20 15:13:24.254452 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:24.254437 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-dt4jp" Apr 20 15:13:24.380651 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:24.380625 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-dt4jp"] Apr 20 15:13:24.382997 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:13:24.382967 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0525599_7d65_4901_8999_d4a2ca2ea99b.slice/crio-9ac39faa5c419aaa17c3244d39b8d1ce6bb0c7972627798ec73134a54247e9af WatchSource:0}: Error finding container 9ac39faa5c419aaa17c3244d39b8d1ce6bb0c7972627798ec73134a54247e9af: Status 404 returned error can't find the container with id 9ac39faa5c419aaa17c3244d39b8d1ce6bb0c7972627798ec73134a54247e9af Apr 20 15:13:24.539887 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:24.539847 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7c6cfd5998-d85mn"] Apr 20 15:13:24.544371 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:24.544343 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7c6cfd5998-d85mn" Apr 20 15:13:24.547133 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:24.547106 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 20 15:13:24.566669 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:24.566582 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7c6cfd5998-d85mn"] Apr 20 15:13:24.614328 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:24.614266 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/30475fd6-292b-4162-85fd-be084af09804-tls-cert\") pod \"authorino-7c6cfd5998-d85mn\" (UID: \"30475fd6-292b-4162-85fd-be084af09804\") " pod="kuadrant-system/authorino-7c6cfd5998-d85mn" Apr 20 15:13:24.614490 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:24.614339 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfvcb\" (UniqueName: \"kubernetes.io/projected/30475fd6-292b-4162-85fd-be084af09804-kube-api-access-tfvcb\") pod \"authorino-7c6cfd5998-d85mn\" (UID: \"30475fd6-292b-4162-85fd-be084af09804\") " pod="kuadrant-system/authorino-7c6cfd5998-d85mn" Apr 20 15:13:24.714929 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:24.714892 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/30475fd6-292b-4162-85fd-be084af09804-tls-cert\") pod \"authorino-7c6cfd5998-d85mn\" (UID: \"30475fd6-292b-4162-85fd-be084af09804\") " pod="kuadrant-system/authorino-7c6cfd5998-d85mn" Apr 20 15:13:24.714929 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:24.714929 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tfvcb\" (UniqueName: 
\"kubernetes.io/projected/30475fd6-292b-4162-85fd-be084af09804-kube-api-access-tfvcb\") pod \"authorino-7c6cfd5998-d85mn\" (UID: \"30475fd6-292b-4162-85fd-be084af09804\") " pod="kuadrant-system/authorino-7c6cfd5998-d85mn" Apr 20 15:13:24.717478 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:24.717446 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/30475fd6-292b-4162-85fd-be084af09804-tls-cert\") pod \"authorino-7c6cfd5998-d85mn\" (UID: \"30475fd6-292b-4162-85fd-be084af09804\") " pod="kuadrant-system/authorino-7c6cfd5998-d85mn" Apr 20 15:13:24.723552 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:24.723524 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfvcb\" (UniqueName: \"kubernetes.io/projected/30475fd6-292b-4162-85fd-be084af09804-kube-api-access-tfvcb\") pod \"authorino-7c6cfd5998-d85mn\" (UID: \"30475fd6-292b-4162-85fd-be084af09804\") " pod="kuadrant-system/authorino-7c6cfd5998-d85mn" Apr 20 15:13:24.854281 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:24.854250 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7c6cfd5998-d85mn" Apr 20 15:13:24.984885 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:24.984858 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7c6cfd5998-d85mn"] Apr 20 15:13:24.986984 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:13:24.986956 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30475fd6_292b_4162_85fd_be084af09804.slice/crio-7df2dd2b0599f9124fcf9a838ca34d82b223763bbe4ad264101e6a2498fd9f8c WatchSource:0}: Error finding container 7df2dd2b0599f9124fcf9a838ca34d82b223763bbe4ad264101e6a2498fd9f8c: Status 404 returned error can't find the container with id 7df2dd2b0599f9124fcf9a838ca34d82b223763bbe4ad264101e6a2498fd9f8c Apr 20 15:13:25.288905 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:25.288804 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-dt4jp" event={"ID":"d0525599-7d65-4901-8999-d4a2ca2ea99b","Type":"ContainerStarted","Data":"bc8e48de2ac8d301c554603086dc2acb2e35f78e6f75a6ca62405e5c35927b1e"} Apr 20 15:13:25.288905 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:25.288849 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-dt4jp" event={"ID":"d0525599-7d65-4901-8999-d4a2ca2ea99b","Type":"ContainerStarted","Data":"9ac39faa5c419aaa17c3244d39b8d1ce6bb0c7972627798ec73134a54247e9af"} Apr 20 15:13:25.288905 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:25.288854 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-dt4jp" podUID="d0525599-7d65-4901-8999-d4a2ca2ea99b" containerName="authorino" containerID="cri-o://bc8e48de2ac8d301c554603086dc2acb2e35f78e6f75a6ca62405e5c35927b1e" gracePeriod=30 Apr 20 15:13:25.290343 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:25.290314 2575 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kuadrant-system/authorino-7c6cfd5998-d85mn" event={"ID":"30475fd6-292b-4162-85fd-be084af09804","Type":"ContainerStarted","Data":"7df2dd2b0599f9124fcf9a838ca34d82b223763bbe4ad264101e6a2498fd9f8c"} Apr 20 15:13:25.303017 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:25.302974 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-dt4jp" podStartSLOduration=0.878637499 podStartE2EDuration="1.302959454s" podCreationTimestamp="2026-04-20 15:13:24 +0000 UTC" firstStartedPulling="2026-04-20 15:13:24.384358269 +0000 UTC m=+661.756944951" lastFinishedPulling="2026-04-20 15:13:24.808680218 +0000 UTC m=+662.181266906" observedRunningTime="2026-04-20 15:13:25.301914645 +0000 UTC m=+662.674501362" watchObservedRunningTime="2026-04-20 15:13:25.302959454 +0000 UTC m=+662.675546158" Apr 20 15:13:25.526412 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:25.526385 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-dt4jp" Apr 20 15:13:25.623618 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:25.623588 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hs96j\" (UniqueName: \"kubernetes.io/projected/d0525599-7d65-4901-8999-d4a2ca2ea99b-kube-api-access-hs96j\") pod \"d0525599-7d65-4901-8999-d4a2ca2ea99b\" (UID: \"d0525599-7d65-4901-8999-d4a2ca2ea99b\") " Apr 20 15:13:25.625608 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:25.625587 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0525599-7d65-4901-8999-d4a2ca2ea99b-kube-api-access-hs96j" (OuterVolumeSpecName: "kube-api-access-hs96j") pod "d0525599-7d65-4901-8999-d4a2ca2ea99b" (UID: "d0525599-7d65-4901-8999-d4a2ca2ea99b"). InnerVolumeSpecName "kube-api-access-hs96j". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:13:25.724438 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:25.724402 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hs96j\" (UniqueName: \"kubernetes.io/projected/d0525599-7d65-4901-8999-d4a2ca2ea99b-kube-api-access-hs96j\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 20 15:13:26.294718 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:26.294683 2575 generic.go:358] "Generic (PLEG): container finished" podID="d0525599-7d65-4901-8999-d4a2ca2ea99b" containerID="bc8e48de2ac8d301c554603086dc2acb2e35f78e6f75a6ca62405e5c35927b1e" exitCode=0 Apr 20 15:13:26.295116 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:26.294730 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-dt4jp" Apr 20 15:13:26.295116 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:26.294765 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-dt4jp" event={"ID":"d0525599-7d65-4901-8999-d4a2ca2ea99b","Type":"ContainerDied","Data":"bc8e48de2ac8d301c554603086dc2acb2e35f78e6f75a6ca62405e5c35927b1e"} Apr 20 15:13:26.295116 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:26.294808 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-dt4jp" event={"ID":"d0525599-7d65-4901-8999-d4a2ca2ea99b","Type":"ContainerDied","Data":"9ac39faa5c419aaa17c3244d39b8d1ce6bb0c7972627798ec73134a54247e9af"} Apr 20 15:13:26.295116 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:26.294832 2575 scope.go:117] "RemoveContainer" containerID="bc8e48de2ac8d301c554603086dc2acb2e35f78e6f75a6ca62405e5c35927b1e" Apr 20 15:13:26.296327 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:26.296301 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7c6cfd5998-d85mn" 
event={"ID":"30475fd6-292b-4162-85fd-be084af09804","Type":"ContainerStarted","Data":"e1c53d06968114dcbbe7e231d718d0c0ebc797291cdeb5d1fd73f07aea39bca7"} Apr 20 15:13:26.303718 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:26.303699 2575 scope.go:117] "RemoveContainer" containerID="bc8e48de2ac8d301c554603086dc2acb2e35f78e6f75a6ca62405e5c35927b1e" Apr 20 15:13:26.303966 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:13:26.303947 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc8e48de2ac8d301c554603086dc2acb2e35f78e6f75a6ca62405e5c35927b1e\": container with ID starting with bc8e48de2ac8d301c554603086dc2acb2e35f78e6f75a6ca62405e5c35927b1e not found: ID does not exist" containerID="bc8e48de2ac8d301c554603086dc2acb2e35f78e6f75a6ca62405e5c35927b1e" Apr 20 15:13:26.304009 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:26.303975 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc8e48de2ac8d301c554603086dc2acb2e35f78e6f75a6ca62405e5c35927b1e"} err="failed to get container status \"bc8e48de2ac8d301c554603086dc2acb2e35f78e6f75a6ca62405e5c35927b1e\": rpc error: code = NotFound desc = could not find container \"bc8e48de2ac8d301c554603086dc2acb2e35f78e6f75a6ca62405e5c35927b1e\": container with ID starting with bc8e48de2ac8d301c554603086dc2acb2e35f78e6f75a6ca62405e5c35927b1e not found: ID does not exist" Apr 20 15:13:26.314636 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:26.314594 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7c6cfd5998-d85mn" podStartSLOduration=1.856698771 podStartE2EDuration="2.314582346s" podCreationTimestamp="2026-04-20 15:13:24 +0000 UTC" firstStartedPulling="2026-04-20 15:13:24.988246397 +0000 UTC m=+662.360833079" lastFinishedPulling="2026-04-20 15:13:25.446129972 +0000 UTC m=+662.818716654" observedRunningTime="2026-04-20 15:13:26.312963864 +0000 UTC 
m=+663.685550572" watchObservedRunningTime="2026-04-20 15:13:26.314582346 +0000 UTC m=+663.687169049" Apr 20 15:13:26.330548 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:26.330521 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-dt4jp"] Apr 20 15:13:26.336634 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:26.336611 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-dt4jp"] Apr 20 15:13:26.344237 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:26.344210 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-52gvw"] Apr 20 15:13:26.344515 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:26.344488 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-52gvw" podUID="d2bd5487-94a0-499f-9d01-c82c49cc7513" containerName="authorino" containerID="cri-o://ebdcfcb5f33f0cab069bee8c371585b2fe6a61f6e6007f53699149f57e2fe76f" gracePeriod=30 Apr 20 15:13:26.582537 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:26.582511 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-52gvw" Apr 20 15:13:26.732640 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:26.732597 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpkcs\" (UniqueName: \"kubernetes.io/projected/d2bd5487-94a0-499f-9d01-c82c49cc7513-kube-api-access-zpkcs\") pod \"d2bd5487-94a0-499f-9d01-c82c49cc7513\" (UID: \"d2bd5487-94a0-499f-9d01-c82c49cc7513\") " Apr 20 15:13:26.734768 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:26.734737 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2bd5487-94a0-499f-9d01-c82c49cc7513-kube-api-access-zpkcs" (OuterVolumeSpecName: "kube-api-access-zpkcs") pod "d2bd5487-94a0-499f-9d01-c82c49cc7513" (UID: "d2bd5487-94a0-499f-9d01-c82c49cc7513"). InnerVolumeSpecName "kube-api-access-zpkcs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:13:26.833979 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:26.833939 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zpkcs\" (UniqueName: \"kubernetes.io/projected/d2bd5487-94a0-499f-9d01-c82c49cc7513-kube-api-access-zpkcs\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 20 15:13:27.183114 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.183028 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0525599-7d65-4901-8999-d4a2ca2ea99b" path="/var/lib/kubelet/pods/d0525599-7d65-4901-8999-d4a2ca2ea99b/volumes" Apr 20 15:13:27.200393 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.200359 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-lflh6"] Apr 20 15:13:27.200775 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.200761 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d0525599-7d65-4901-8999-d4a2ca2ea99b" containerName="authorino" Apr 20 15:13:27.200833 ip-10-0-133-198 
kubenswrapper[2575]: I0420 15:13:27.200776 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0525599-7d65-4901-8999-d4a2ca2ea99b" containerName="authorino" Apr 20 15:13:27.200833 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.200793 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d2bd5487-94a0-499f-9d01-c82c49cc7513" containerName="authorino" Apr 20 15:13:27.200833 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.200799 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2bd5487-94a0-499f-9d01-c82c49cc7513" containerName="authorino" Apr 20 15:13:27.200923 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.200848 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d2bd5487-94a0-499f-9d01-c82c49cc7513" containerName="authorino" Apr 20 15:13:27.200923 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.200860 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d0525599-7d65-4901-8999-d4a2ca2ea99b" containerName="authorino" Apr 20 15:13:27.202797 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.202779 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-lflh6" Apr 20 15:13:27.205612 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.205597 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-hs4dp\"" Apr 20 15:13:27.214666 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.214644 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-lflh6"] Apr 20 15:13:27.301244 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.301208 2575 generic.go:358] "Generic (PLEG): container finished" podID="d2bd5487-94a0-499f-9d01-c82c49cc7513" containerID="ebdcfcb5f33f0cab069bee8c371585b2fe6a61f6e6007f53699149f57e2fe76f" exitCode=0 Apr 20 15:13:27.301702 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.301255 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-52gvw" Apr 20 15:13:27.301702 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.301312 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-52gvw" event={"ID":"d2bd5487-94a0-499f-9d01-c82c49cc7513","Type":"ContainerDied","Data":"ebdcfcb5f33f0cab069bee8c371585b2fe6a61f6e6007f53699149f57e2fe76f"} Apr 20 15:13:27.301702 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.301341 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-52gvw" event={"ID":"d2bd5487-94a0-499f-9d01-c82c49cc7513","Type":"ContainerDied","Data":"7282c545f9b3121a78dccb54f568247577fa8f1598433b9a646ea3e5a071b9c6"} Apr 20 15:13:27.301702 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.301358 2575 scope.go:117] "RemoveContainer" containerID="ebdcfcb5f33f0cab069bee8c371585b2fe6a61f6e6007f53699149f57e2fe76f" Apr 20 15:13:27.309161 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.309146 2575 scope.go:117] "RemoveContainer" 
containerID="ebdcfcb5f33f0cab069bee8c371585b2fe6a61f6e6007f53699149f57e2fe76f" Apr 20 15:13:27.309452 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:13:27.309434 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebdcfcb5f33f0cab069bee8c371585b2fe6a61f6e6007f53699149f57e2fe76f\": container with ID starting with ebdcfcb5f33f0cab069bee8c371585b2fe6a61f6e6007f53699149f57e2fe76f not found: ID does not exist" containerID="ebdcfcb5f33f0cab069bee8c371585b2fe6a61f6e6007f53699149f57e2fe76f" Apr 20 15:13:27.309508 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.309461 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebdcfcb5f33f0cab069bee8c371585b2fe6a61f6e6007f53699149f57e2fe76f"} err="failed to get container status \"ebdcfcb5f33f0cab069bee8c371585b2fe6a61f6e6007f53699149f57e2fe76f\": rpc error: code = NotFound desc = could not find container \"ebdcfcb5f33f0cab069bee8c371585b2fe6a61f6e6007f53699149f57e2fe76f\": container with ID starting with ebdcfcb5f33f0cab069bee8c371585b2fe6a61f6e6007f53699149f57e2fe76f not found: ID does not exist" Apr 20 15:13:27.328619 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.328593 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-52gvw"] Apr 20 15:13:27.331206 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.331182 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-52gvw"] Apr 20 15:13:27.337991 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.337974 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqmgx\" (UniqueName: \"kubernetes.io/projected/32e3d05b-1538-4c62-94c7-5fd18f090bd4-kube-api-access-jqmgx\") pod \"maas-controller-6d4c8f55f9-lflh6\" (UID: \"32e3d05b-1538-4c62-94c7-5fd18f090bd4\") " pod="opendatahub/maas-controller-6d4c8f55f9-lflh6" 
Apr 20 15:13:27.383873 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.383833 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-677bc9fcf9-77ws5"] Apr 20 15:13:27.386537 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.386519 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-677bc9fcf9-77ws5" Apr 20 15:13:27.397693 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.397669 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-677bc9fcf9-77ws5"] Apr 20 15:13:27.438891 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.438810 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jqmgx\" (UniqueName: \"kubernetes.io/projected/32e3d05b-1538-4c62-94c7-5fd18f090bd4-kube-api-access-jqmgx\") pod \"maas-controller-6d4c8f55f9-lflh6\" (UID: \"32e3d05b-1538-4c62-94c7-5fd18f090bd4\") " pod="opendatahub/maas-controller-6d4c8f55f9-lflh6" Apr 20 15:13:27.448258 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.448232 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqmgx\" (UniqueName: \"kubernetes.io/projected/32e3d05b-1538-4c62-94c7-5fd18f090bd4-kube-api-access-jqmgx\") pod \"maas-controller-6d4c8f55f9-lflh6\" (UID: \"32e3d05b-1538-4c62-94c7-5fd18f090bd4\") " pod="opendatahub/maas-controller-6d4c8f55f9-lflh6" Apr 20 15:13:27.498916 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.498879 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-677bc9fcf9-77ws5"] Apr 20 15:13:27.499162 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:13:27.499143 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-dr2gv], unattached volumes=[], failed to process volumes=[]: context canceled" pod="opendatahub/maas-controller-677bc9fcf9-77ws5" podUID="4991668b-127c-4e30-a683-8358c2bbf40e" Apr 20 
15:13:27.513248 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.513214 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-lflh6" Apr 20 15:13:27.528054 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.528025 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6654fc9c8b-g7cwt"] Apr 20 15:13:27.530770 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.530750 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6654fc9c8b-g7cwt" Apr 20 15:13:27.539820 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.539727 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr2gv\" (UniqueName: \"kubernetes.io/projected/4991668b-127c-4e30-a683-8358c2bbf40e-kube-api-access-dr2gv\") pod \"maas-controller-677bc9fcf9-77ws5\" (UID: \"4991668b-127c-4e30-a683-8358c2bbf40e\") " pod="opendatahub/maas-controller-677bc9fcf9-77ws5" Apr 20 15:13:27.540555 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.540534 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6654fc9c8b-g7cwt"] Apr 20 15:13:27.640363 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.640329 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gd44\" (UniqueName: \"kubernetes.io/projected/223f1f2a-88a3-45ca-a5e3-c1f7b4825a42-kube-api-access-5gd44\") pod \"maas-controller-6654fc9c8b-g7cwt\" (UID: \"223f1f2a-88a3-45ca-a5e3-c1f7b4825a42\") " pod="opendatahub/maas-controller-6654fc9c8b-g7cwt" Apr 20 15:13:27.640532 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.640373 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dr2gv\" (UniqueName: \"kubernetes.io/projected/4991668b-127c-4e30-a683-8358c2bbf40e-kube-api-access-dr2gv\") pod 
\"maas-controller-677bc9fcf9-77ws5\" (UID: \"4991668b-127c-4e30-a683-8358c2bbf40e\") " pod="opendatahub/maas-controller-677bc9fcf9-77ws5" Apr 20 15:13:27.648945 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.648915 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr2gv\" (UniqueName: \"kubernetes.io/projected/4991668b-127c-4e30-a683-8358c2bbf40e-kube-api-access-dr2gv\") pod \"maas-controller-677bc9fcf9-77ws5\" (UID: \"4991668b-127c-4e30-a683-8358c2bbf40e\") " pod="opendatahub/maas-controller-677bc9fcf9-77ws5" Apr 20 15:13:27.741904 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.741806 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5gd44\" (UniqueName: \"kubernetes.io/projected/223f1f2a-88a3-45ca-a5e3-c1f7b4825a42-kube-api-access-5gd44\") pod \"maas-controller-6654fc9c8b-g7cwt\" (UID: \"223f1f2a-88a3-45ca-a5e3-c1f7b4825a42\") " pod="opendatahub/maas-controller-6654fc9c8b-g7cwt" Apr 20 15:13:27.749785 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.749753 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gd44\" (UniqueName: \"kubernetes.io/projected/223f1f2a-88a3-45ca-a5e3-c1f7b4825a42-kube-api-access-5gd44\") pod \"maas-controller-6654fc9c8b-g7cwt\" (UID: \"223f1f2a-88a3-45ca-a5e3-c1f7b4825a42\") " pod="opendatahub/maas-controller-6654fc9c8b-g7cwt" Apr 20 15:13:27.843960 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.843931 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-lflh6"] Apr 20 15:13:27.845501 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.845479 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6654fc9c8b-g7cwt" Apr 20 15:13:27.845702 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:13:27.845679 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32e3d05b_1538_4c62_94c7_5fd18f090bd4.slice/crio-5c94c605dae8825f3df2d2015d824cc7479724b36063963cfb49d6ec22c27c38 WatchSource:0}: Error finding container 5c94c605dae8825f3df2d2015d824cc7479724b36063963cfb49d6ec22c27c38: Status 404 returned error can't find the container with id 5c94c605dae8825f3df2d2015d824cc7479724b36063963cfb49d6ec22c27c38 Apr 20 15:13:27.964720 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:27.964695 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6654fc9c8b-g7cwt"] Apr 20 15:13:27.967188 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:13:27.967162 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod223f1f2a_88a3_45ca_a5e3_c1f7b4825a42.slice/crio-bde6c4353521182d48786b923fa347df6583ecc4c34864ff0c5be7eeda5cd0bd WatchSource:0}: Error finding container bde6c4353521182d48786b923fa347df6583ecc4c34864ff0c5be7eeda5cd0bd: Status 404 returned error can't find the container with id bde6c4353521182d48786b923fa347df6583ecc4c34864ff0c5be7eeda5cd0bd Apr 20 15:13:28.310521 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:28.310471 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-lflh6" event={"ID":"32e3d05b-1538-4c62-94c7-5fd18f090bd4","Type":"ContainerStarted","Data":"5c94c605dae8825f3df2d2015d824cc7479724b36063963cfb49d6ec22c27c38"} Apr 20 15:13:28.312513 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:28.312423 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6654fc9c8b-g7cwt" 
event={"ID":"223f1f2a-88a3-45ca-a5e3-c1f7b4825a42","Type":"ContainerStarted","Data":"bde6c4353521182d48786b923fa347df6583ecc4c34864ff0c5be7eeda5cd0bd"} Apr 20 15:13:28.312513 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:28.312488 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-677bc9fcf9-77ws5" Apr 20 15:13:28.319094 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:28.319003 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-677bc9fcf9-77ws5" Apr 20 15:13:28.448168 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:28.448028 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dr2gv\" (UniqueName: \"kubernetes.io/projected/4991668b-127c-4e30-a683-8358c2bbf40e-kube-api-access-dr2gv\") pod \"4991668b-127c-4e30-a683-8358c2bbf40e\" (UID: \"4991668b-127c-4e30-a683-8358c2bbf40e\") " Apr 20 15:13:28.451793 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:28.451748 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4991668b-127c-4e30-a683-8358c2bbf40e-kube-api-access-dr2gv" (OuterVolumeSpecName: "kube-api-access-dr2gv") pod "4991668b-127c-4e30-a683-8358c2bbf40e" (UID: "4991668b-127c-4e30-a683-8358c2bbf40e"). InnerVolumeSpecName "kube-api-access-dr2gv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:13:28.549826 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:28.549787 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dr2gv\" (UniqueName: \"kubernetes.io/projected/4991668b-127c-4e30-a683-8358c2bbf40e-kube-api-access-dr2gv\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 20 15:13:29.186234 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:29.186196 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2bd5487-94a0-499f-9d01-c82c49cc7513" path="/var/lib/kubelet/pods/d2bd5487-94a0-499f-9d01-c82c49cc7513/volumes" Apr 20 15:13:29.316986 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:29.316504 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-677bc9fcf9-77ws5" Apr 20 15:13:29.348058 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:29.348021 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-677bc9fcf9-77ws5"] Apr 20 15:13:29.353234 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:29.353204 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-677bc9fcf9-77ws5"] Apr 20 15:13:31.183717 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:31.183624 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4991668b-127c-4e30-a683-8358c2bbf40e" path="/var/lib/kubelet/pods/4991668b-127c-4e30-a683-8358c2bbf40e/volumes" Apr 20 15:13:31.326601 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:31.326569 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-lflh6" event={"ID":"32e3d05b-1538-4c62-94c7-5fd18f090bd4","Type":"ContainerStarted","Data":"793e37b0bca78ca8aeec2485baaad71a3edbe936c0ca345fce2a7bf874670642"} Apr 20 15:13:31.326759 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:31.326653 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="opendatahub/maas-controller-6d4c8f55f9-lflh6" Apr 20 15:13:31.327876 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:31.327854 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6654fc9c8b-g7cwt" event={"ID":"223f1f2a-88a3-45ca-a5e3-c1f7b4825a42","Type":"ContainerStarted","Data":"52d03e1cc8a467d3183a4cd438503d51e44ae4b06937abaed18179aab97b5f94"} Apr 20 15:13:31.327983 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:31.327961 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6654fc9c8b-g7cwt" Apr 20 15:13:31.343401 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:31.343355 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d4c8f55f9-lflh6" podStartSLOduration=1.3710231990000001 podStartE2EDuration="4.343342482s" podCreationTimestamp="2026-04-20 15:13:27 +0000 UTC" firstStartedPulling="2026-04-20 15:13:27.847069251 +0000 UTC m=+665.219655948" lastFinishedPulling="2026-04-20 15:13:30.819388546 +0000 UTC m=+668.191975231" observedRunningTime="2026-04-20 15:13:31.341789367 +0000 UTC m=+668.714376073" watchObservedRunningTime="2026-04-20 15:13:31.343342482 +0000 UTC m=+668.715929185" Apr 20 15:13:31.357558 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:31.357497 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6654fc9c8b-g7cwt" podStartSLOduration=1.505852815 podStartE2EDuration="4.357482928s" podCreationTimestamp="2026-04-20 15:13:27 +0000 UTC" firstStartedPulling="2026-04-20 15:13:27.968508517 +0000 UTC m=+665.341095201" lastFinishedPulling="2026-04-20 15:13:30.820138628 +0000 UTC m=+668.192725314" observedRunningTime="2026-04-20 15:13:31.35699803 +0000 UTC m=+668.729584744" watchObservedRunningTime="2026-04-20 15:13:31.357482928 +0000 UTC m=+668.730069632" Apr 20 15:13:42.337216 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:42.337180 2575 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-6654fc9c8b-g7cwt" Apr 20 15:13:42.337644 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:42.337243 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-6d4c8f55f9-lflh6" Apr 20 15:13:42.395573 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:42.395539 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-lflh6"] Apr 20 15:13:42.395771 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:42.395748 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-lflh6" podUID="32e3d05b-1538-4c62-94c7-5fd18f090bd4" containerName="manager" containerID="cri-o://793e37b0bca78ca8aeec2485baaad71a3edbe936c0ca345fce2a7bf874670642" gracePeriod=10 Apr 20 15:13:42.641729 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:42.641706 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-lflh6" Apr 20 15:13:42.780806 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:42.780767 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqmgx\" (UniqueName: \"kubernetes.io/projected/32e3d05b-1538-4c62-94c7-5fd18f090bd4-kube-api-access-jqmgx\") pod \"32e3d05b-1538-4c62-94c7-5fd18f090bd4\" (UID: \"32e3d05b-1538-4c62-94c7-5fd18f090bd4\") " Apr 20 15:13:42.783070 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:42.783043 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32e3d05b-1538-4c62-94c7-5fd18f090bd4-kube-api-access-jqmgx" (OuterVolumeSpecName: "kube-api-access-jqmgx") pod "32e3d05b-1538-4c62-94c7-5fd18f090bd4" (UID: "32e3d05b-1538-4c62-94c7-5fd18f090bd4"). InnerVolumeSpecName "kube-api-access-jqmgx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:13:42.881743 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:42.881656 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jqmgx\" (UniqueName: \"kubernetes.io/projected/32e3d05b-1538-4c62-94c7-5fd18f090bd4-kube-api-access-jqmgx\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 20 15:13:43.370845 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:43.370813 2575 generic.go:358] "Generic (PLEG): container finished" podID="32e3d05b-1538-4c62-94c7-5fd18f090bd4" containerID="793e37b0bca78ca8aeec2485baaad71a3edbe936c0ca345fce2a7bf874670642" exitCode=0 Apr 20 15:13:43.371340 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:43.370854 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-lflh6" event={"ID":"32e3d05b-1538-4c62-94c7-5fd18f090bd4","Type":"ContainerDied","Data":"793e37b0bca78ca8aeec2485baaad71a3edbe936c0ca345fce2a7bf874670642"} Apr 20 15:13:43.371340 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:43.370874 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-lflh6" Apr 20 15:13:43.371340 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:43.370896 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-lflh6" event={"ID":"32e3d05b-1538-4c62-94c7-5fd18f090bd4","Type":"ContainerDied","Data":"5c94c605dae8825f3df2d2015d824cc7479724b36063963cfb49d6ec22c27c38"} Apr 20 15:13:43.371340 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:43.370911 2575 scope.go:117] "RemoveContainer" containerID="793e37b0bca78ca8aeec2485baaad71a3edbe936c0ca345fce2a7bf874670642" Apr 20 15:13:43.379326 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:43.379309 2575 scope.go:117] "RemoveContainer" containerID="793e37b0bca78ca8aeec2485baaad71a3edbe936c0ca345fce2a7bf874670642" Apr 20 15:13:43.379565 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:13:43.379545 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"793e37b0bca78ca8aeec2485baaad71a3edbe936c0ca345fce2a7bf874670642\": container with ID starting with 793e37b0bca78ca8aeec2485baaad71a3edbe936c0ca345fce2a7bf874670642 not found: ID does not exist" containerID="793e37b0bca78ca8aeec2485baaad71a3edbe936c0ca345fce2a7bf874670642" Apr 20 15:13:43.379607 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:43.379575 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"793e37b0bca78ca8aeec2485baaad71a3edbe936c0ca345fce2a7bf874670642"} err="failed to get container status \"793e37b0bca78ca8aeec2485baaad71a3edbe936c0ca345fce2a7bf874670642\": rpc error: code = NotFound desc = could not find container \"793e37b0bca78ca8aeec2485baaad71a3edbe936c0ca345fce2a7bf874670642\": container with ID starting with 793e37b0bca78ca8aeec2485baaad71a3edbe936c0ca345fce2a7bf874670642 not found: ID does not exist" Apr 20 15:13:43.387607 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:43.387585 2575 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-lflh6"] Apr 20 15:13:43.391810 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:43.391787 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-lflh6"] Apr 20 15:13:45.182128 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:45.182097 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32e3d05b-1538-4c62-94c7-5fd18f090bd4" path="/var/lib/kubelet/pods/32e3d05b-1538-4c62-94c7-5fd18f090bd4/volumes" Apr 20 15:13:56.623321 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:56.623260 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6654fc9c8b-g7cwt"] Apr 20 15:13:56.623909 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:56.623602 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6654fc9c8b-g7cwt" podUID="223f1f2a-88a3-45ca-a5e3-c1f7b4825a42" containerName="manager" containerID="cri-o://52d03e1cc8a467d3183a4cd438503d51e44ae4b06937abaed18179aab97b5f94" gracePeriod=10 Apr 20 15:13:56.873850 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:56.873780 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6654fc9c8b-g7cwt" Apr 20 15:13:56.896129 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:56.896097 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gd44\" (UniqueName: \"kubernetes.io/projected/223f1f2a-88a3-45ca-a5e3-c1f7b4825a42-kube-api-access-5gd44\") pod \"223f1f2a-88a3-45ca-a5e3-c1f7b4825a42\" (UID: \"223f1f2a-88a3-45ca-a5e3-c1f7b4825a42\") " Apr 20 15:13:56.898404 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:56.898374 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/223f1f2a-88a3-45ca-a5e3-c1f7b4825a42-kube-api-access-5gd44" (OuterVolumeSpecName: "kube-api-access-5gd44") pod "223f1f2a-88a3-45ca-a5e3-c1f7b4825a42" (UID: "223f1f2a-88a3-45ca-a5e3-c1f7b4825a42"). InnerVolumeSpecName "kube-api-access-5gd44". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:13:56.997108 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:56.997074 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5gd44\" (UniqueName: \"kubernetes.io/projected/223f1f2a-88a3-45ca-a5e3-c1f7b4825a42-kube-api-access-5gd44\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 20 15:13:57.419828 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:57.419789 2575 generic.go:358] "Generic (PLEG): container finished" podID="223f1f2a-88a3-45ca-a5e3-c1f7b4825a42" containerID="52d03e1cc8a467d3183a4cd438503d51e44ae4b06937abaed18179aab97b5f94" exitCode=0 Apr 20 15:13:57.420005 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:57.419845 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6654fc9c8b-g7cwt" event={"ID":"223f1f2a-88a3-45ca-a5e3-c1f7b4825a42","Type":"ContainerDied","Data":"52d03e1cc8a467d3183a4cd438503d51e44ae4b06937abaed18179aab97b5f94"} Apr 20 15:13:57.420005 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:57.419858 2575 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6654fc9c8b-g7cwt" Apr 20 15:13:57.420005 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:57.419879 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6654fc9c8b-g7cwt" event={"ID":"223f1f2a-88a3-45ca-a5e3-c1f7b4825a42","Type":"ContainerDied","Data":"bde6c4353521182d48786b923fa347df6583ecc4c34864ff0c5be7eeda5cd0bd"} Apr 20 15:13:57.420005 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:57.419900 2575 scope.go:117] "RemoveContainer" containerID="52d03e1cc8a467d3183a4cd438503d51e44ae4b06937abaed18179aab97b5f94" Apr 20 15:13:57.428207 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:57.428185 2575 scope.go:117] "RemoveContainer" containerID="52d03e1cc8a467d3183a4cd438503d51e44ae4b06937abaed18179aab97b5f94" Apr 20 15:13:57.428525 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:13:57.428495 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52d03e1cc8a467d3183a4cd438503d51e44ae4b06937abaed18179aab97b5f94\": container with ID starting with 52d03e1cc8a467d3183a4cd438503d51e44ae4b06937abaed18179aab97b5f94 not found: ID does not exist" containerID="52d03e1cc8a467d3183a4cd438503d51e44ae4b06937abaed18179aab97b5f94" Apr 20 15:13:57.428606 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:57.428534 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52d03e1cc8a467d3183a4cd438503d51e44ae4b06937abaed18179aab97b5f94"} err="failed to get container status \"52d03e1cc8a467d3183a4cd438503d51e44ae4b06937abaed18179aab97b5f94\": rpc error: code = NotFound desc = could not find container \"52d03e1cc8a467d3183a4cd438503d51e44ae4b06937abaed18179aab97b5f94\": container with ID starting with 52d03e1cc8a467d3183a4cd438503d51e44ae4b06937abaed18179aab97b5f94 not found: ID does not exist" Apr 20 15:13:57.436396 ip-10-0-133-198 kubenswrapper[2575]: I0420 
15:13:57.436363 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6654fc9c8b-g7cwt"] Apr 20 15:13:57.439927 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:57.439865 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6654fc9c8b-g7cwt"] Apr 20 15:13:59.182654 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:13:59.182617 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="223f1f2a-88a3-45ca-a5e3-c1f7b4825a42" path="/var/lib/kubelet/pods/223f1f2a-88a3-45ca-a5e3-c1f7b4825a42/volumes" Apr 20 15:14:03.191861 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:14:03.191827 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-6bb78778bd-974xh"] Apr 20 15:14:03.192236 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:14:03.192209 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="223f1f2a-88a3-45ca-a5e3-c1f7b4825a42" containerName="manager" Apr 20 15:14:03.192236 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:14:03.192219 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="223f1f2a-88a3-45ca-a5e3-c1f7b4825a42" containerName="manager" Apr 20 15:14:03.192236 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:14:03.192232 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32e3d05b-1538-4c62-94c7-5fd18f090bd4" containerName="manager" Apr 20 15:14:03.192367 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:14:03.192238 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="32e3d05b-1538-4c62-94c7-5fd18f090bd4" containerName="manager" Apr 20 15:14:03.192367 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:14:03.192303 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="32e3d05b-1538-4c62-94c7-5fd18f090bd4" containerName="manager" Apr 20 15:14:03.192367 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:14:03.192314 2575 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="223f1f2a-88a3-45ca-a5e3-c1f7b4825a42" containerName="manager" Apr 20 15:14:03.196542 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:14:03.196522 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-6bb78778bd-974xh" Apr 20 15:14:03.200210 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:14:03.200185 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 20 15:14:03.200210 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:14:03.200232 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 20 15:14:03.200210 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:14:03.200192 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-5m5vb\"" Apr 20 15:14:03.202593 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:14:03.202573 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-6bb78778bd-974xh"] Apr 20 15:14:03.255833 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:14:03.255799 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hp2z\" (UniqueName: \"kubernetes.io/projected/5f39468e-cab7-461a-9f30-65a1fd4628c6-kube-api-access-6hp2z\") pod \"maas-api-6bb78778bd-974xh\" (UID: \"5f39468e-cab7-461a-9f30-65a1fd4628c6\") " pod="opendatahub/maas-api-6bb78778bd-974xh" Apr 20 15:14:03.256002 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:14:03.255871 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/5f39468e-cab7-461a-9f30-65a1fd4628c6-maas-api-tls\") pod \"maas-api-6bb78778bd-974xh\" (UID: \"5f39468e-cab7-461a-9f30-65a1fd4628c6\") " pod="opendatahub/maas-api-6bb78778bd-974xh" Apr 20 15:14:03.357131 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:14:03.357094 
2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6hp2z\" (UniqueName: \"kubernetes.io/projected/5f39468e-cab7-461a-9f30-65a1fd4628c6-kube-api-access-6hp2z\") pod \"maas-api-6bb78778bd-974xh\" (UID: \"5f39468e-cab7-461a-9f30-65a1fd4628c6\") " pod="opendatahub/maas-api-6bb78778bd-974xh" Apr 20 15:14:03.357343 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:14:03.357183 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/5f39468e-cab7-461a-9f30-65a1fd4628c6-maas-api-tls\") pod \"maas-api-6bb78778bd-974xh\" (UID: \"5f39468e-cab7-461a-9f30-65a1fd4628c6\") " pod="opendatahub/maas-api-6bb78778bd-974xh" Apr 20 15:14:03.359616 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:14:03.359591 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/5f39468e-cab7-461a-9f30-65a1fd4628c6-maas-api-tls\") pod \"maas-api-6bb78778bd-974xh\" (UID: \"5f39468e-cab7-461a-9f30-65a1fd4628c6\") " pod="opendatahub/maas-api-6bb78778bd-974xh" Apr 20 15:14:03.364617 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:14:03.364593 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hp2z\" (UniqueName: \"kubernetes.io/projected/5f39468e-cab7-461a-9f30-65a1fd4628c6-kube-api-access-6hp2z\") pod \"maas-api-6bb78778bd-974xh\" (UID: \"5f39468e-cab7-461a-9f30-65a1fd4628c6\") " pod="opendatahub/maas-api-6bb78778bd-974xh" Apr 20 15:14:03.508920 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:14:03.508835 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-6bb78778bd-974xh" Apr 20 15:14:03.835790 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:14:03.835756 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-6bb78778bd-974xh"] Apr 20 15:14:03.839251 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:14:03.839220 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f39468e_cab7_461a_9f30_65a1fd4628c6.slice/crio-3ec88ae82c49f61a7b41b9489791cab828f6ba80f550f194e3afbeccbc69e6cf WatchSource:0}: Error finding container 3ec88ae82c49f61a7b41b9489791cab828f6ba80f550f194e3afbeccbc69e6cf: Status 404 returned error can't find the container with id 3ec88ae82c49f61a7b41b9489791cab828f6ba80f550f194e3afbeccbc69e6cf Apr 20 15:14:04.445138 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:14:04.445105 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6bb78778bd-974xh" event={"ID":"5f39468e-cab7-461a-9f30-65a1fd4628c6","Type":"ContainerStarted","Data":"3ec88ae82c49f61a7b41b9489791cab828f6ba80f550f194e3afbeccbc69e6cf"} Apr 20 15:14:06.453864 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:14:06.453782 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6bb78778bd-974xh" event={"ID":"5f39468e-cab7-461a-9f30-65a1fd4628c6","Type":"ContainerStarted","Data":"997a86ed73ff19523777a363eb05b31fa70fc2cf1409b68117d0535f8f5fe39b"} Apr 20 15:14:06.454209 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:14:06.453913 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-6bb78778bd-974xh" Apr 20 15:14:06.470586 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:14:06.470516 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-6bb78778bd-974xh" podStartSLOduration=1.532112579 podStartE2EDuration="3.470496297s" podCreationTimestamp="2026-04-20 15:14:03 +0000 UTC" 
firstStartedPulling="2026-04-20 15:14:03.84057674 +0000 UTC m=+701.213163427" lastFinishedPulling="2026-04-20 15:14:05.77896046 +0000 UTC m=+703.151547145" observedRunningTime="2026-04-20 15:14:06.469093473 +0000 UTC m=+703.841680190" watchObservedRunningTime="2026-04-20 15:14:06.470496297 +0000 UTC m=+703.843083000" Apr 20 15:14:12.462696 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:14:12.462666 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-6bb78778bd-974xh" Apr 20 15:16:04.195881 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:16:04.195845 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7d759bd859-rqrzj"] Apr 20 15:16:04.199414 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:16:04.199392 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7d759bd859-rqrzj" Apr 20 15:16:04.207085 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:16:04.207059 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7d759bd859-rqrzj"] Apr 20 15:16:04.303391 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:16:04.303359 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnnkr\" (UniqueName: \"kubernetes.io/projected/f72bef06-c4a6-4097-ba90-6d6985ab2168-kube-api-access-jnnkr\") pod \"authorino-7d759bd859-rqrzj\" (UID: \"f72bef06-c4a6-4097-ba90-6d6985ab2168\") " pod="kuadrant-system/authorino-7d759bd859-rqrzj" Apr 20 15:16:04.303391 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:16:04.303396 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/f72bef06-c4a6-4097-ba90-6d6985ab2168-tls-cert\") pod \"authorino-7d759bd859-rqrzj\" (UID: \"f72bef06-c4a6-4097-ba90-6d6985ab2168\") " pod="kuadrant-system/authorino-7d759bd859-rqrzj" Apr 20 15:16:04.404330 ip-10-0-133-198 
kubenswrapper[2575]: I0420 15:16:04.404295 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jnnkr\" (UniqueName: \"kubernetes.io/projected/f72bef06-c4a6-4097-ba90-6d6985ab2168-kube-api-access-jnnkr\") pod \"authorino-7d759bd859-rqrzj\" (UID: \"f72bef06-c4a6-4097-ba90-6d6985ab2168\") " pod="kuadrant-system/authorino-7d759bd859-rqrzj" Apr 20 15:16:04.404330 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:16:04.404337 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/f72bef06-c4a6-4097-ba90-6d6985ab2168-tls-cert\") pod \"authorino-7d759bd859-rqrzj\" (UID: \"f72bef06-c4a6-4097-ba90-6d6985ab2168\") " pod="kuadrant-system/authorino-7d759bd859-rqrzj" Apr 20 15:16:04.406644 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:16:04.406616 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/f72bef06-c4a6-4097-ba90-6d6985ab2168-tls-cert\") pod \"authorino-7d759bd859-rqrzj\" (UID: \"f72bef06-c4a6-4097-ba90-6d6985ab2168\") " pod="kuadrant-system/authorino-7d759bd859-rqrzj" Apr 20 15:16:04.413888 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:16:04.413868 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnnkr\" (UniqueName: \"kubernetes.io/projected/f72bef06-c4a6-4097-ba90-6d6985ab2168-kube-api-access-jnnkr\") pod \"authorino-7d759bd859-rqrzj\" (UID: \"f72bef06-c4a6-4097-ba90-6d6985ab2168\") " pod="kuadrant-system/authorino-7d759bd859-rqrzj" Apr 20 15:16:04.509150 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:16:04.509055 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7d759bd859-rqrzj" Apr 20 15:16:04.648189 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:16:04.648134 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7d759bd859-rqrzj"] Apr 20 15:16:04.650749 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:16:04.650720 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf72bef06_c4a6_4097_ba90_6d6985ab2168.slice/crio-d72eb398da1a417ca3dd8d6b4eb5c458afea439c9fac96b0d894dd6df1a0f50c WatchSource:0}: Error finding container d72eb398da1a417ca3dd8d6b4eb5c458afea439c9fac96b0d894dd6df1a0f50c: Status 404 returned error can't find the container with id d72eb398da1a417ca3dd8d6b4eb5c458afea439c9fac96b0d894dd6df1a0f50c Apr 20 15:16:04.652345 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:16:04.652325 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 15:16:04.863347 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:16:04.863311 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7d759bd859-rqrzj" event={"ID":"f72bef06-c4a6-4097-ba90-6d6985ab2168","Type":"ContainerStarted","Data":"d72eb398da1a417ca3dd8d6b4eb5c458afea439c9fac96b0d894dd6df1a0f50c"} Apr 20 15:16:05.868151 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:16:05.868096 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7d759bd859-rqrzj" event={"ID":"f72bef06-c4a6-4097-ba90-6d6985ab2168","Type":"ContainerStarted","Data":"554fd5fa427be00dab8826f17c467336014eccd32742827f301c4b794a410fbf"} Apr 20 15:16:05.888884 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:16:05.888835 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7d759bd859-rqrzj" podStartSLOduration=1.408884582 podStartE2EDuration="1.888820921s" podCreationTimestamp="2026-04-20 15:16:04 +0000 
UTC" firstStartedPulling="2026-04-20 15:16:04.652456921 +0000 UTC m=+822.025043603" lastFinishedPulling="2026-04-20 15:16:05.132393259 +0000 UTC m=+822.504979942" observedRunningTime="2026-04-20 15:16:05.886607617 +0000 UTC m=+823.259194313" watchObservedRunningTime="2026-04-20 15:16:05.888820921 +0000 UTC m=+823.261407690" Apr 20 15:16:05.919193 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:16:05.919162 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7c6cfd5998-d85mn"] Apr 20 15:16:05.919418 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:16:05.919394 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7c6cfd5998-d85mn" podUID="30475fd6-292b-4162-85fd-be084af09804" containerName="authorino" containerID="cri-o://e1c53d06968114dcbbe7e231d718d0c0ebc797291cdeb5d1fd73f07aea39bca7" gracePeriod=30 Apr 20 15:16:06.159732 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:16:06.159704 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7c6cfd5998-d85mn" Apr 20 15:16:06.220082 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:16:06.220051 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/30475fd6-292b-4162-85fd-be084af09804-tls-cert\") pod \"30475fd6-292b-4162-85fd-be084af09804\" (UID: \"30475fd6-292b-4162-85fd-be084af09804\") " Apr 20 15:16:06.220240 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:16:06.220100 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfvcb\" (UniqueName: \"kubernetes.io/projected/30475fd6-292b-4162-85fd-be084af09804-kube-api-access-tfvcb\") pod \"30475fd6-292b-4162-85fd-be084af09804\" (UID: \"30475fd6-292b-4162-85fd-be084af09804\") " Apr 20 15:16:06.222109 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:16:06.222072 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30475fd6-292b-4162-85fd-be084af09804-kube-api-access-tfvcb" (OuterVolumeSpecName: "kube-api-access-tfvcb") pod "30475fd6-292b-4162-85fd-be084af09804" (UID: "30475fd6-292b-4162-85fd-be084af09804"). InnerVolumeSpecName "kube-api-access-tfvcb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:16:06.230491 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:16:06.230465 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30475fd6-292b-4162-85fd-be084af09804-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "30475fd6-292b-4162-85fd-be084af09804" (UID: "30475fd6-292b-4162-85fd-be084af09804"). InnerVolumeSpecName "tls-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 15:16:06.321237 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:16:06.321200 2575 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/30475fd6-292b-4162-85fd-be084af09804-tls-cert\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 20 15:16:06.321237 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:16:06.321230 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tfvcb\" (UniqueName: \"kubernetes.io/projected/30475fd6-292b-4162-85fd-be084af09804-kube-api-access-tfvcb\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 20 15:16:06.873154 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:16:06.873114 2575 generic.go:358] "Generic (PLEG): container finished" podID="30475fd6-292b-4162-85fd-be084af09804" containerID="e1c53d06968114dcbbe7e231d718d0c0ebc797291cdeb5d1fd73f07aea39bca7" exitCode=0 Apr 20 15:16:06.873569 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:16:06.873167 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7c6cfd5998-d85mn" Apr 20 15:16:06.873569 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:16:06.873202 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7c6cfd5998-d85mn" event={"ID":"30475fd6-292b-4162-85fd-be084af09804","Type":"ContainerDied","Data":"e1c53d06968114dcbbe7e231d718d0c0ebc797291cdeb5d1fd73f07aea39bca7"} Apr 20 15:16:06.873569 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:16:06.873239 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7c6cfd5998-d85mn" event={"ID":"30475fd6-292b-4162-85fd-be084af09804","Type":"ContainerDied","Data":"7df2dd2b0599f9124fcf9a838ca34d82b223763bbe4ad264101e6a2498fd9f8c"} Apr 20 15:16:06.873569 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:16:06.873255 2575 scope.go:117] "RemoveContainer" containerID="e1c53d06968114dcbbe7e231d718d0c0ebc797291cdeb5d1fd73f07aea39bca7" Apr 20 15:16:06.881593 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:16:06.881575 2575 scope.go:117] "RemoveContainer" containerID="e1c53d06968114dcbbe7e231d718d0c0ebc797291cdeb5d1fd73f07aea39bca7" Apr 20 15:16:06.881857 ip-10-0-133-198 kubenswrapper[2575]: E0420 15:16:06.881836 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1c53d06968114dcbbe7e231d718d0c0ebc797291cdeb5d1fd73f07aea39bca7\": container with ID starting with e1c53d06968114dcbbe7e231d718d0c0ebc797291cdeb5d1fd73f07aea39bca7 not found: ID does not exist" containerID="e1c53d06968114dcbbe7e231d718d0c0ebc797291cdeb5d1fd73f07aea39bca7" Apr 20 15:16:06.881931 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:16:06.881864 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1c53d06968114dcbbe7e231d718d0c0ebc797291cdeb5d1fd73f07aea39bca7"} err="failed to get container status \"e1c53d06968114dcbbe7e231d718d0c0ebc797291cdeb5d1fd73f07aea39bca7\": rpc error: code = 
NotFound desc = could not find container \"e1c53d06968114dcbbe7e231d718d0c0ebc797291cdeb5d1fd73f07aea39bca7\": container with ID starting with e1c53d06968114dcbbe7e231d718d0c0ebc797291cdeb5d1fd73f07aea39bca7 not found: ID does not exist" Apr 20 15:16:06.901204 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:16:06.901180 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7c6cfd5998-d85mn"] Apr 20 15:16:06.905639 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:16:06.905620 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7c6cfd5998-d85mn"] Apr 20 15:16:07.182262 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:16:07.182181 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30475fd6-292b-4162-85fd-be084af09804" path="/var/lib/kubelet/pods/30475fd6-292b-4162-85fd-be084af09804/volumes" Apr 20 15:17:23.121066 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:17:23.121040 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pbfc9_ab96425d-f444-4af7-9052-1bddacccef53/console-operator/1.log" Apr 20 15:17:23.121588 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:17:23.121500 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pbfc9_ab96425d-f444-4af7-9052-1bddacccef53/console-operator/1.log" Apr 20 15:17:29.175730 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:17:29.175701 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-7458b9fb6c-nkj6g"] Apr 20 15:17:29.176188 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:17:29.176149 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30475fd6-292b-4162-85fd-be084af09804" containerName="authorino" Apr 20 15:17:29.176188 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:17:29.176168 2575 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="30475fd6-292b-4162-85fd-be084af09804" containerName="authorino" Apr 20 15:17:29.176336 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:17:29.176244 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="30475fd6-292b-4162-85fd-be084af09804" containerName="authorino" Apr 20 15:17:29.179350 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:17:29.179327 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7458b9fb6c-nkj6g" Apr 20 15:17:29.182909 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:17:29.182884 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-hs4dp\"" Apr 20 15:17:29.191210 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:17:29.191188 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7458b9fb6c-nkj6g"] Apr 20 15:17:29.278055 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:17:29.278021 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjm64\" (UniqueName: \"kubernetes.io/projected/83ac0ba7-fd23-4a2b-87c4-59878001bbc5-kube-api-access-kjm64\") pod \"maas-controller-7458b9fb6c-nkj6g\" (UID: \"83ac0ba7-fd23-4a2b-87c4-59878001bbc5\") " pod="opendatahub/maas-controller-7458b9fb6c-nkj6g" Apr 20 15:17:29.379419 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:17:29.379375 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kjm64\" (UniqueName: \"kubernetes.io/projected/83ac0ba7-fd23-4a2b-87c4-59878001bbc5-kube-api-access-kjm64\") pod \"maas-controller-7458b9fb6c-nkj6g\" (UID: \"83ac0ba7-fd23-4a2b-87c4-59878001bbc5\") " pod="opendatahub/maas-controller-7458b9fb6c-nkj6g" Apr 20 15:17:29.388558 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:17:29.388533 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjm64\" (UniqueName: 
\"kubernetes.io/projected/83ac0ba7-fd23-4a2b-87c4-59878001bbc5-kube-api-access-kjm64\") pod \"maas-controller-7458b9fb6c-nkj6g\" (UID: \"83ac0ba7-fd23-4a2b-87c4-59878001bbc5\") " pod="opendatahub/maas-controller-7458b9fb6c-nkj6g" Apr 20 15:17:29.490220 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:17:29.490134 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7458b9fb6c-nkj6g" Apr 20 15:17:29.616442 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:17:29.616415 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7458b9fb6c-nkj6g"] Apr 20 15:17:29.618398 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:17:29.618360 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83ac0ba7_fd23_4a2b_87c4_59878001bbc5.slice/crio-27991d6f48f448ef7f54957a2f41b08a6737666bb151bce22980ea4c2366d87f WatchSource:0}: Error finding container 27991d6f48f448ef7f54957a2f41b08a6737666bb151bce22980ea4c2366d87f: Status 404 returned error can't find the container with id 27991d6f48f448ef7f54957a2f41b08a6737666bb151bce22980ea4c2366d87f Apr 20 15:17:30.165464 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:17:30.165428 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7458b9fb6c-nkj6g" event={"ID":"83ac0ba7-fd23-4a2b-87c4-59878001bbc5","Type":"ContainerStarted","Data":"1c6a0c2f3e512d989a48d1776734d42c235c5fad7330afe3676c0fe894d3b3c2"} Apr 20 15:17:30.165464 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:17:30.165470 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7458b9fb6c-nkj6g" event={"ID":"83ac0ba7-fd23-4a2b-87c4-59878001bbc5","Type":"ContainerStarted","Data":"27991d6f48f448ef7f54957a2f41b08a6737666bb151bce22980ea4c2366d87f"} Apr 20 15:17:30.165676 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:17:30.165508 2575 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="opendatahub/maas-controller-7458b9fb6c-nkj6g" Apr 20 15:17:30.185353 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:17:30.185302 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-7458b9fb6c-nkj6g" podStartSLOduration=0.764950102 podStartE2EDuration="1.185259821s" podCreationTimestamp="2026-04-20 15:17:29 +0000 UTC" firstStartedPulling="2026-04-20 15:17:29.619797976 +0000 UTC m=+906.992384662" lastFinishedPulling="2026-04-20 15:17:30.040107698 +0000 UTC m=+907.412694381" observedRunningTime="2026-04-20 15:17:30.183207748 +0000 UTC m=+907.555794464" watchObservedRunningTime="2026-04-20 15:17:30.185259821 +0000 UTC m=+907.557846526" Apr 20 15:17:41.175787 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:17:41.175756 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-7458b9fb6c-nkj6g" Apr 20 15:22:23.146062 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:22:23.146036 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pbfc9_ab96425d-f444-4af7-9052-1bddacccef53/console-operator/1.log" Apr 20 15:22:23.148233 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:22:23.148207 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pbfc9_ab96425d-f444-4af7-9052-1bddacccef53/console-operator/1.log" Apr 20 15:27:23.171699 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:27:23.171658 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pbfc9_ab96425d-f444-4af7-9052-1bddacccef53/console-operator/1.log" Apr 20 15:27:23.174119 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:27:23.174093 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pbfc9_ab96425d-f444-4af7-9052-1bddacccef53/console-operator/1.log" Apr 20 15:32:23.203923 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:32:23.203892 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pbfc9_ab96425d-f444-4af7-9052-1bddacccef53/console-operator/1.log" Apr 20 15:32:23.206795 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:32:23.206773 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pbfc9_ab96425d-f444-4af7-9052-1bddacccef53/console-operator/1.log" Apr 20 15:37:23.228431 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:37:23.228399 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pbfc9_ab96425d-f444-4af7-9052-1bddacccef53/console-operator/1.log" Apr 20 15:37:23.232070 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:37:23.232048 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pbfc9_ab96425d-f444-4af7-9052-1bddacccef53/console-operator/1.log" Apr 20 15:38:23.935511 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:23.935429 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-7d759bd859-rqrzj_f72bef06-c4a6-4097-ba90-6d6985ab2168/authorino/0.log" Apr 20 15:38:27.901956 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:27.901905 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-6bb78778bd-974xh_5f39468e-cab7-461a-9f30-65a1fd4628c6/maas-api/0.log" Apr 20 15:38:28.021957 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:28.021900 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-7458b9fb6c-nkj6g_83ac0ba7-fd23-4a2b-87c4-59878001bbc5/manager/0.log" Apr 20 15:38:28.372651 ip-10-0-133-198 kubenswrapper[2575]: I0420 
15:38:28.372606 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6cdf6786d9-gc2mf_c3aa9a68-e4f8-494e-903a-b68cd8e83b71/manager/0.log" Apr 20 15:38:29.842333 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:29.842297 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-7d759bd859-rqrzj_f72bef06-c4a6-4097-ba90-6d6985ab2168/authorino/0.log" Apr 20 15:38:30.659842 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:30.659807 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-j6s5f_110c0249-fda7-4bfa-bfd5-a1e2c2a5d5c3/manager/0.log" Apr 20 15:38:31.341497 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:31.341463 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-7fcf5d587f-zx95c_bdc3ef4e-6631-4f08-8712-1d65faaf30c4/kube-auth-proxy/0.log" Apr 20 15:38:36.319574 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:36.319538 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-58p5q/must-gather-gq9k4"] Apr 20 15:38:36.323297 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:36.323263 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-58p5q/must-gather-gq9k4" Apr 20 15:38:36.325681 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:36.325657 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-58p5q\"/\"kube-root-ca.crt\"" Apr 20 15:38:36.326585 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:36.326569 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-58p5q\"/\"default-dockercfg-qgplg\"" Apr 20 15:38:36.326658 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:36.326608 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-58p5q\"/\"openshift-service-ca.crt\"" Apr 20 15:38:36.336433 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:36.336409 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-58p5q/must-gather-gq9k4"] Apr 20 15:38:36.354818 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:36.354788 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h6cz\" (UniqueName: \"kubernetes.io/projected/3deb4d8d-8699-42eb-9c28-a9b4a8177ebb-kube-api-access-8h6cz\") pod \"must-gather-gq9k4\" (UID: \"3deb4d8d-8699-42eb-9c28-a9b4a8177ebb\") " pod="openshift-must-gather-58p5q/must-gather-gq9k4" Apr 20 15:38:36.354996 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:36.354835 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3deb4d8d-8699-42eb-9c28-a9b4a8177ebb-must-gather-output\") pod \"must-gather-gq9k4\" (UID: \"3deb4d8d-8699-42eb-9c28-a9b4a8177ebb\") " pod="openshift-must-gather-58p5q/must-gather-gq9k4" Apr 20 15:38:36.455896 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:36.455859 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8h6cz\" (UniqueName: 
\"kubernetes.io/projected/3deb4d8d-8699-42eb-9c28-a9b4a8177ebb-kube-api-access-8h6cz\") pod \"must-gather-gq9k4\" (UID: \"3deb4d8d-8699-42eb-9c28-a9b4a8177ebb\") " pod="openshift-must-gather-58p5q/must-gather-gq9k4" Apr 20 15:38:36.456045 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:36.455912 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3deb4d8d-8699-42eb-9c28-a9b4a8177ebb-must-gather-output\") pod \"must-gather-gq9k4\" (UID: \"3deb4d8d-8699-42eb-9c28-a9b4a8177ebb\") " pod="openshift-must-gather-58p5q/must-gather-gq9k4" Apr 20 15:38:36.456220 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:36.456205 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3deb4d8d-8699-42eb-9c28-a9b4a8177ebb-must-gather-output\") pod \"must-gather-gq9k4\" (UID: \"3deb4d8d-8699-42eb-9c28-a9b4a8177ebb\") " pod="openshift-must-gather-58p5q/must-gather-gq9k4" Apr 20 15:38:36.463586 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:36.463565 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h6cz\" (UniqueName: \"kubernetes.io/projected/3deb4d8d-8699-42eb-9c28-a9b4a8177ebb-kube-api-access-8h6cz\") pod \"must-gather-gq9k4\" (UID: \"3deb4d8d-8699-42eb-9c28-a9b4a8177ebb\") " pod="openshift-must-gather-58p5q/must-gather-gq9k4" Apr 20 15:38:36.632909 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:36.632831 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-58p5q/must-gather-gq9k4" Apr 20 15:38:36.761680 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:36.761652 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-58p5q/must-gather-gq9k4"] Apr 20 15:38:36.764308 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:38:36.764246 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3deb4d8d_8699_42eb_9c28_a9b4a8177ebb.slice/crio-9d295c9ebcb2ef9feeb9fae7cdab6564e683a739d82c67ebbade6a1b0117a3fd WatchSource:0}: Error finding container 9d295c9ebcb2ef9feeb9fae7cdab6564e683a739d82c67ebbade6a1b0117a3fd: Status 404 returned error can't find the container with id 9d295c9ebcb2ef9feeb9fae7cdab6564e683a739d82c67ebbade6a1b0117a3fd Apr 20 15:38:36.766127 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:36.766109 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 15:38:37.523030 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:37.522986 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58p5q/must-gather-gq9k4" event={"ID":"3deb4d8d-8699-42eb-9c28-a9b4a8177ebb","Type":"ContainerStarted","Data":"9d295c9ebcb2ef9feeb9fae7cdab6564e683a739d82c67ebbade6a1b0117a3fd"} Apr 20 15:38:38.529392 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:38.529350 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58p5q/must-gather-gq9k4" event={"ID":"3deb4d8d-8699-42eb-9c28-a9b4a8177ebb","Type":"ContainerStarted","Data":"af0f8d39dcc703c33e60b76bfdb02fd7d09409ec49231259eae932decee0a5a2"} Apr 20 15:38:38.529860 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:38.529399 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58p5q/must-gather-gq9k4" 
event={"ID":"3deb4d8d-8699-42eb-9c28-a9b4a8177ebb","Type":"ContainerStarted","Data":"e5f2cfbf47d7aa020b04300f747c5879b4f4f133defec6d9a38437b435b58167"}
Apr 20 15:38:38.548179 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:38.546538 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-58p5q/must-gather-gq9k4" podStartSLOduration=1.694265472 podStartE2EDuration="2.546517068s" podCreationTimestamp="2026-04-20 15:38:36 +0000 UTC" firstStartedPulling="2026-04-20 15:38:36.766246652 +0000 UTC m=+2174.138833334" lastFinishedPulling="2026-04-20 15:38:37.618498245 +0000 UTC m=+2174.991084930" observedRunningTime="2026-04-20 15:38:38.544156832 +0000 UTC m=+2175.916743537" watchObservedRunningTime="2026-04-20 15:38:38.546517068 +0000 UTC m=+2175.919103774"
Apr 20 15:38:39.202601 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:39.202566 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-2nhxg_c7922475-7fe5-4f88-a1fd-a1bd0359f7c6/global-pull-secret-syncer/0.log"
Apr 20 15:38:39.376587 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:39.376544 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-t94xd_86bce86e-ed68-41f9-be80-acf37bfb646f/konnectivity-agent/0.log"
Apr 20 15:38:39.422850 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:39.422814 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-198.ec2.internal_4cf8424ac7d796a78f96e62791daed1d/haproxy/0.log"
Apr 20 15:38:44.148146 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:44.148009 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-7d759bd859-rqrzj_f72bef06-c4a6-4097-ba90-6d6985ab2168/authorino/0.log"
Apr 20 15:38:44.447788 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:44.447700 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-j6s5f_110c0249-fda7-4bfa-bfd5-a1e2c2a5d5c3/manager/0.log"
Apr 20 15:38:45.890153 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:45.890110 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f34f657e-9dd8-45c3-8621-d8e62ad289fc/alertmanager/0.log"
Apr 20 15:38:45.914180 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:45.914149 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f34f657e-9dd8-45c3-8621-d8e62ad289fc/config-reloader/0.log"
Apr 20 15:38:45.933072 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:45.933046 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f34f657e-9dd8-45c3-8621-d8e62ad289fc/kube-rbac-proxy-web/0.log"
Apr 20 15:38:45.953419 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:45.953390 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f34f657e-9dd8-45c3-8621-d8e62ad289fc/kube-rbac-proxy/0.log"
Apr 20 15:38:45.974676 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:45.974647 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f34f657e-9dd8-45c3-8621-d8e62ad289fc/kube-rbac-proxy-metric/0.log"
Apr 20 15:38:45.996023 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:45.995986 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f34f657e-9dd8-45c3-8621-d8e62ad289fc/prom-label-proxy/0.log"
Apr 20 15:38:46.017294 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:46.017201 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f34f657e-9dd8-45c3-8621-d8e62ad289fc/init-config-reloader/0.log"
Apr 20 15:38:46.054258 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:46.054227 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-j22cl_af67f37d-b332-44b7-8678-28aa45d26ed9/cluster-monitoring-operator/0.log"
Apr 20 15:38:46.088324 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:46.088288 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-7rp8b_0525c669-b74e-43d2-a613-4a95d5c51bdd/kube-state-metrics/0.log"
Apr 20 15:38:46.118463 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:46.118432 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-7rp8b_0525c669-b74e-43d2-a613-4a95d5c51bdd/kube-rbac-proxy-main/0.log"
Apr 20 15:38:46.140506 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:46.140473 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-7rp8b_0525c669-b74e-43d2-a613-4a95d5c51bdd/kube-rbac-proxy-self/0.log"
Apr 20 15:38:46.359989 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:46.359926 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gbvlq_cb4d0256-b738-4dd1-bf9b-d24347291878/node-exporter/0.log"
Apr 20 15:38:46.379362 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:46.379331 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gbvlq_cb4d0256-b738-4dd1-bf9b-d24347291878/kube-rbac-proxy/0.log"
Apr 20 15:38:46.403046 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:46.403018 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gbvlq_cb4d0256-b738-4dd1-bf9b-d24347291878/init-textfile/0.log"
Apr 20 15:38:46.426391 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:46.426361 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-jwgdc_b2e813cc-e62f-44d2-a6cb-ede8248e09ac/kube-rbac-proxy-main/0.log"
Apr 20 15:38:46.450062 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:46.450036 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-jwgdc_b2e813cc-e62f-44d2-a6cb-ede8248e09ac/kube-rbac-proxy-self/0.log"
Apr 20 15:38:46.472052 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:46.472020 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-jwgdc_b2e813cc-e62f-44d2-a6cb-ede8248e09ac/openshift-state-metrics/0.log"
Apr 20 15:38:46.504285 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:46.504243 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e66edc20-fc2f-47ec-b46e-a12689d026bb/prometheus/0.log"
Apr 20 15:38:46.522904 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:46.522878 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e66edc20-fc2f-47ec-b46e-a12689d026bb/config-reloader/0.log"
Apr 20 15:38:46.544699 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:46.544664 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e66edc20-fc2f-47ec-b46e-a12689d026bb/thanos-sidecar/0.log"
Apr 20 15:38:46.566409 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:46.566371 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e66edc20-fc2f-47ec-b46e-a12689d026bb/kube-rbac-proxy-web/0.log"
Apr 20 15:38:46.589453 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:46.589413 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e66edc20-fc2f-47ec-b46e-a12689d026bb/kube-rbac-proxy/0.log"
Apr 20 15:38:46.613989 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:46.613856 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e66edc20-fc2f-47ec-b46e-a12689d026bb/kube-rbac-proxy-thanos/0.log"
Apr 20 15:38:46.633663 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:46.633632 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e66edc20-fc2f-47ec-b46e-a12689d026bb/init-config-reloader/0.log"
Apr 20 15:38:46.731834 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:46.731801 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7c58c6bbd9-4gmg2_550f14b6-7142-4cc6-958e-fcf427dd9fe0/telemeter-client/0.log"
Apr 20 15:38:46.753347 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:46.753318 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7c58c6bbd9-4gmg2_550f14b6-7142-4cc6-958e-fcf427dd9fe0/reload/0.log"
Apr 20 15:38:46.773681 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:46.773647 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7c58c6bbd9-4gmg2_550f14b6-7142-4cc6-958e-fcf427dd9fe0/kube-rbac-proxy/0.log"
Apr 20 15:38:46.823262 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:46.823230 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7bd788d4ff-jmqxb_e9beb5ce-32fe-4694-9470-0cd472f5523d/thanos-query/0.log"
Apr 20 15:38:46.842729 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:46.842701 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7bd788d4ff-jmqxb_e9beb5ce-32fe-4694-9470-0cd472f5523d/kube-rbac-proxy-web/0.log"
Apr 20 15:38:46.865843 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:46.865766 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7bd788d4ff-jmqxb_e9beb5ce-32fe-4694-9470-0cd472f5523d/kube-rbac-proxy/0.log"
Apr 20 15:38:46.885938 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:46.885899 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7bd788d4ff-jmqxb_e9beb5ce-32fe-4694-9470-0cd472f5523d/prom-label-proxy/0.log"
Apr 20 15:38:46.906420 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:46.906391 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7bd788d4ff-jmqxb_e9beb5ce-32fe-4694-9470-0cd472f5523d/kube-rbac-proxy-rules/0.log"
Apr 20 15:38:46.927484 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:46.927455 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7bd788d4ff-jmqxb_e9beb5ce-32fe-4694-9470-0cd472f5523d/kube-rbac-proxy-metrics/0.log"
Apr 20 15:38:47.871676 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:47.871638 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-58p5q/perf-node-gather-daemonset-2v58p"]
Apr 20 15:38:47.879156 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:47.879120 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-2v58p"
Apr 20 15:38:47.881511 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:47.881479 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-58p5q/perf-node-gather-daemonset-2v58p"]
Apr 20 15:38:47.979288 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:47.979234 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfm2s\" (UniqueName: \"kubernetes.io/projected/5564f238-a42f-43d1-b751-3f6e7ba1fe56-kube-api-access-mfm2s\") pod \"perf-node-gather-daemonset-2v58p\" (UID: \"5564f238-a42f-43d1-b751-3f6e7ba1fe56\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-2v58p"
Apr 20 15:38:47.979861 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:47.979832 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5564f238-a42f-43d1-b751-3f6e7ba1fe56-proc\") pod \"perf-node-gather-daemonset-2v58p\" (UID: \"5564f238-a42f-43d1-b751-3f6e7ba1fe56\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-2v58p"
Apr 20 15:38:47.980125 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:47.980104 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5564f238-a42f-43d1-b751-3f6e7ba1fe56-podres\") pod \"perf-node-gather-daemonset-2v58p\" (UID: \"5564f238-a42f-43d1-b751-3f6e7ba1fe56\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-2v58p"
Apr 20 15:38:47.980263 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:47.980244 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5564f238-a42f-43d1-b751-3f6e7ba1fe56-lib-modules\") pod \"perf-node-gather-daemonset-2v58p\" (UID: \"5564f238-a42f-43d1-b751-3f6e7ba1fe56\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-2v58p"
Apr 20 15:38:47.980428 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:47.980412 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5564f238-a42f-43d1-b751-3f6e7ba1fe56-sys\") pod \"perf-node-gather-daemonset-2v58p\" (UID: \"5564f238-a42f-43d1-b751-3f6e7ba1fe56\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-2v58p"
Apr 20 15:38:48.081549 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:48.081505 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5564f238-a42f-43d1-b751-3f6e7ba1fe56-proc\") pod \"perf-node-gather-daemonset-2v58p\" (UID: \"5564f238-a42f-43d1-b751-3f6e7ba1fe56\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-2v58p"
Apr 20 15:38:48.081739 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:48.081613 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5564f238-a42f-43d1-b751-3f6e7ba1fe56-podres\") pod \"perf-node-gather-daemonset-2v58p\" (UID: \"5564f238-a42f-43d1-b751-3f6e7ba1fe56\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-2v58p"
Apr 20 15:38:48.081739 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:48.081643 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5564f238-a42f-43d1-b751-3f6e7ba1fe56-lib-modules\") pod \"perf-node-gather-daemonset-2v58p\" (UID: \"5564f238-a42f-43d1-b751-3f6e7ba1fe56\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-2v58p"
Apr 20 15:38:48.081739 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:48.081666 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5564f238-a42f-43d1-b751-3f6e7ba1fe56-proc\") pod \"perf-node-gather-daemonset-2v58p\" (UID: \"5564f238-a42f-43d1-b751-3f6e7ba1fe56\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-2v58p"
Apr 20 15:38:48.081739 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:48.081689 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5564f238-a42f-43d1-b751-3f6e7ba1fe56-sys\") pod \"perf-node-gather-daemonset-2v58p\" (UID: \"5564f238-a42f-43d1-b751-3f6e7ba1fe56\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-2v58p"
Apr 20 15:38:48.081739 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:48.081739 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5564f238-a42f-43d1-b751-3f6e7ba1fe56-sys\") pod \"perf-node-gather-daemonset-2v58p\" (UID: \"5564f238-a42f-43d1-b751-3f6e7ba1fe56\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-2v58p"
Apr 20 15:38:48.081957 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:48.081781 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mfm2s\" (UniqueName: \"kubernetes.io/projected/5564f238-a42f-43d1-b751-3f6e7ba1fe56-kube-api-access-mfm2s\") pod \"perf-node-gather-daemonset-2v58p\" (UID: \"5564f238-a42f-43d1-b751-3f6e7ba1fe56\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-2v58p"
Apr 20 15:38:48.081957 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:48.081799 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5564f238-a42f-43d1-b751-3f6e7ba1fe56-podres\") pod \"perf-node-gather-daemonset-2v58p\" (UID: \"5564f238-a42f-43d1-b751-3f6e7ba1fe56\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-2v58p"
Apr 20 15:38:48.081957 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:48.081840 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5564f238-a42f-43d1-b751-3f6e7ba1fe56-lib-modules\") pod \"perf-node-gather-daemonset-2v58p\" (UID: \"5564f238-a42f-43d1-b751-3f6e7ba1fe56\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-2v58p"
Apr 20 15:38:48.089556 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:48.089528 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfm2s\" (UniqueName: \"kubernetes.io/projected/5564f238-a42f-43d1-b751-3f6e7ba1fe56-kube-api-access-mfm2s\") pod \"perf-node-gather-daemonset-2v58p\" (UID: \"5564f238-a42f-43d1-b751-3f6e7ba1fe56\") " pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-2v58p"
Apr 20 15:38:48.194587 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:48.194499 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-2v58p"
Apr 20 15:38:48.368071 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:48.368025 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-58p5q/perf-node-gather-daemonset-2v58p"]
Apr 20 15:38:48.374583 ip-10-0-133-198 kubenswrapper[2575]: W0420 15:38:48.374544 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5564f238_a42f_43d1_b751_3f6e7ba1fe56.slice/crio-fc5f352a55a5ad5c992cf15a394c4ddfdd0d85d4f51e723b8aec4c53e5a3d4bf WatchSource:0}: Error finding container fc5f352a55a5ad5c992cf15a394c4ddfdd0d85d4f51e723b8aec4c53e5a3d4bf: Status 404 returned error can't find the container with id fc5f352a55a5ad5c992cf15a394c4ddfdd0d85d4f51e723b8aec4c53e5a3d4bf
Apr 20 15:38:48.470347 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:48.470316 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pbfc9_ab96425d-f444-4af7-9052-1bddacccef53/console-operator/1.log"
Apr 20 15:38:48.483785 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:48.483731 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pbfc9_ab96425d-f444-4af7-9052-1bddacccef53/console-operator/2.log"
Apr 20 15:38:48.573965 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:48.573918 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-2v58p" event={"ID":"5564f238-a42f-43d1-b751-3f6e7ba1fe56","Type":"ContainerStarted","Data":"94bc65af8b4b47db2559f351619ec008749d1446f9db8e288f572552ed3e9d29"}
Apr 20 15:38:48.573965 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:48.573970 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-2v58p" event={"ID":"5564f238-a42f-43d1-b751-3f6e7ba1fe56","Type":"ContainerStarted","Data":"fc5f352a55a5ad5c992cf15a394c4ddfdd0d85d4f51e723b8aec4c53e5a3d4bf"}
Apr 20 15:38:48.574852 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:48.574824 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-2v58p"
Apr 20 15:38:48.592537 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:48.592488 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-2v58p" podStartSLOduration=1.592473391 podStartE2EDuration="1.592473391s" podCreationTimestamp="2026-04-20 15:38:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:38:48.589486637 +0000 UTC m=+2185.962073344" watchObservedRunningTime="2026-04-20 15:38:48.592473391 +0000 UTC m=+2185.965060099"
Apr 20 15:38:50.328577 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:50.328545 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xp9tj_bfea901a-e1df-46c7-b211-b94f978562b5/dns/0.log"
Apr 20 15:38:50.347683 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:50.347655 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xp9tj_bfea901a-e1df-46c7-b211-b94f978562b5/kube-rbac-proxy/0.log"
Apr 20 15:38:50.412603 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:50.412572 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-wfw5f_2e79beec-c5db-41c7-a60e-c759696b1d60/dns-node-resolver/0.log"
Apr 20 15:38:50.881789 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:50.881757 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-flrxv_86824264-d16e-4d82-854b-f1f5bc86483c/node-ca/0.log"
Apr 20 15:38:51.862869 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:51.862833 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-7fcf5d587f-zx95c_bdc3ef4e-6631-4f08-8712-1d65faaf30c4/kube-auth-proxy/0.log"
Apr 20 15:38:52.472191 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:52.472151 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-gcg7h_cfa73a34-d39a-4a89-b936-de5c6399f787/serve-healthcheck-canary/0.log"
Apr 20 15:38:52.945005 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:52.944979 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6jz9t_486a1f57-c24b-4698-b65e-1c79387c2c19/kube-rbac-proxy/0.log"
Apr 20 15:38:52.964168 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:52.964121 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6jz9t_486a1f57-c24b-4698-b65e-1c79387c2c19/exporter/0.log"
Apr 20 15:38:52.985504 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:52.985480 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6jz9t_486a1f57-c24b-4698-b65e-1c79387c2c19/extractor/0.log"
Apr 20 15:38:54.948145 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:54.948115 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-6bb78778bd-974xh_5f39468e-cab7-461a-9f30-65a1fd4628c6/maas-api/0.log"
Apr 20 15:38:55.031731 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:55.031693 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-7458b9fb6c-nkj6g_83ac0ba7-fd23-4a2b-87c4-59878001bbc5/manager/0.log"
Apr 20 15:38:55.108107 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:55.108072 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6cdf6786d9-gc2mf_c3aa9a68-e4f8-494e-903a-b68cd8e83b71/manager/0.log"
Apr 20 15:38:55.594835 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:55.594806 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-58p5q/perf-node-gather-daemonset-2v58p"
Apr 20 15:38:56.356971 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:38:56.356944 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-6b8584f779-8g8x7_0c749bc3-dcb0-4ba1-80b3-819ef3a7f61d/manager/0.log"
Apr 20 15:39:01.072308 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:39:01.072262 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-8489c_d6a53b42-a8ae-4454-b200-47881e42577a/kube-storage-version-migrator-operator/1.log"
Apr 20 15:39:01.074095 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:39:01.074061 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-8489c_d6a53b42-a8ae-4454-b200-47881e42577a/kube-storage-version-migrator-operator/0.log"
Apr 20 15:39:01.991359 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:39:01.991331 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5mqkf_68cb8276-452d-4742-adc2-9eb67152cc05/kube-multus-additional-cni-plugins/0.log"
Apr 20 15:39:02.013963 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:39:02.013922 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5mqkf_68cb8276-452d-4742-adc2-9eb67152cc05/egress-router-binary-copy/0.log"
Apr 20 15:39:02.033447 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:39:02.033412 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5mqkf_68cb8276-452d-4742-adc2-9eb67152cc05/cni-plugins/0.log"
Apr 20 15:39:02.052725 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:39:02.052681 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5mqkf_68cb8276-452d-4742-adc2-9eb67152cc05/bond-cni-plugin/0.log"
Apr 20 15:39:02.072801 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:39:02.072776 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5mqkf_68cb8276-452d-4742-adc2-9eb67152cc05/routeoverride-cni/0.log"
Apr 20 15:39:02.093246 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:39:02.093216 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5mqkf_68cb8276-452d-4742-adc2-9eb67152cc05/whereabouts-cni-bincopy/0.log"
Apr 20 15:39:02.112794 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:39:02.112765 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5mqkf_68cb8276-452d-4742-adc2-9eb67152cc05/whereabouts-cni/0.log"
Apr 20 15:39:02.586208 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:39:02.586171 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-z9bz2_6318e638-1067-4e23-94f4-dad4de00297a/kube-multus/0.log"
Apr 20 15:39:02.660812 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:39:02.660781 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-lnqrj_ee584f46-b9aa-46b2-a060-01c6f4e256e9/network-metrics-daemon/0.log"
Apr 20 15:39:02.681407 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:39:02.681367 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-lnqrj_ee584f46-b9aa-46b2-a060-01c6f4e256e9/kube-rbac-proxy/0.log"
Apr 20 15:39:03.965571 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:39:03.965542 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sdgqx_6d67f43f-b926-4199-ae7a-ccf686190d9b/ovn-controller/0.log"
Apr 20 15:39:04.004883 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:39:04.004847 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sdgqx_6d67f43f-b926-4199-ae7a-ccf686190d9b/ovn-acl-logging/0.log"
Apr 20 15:39:04.030373 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:39:04.030335 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sdgqx_6d67f43f-b926-4199-ae7a-ccf686190d9b/kube-rbac-proxy-node/0.log"
Apr 20 15:39:04.054762 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:39:04.054722 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sdgqx_6d67f43f-b926-4199-ae7a-ccf686190d9b/kube-rbac-proxy-ovn-metrics/0.log"
Apr 20 15:39:04.075210 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:39:04.075171 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sdgqx_6d67f43f-b926-4199-ae7a-ccf686190d9b/northd/0.log"
Apr 20 15:39:04.107822 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:39:04.107791 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sdgqx_6d67f43f-b926-4199-ae7a-ccf686190d9b/nbdb/0.log"
Apr 20 15:39:04.148952 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:39:04.148925 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sdgqx_6d67f43f-b926-4199-ae7a-ccf686190d9b/sbdb/0.log"
Apr 20 15:39:04.356336 ip-10-0-133-198 kubenswrapper[2575]: I0420 15:39:04.356301 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sdgqx_6d67f43f-b926-4199-ae7a-ccf686190d9b/ovnkube-controller/0.log"