Apr 16 23:47:47.472934 ip-10-0-128-98 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 23:47:47.472948 ip-10-0-128-98 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 23:47:47.472958 ip-10-0-128-98 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 23:47:47.473326 ip-10-0-128-98 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 23:47:57.569497 ip-10-0-128-98 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 23:47:57.569513 ip-10-0-128-98 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 6506f847f8934661a797e7d5ddfa285b --
Apr 16 23:50:27.665438 ip-10-0-128-98 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 23:50:28.129501 ip-10-0-128-98 kubenswrapper[2562]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 23:50:28.129501 ip-10-0-128-98 kubenswrapper[2562]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 23:50:28.129501 ip-10-0-128-98 kubenswrapper[2562]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 23:50:28.129501 ip-10-0-128-98 kubenswrapper[2562]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 23:50:28.129501 ip-10-0-128-98 kubenswrapper[2562]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 23:50:28.130217 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.130122 2562 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 23:50:28.132461 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132447 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 23:50:28.132498 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132461 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 23:50:28.132498 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132466 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 23:50:28.132498 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132469 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 23:50:28.132498 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132472 2562 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 23:50:28.132498 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132476 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 23:50:28.132498 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132480 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 23:50:28.132498 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132483 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 23:50:28.132498 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132486 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 23:50:28.132498 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132489 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 23:50:28.132498 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132492 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 23:50:28.132498 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132496 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 23:50:28.132498 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132501 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 23:50:28.132791 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132506 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 23:50:28.132791 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132510 2562 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 23:50:28.132791 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132514 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 23:50:28.132791 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132522 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 23:50:28.132791 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132527 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 23:50:28.132791 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132531 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 23:50:28.132791 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132534 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 23:50:28.132791 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132537 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 23:50:28.132791 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132540 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 23:50:28.132791 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132543 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 23:50:28.132791 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132545 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 23:50:28.132791 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132548 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 23:50:28.132791 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132551 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 23:50:28.132791 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132553 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 23:50:28.132791 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132556 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 23:50:28.132791 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132558 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 23:50:28.132791 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132561 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 23:50:28.132791 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132563 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 23:50:28.132791 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132566 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 23:50:28.132791 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132568 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 23:50:28.133280 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132571 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 23:50:28.133280 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132574 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 23:50:28.133280 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132576 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 23:50:28.133280 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132579 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 23:50:28.133280 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132581 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 23:50:28.133280 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132584 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 23:50:28.133280 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132587 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 23:50:28.133280 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132589 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 23:50:28.133280 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132592 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 23:50:28.133280 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132595 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 23:50:28.133280 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132597 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 23:50:28.133280 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132599 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 23:50:28.133280 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132602 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 23:50:28.133280 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132606 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 23:50:28.133280 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132609 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 23:50:28.133280 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132612 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 23:50:28.133280 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132617 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 23:50:28.133280 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132619 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 23:50:28.133280 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132622 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 23:50:28.133280 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132625 2562 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 23:50:28.133775 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132628 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 23:50:28.133775 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132631 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 23:50:28.133775 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132633 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 23:50:28.133775 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132636 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 23:50:28.133775 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132638 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 23:50:28.133775 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132641 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 23:50:28.133775 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132644 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 23:50:28.133775 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132646 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 23:50:28.133775 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132649 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 23:50:28.133775 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132651 2562 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 23:50:28.133775 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132654 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 23:50:28.133775 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132656 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 23:50:28.133775 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132659 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 23:50:28.133775 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132661 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 23:50:28.133775 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132664 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 23:50:28.133775 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132676 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 23:50:28.133775 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132679 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 23:50:28.133775 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132682 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 23:50:28.133775 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132684 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 23:50:28.134279 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132687 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 23:50:28.134279 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132691 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 23:50:28.134279 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132694 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 23:50:28.134279 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132698 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 23:50:28.134279 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132703 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 23:50:28.134279 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132706 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 23:50:28.134279 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132709 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 23:50:28.134279 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132712 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 23:50:28.134279 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132715 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 23:50:28.134279 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132718 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 23:50:28.134279 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132720 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 23:50:28.134279 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132723 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 23:50:28.134279 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132725 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 23:50:28.134279 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.132728 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 23:50:28.134279 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133096 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 23:50:28.134279 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133101 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 23:50:28.134279 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133104 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 23:50:28.134279 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133107 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 23:50:28.134279 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133110 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 23:50:28.134279 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133113 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 23:50:28.134757 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133116 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 23:50:28.134757 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133118 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 23:50:28.134757 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133121 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 23:50:28.134757 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133123 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 23:50:28.134757 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133126 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 23:50:28.134757 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133129 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 23:50:28.134757 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133132 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 23:50:28.134757 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133136 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 23:50:28.134757 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133139 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 23:50:28.134757 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133142 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 23:50:28.134757 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133145 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 23:50:28.134757 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133147 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 23:50:28.134757 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133150 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 23:50:28.134757 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133153 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 23:50:28.134757 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133156 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 23:50:28.134757 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133159 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 23:50:28.134757 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133161 2562 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 23:50:28.134757 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133164 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 23:50:28.134757 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133167 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 23:50:28.135248 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133170 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 23:50:28.135248 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133173 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 23:50:28.135248 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133176 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 23:50:28.135248 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133178 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 23:50:28.135248 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133181 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 23:50:28.135248 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133183 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 23:50:28.135248 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133199 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 23:50:28.135248 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133202 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 23:50:28.135248 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133205 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 23:50:28.135248 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133207 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 23:50:28.135248 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133210 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 23:50:28.135248 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133213 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 23:50:28.135248 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133215 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 23:50:28.135248 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133218 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 23:50:28.135248 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133220 2562 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 23:50:28.135248 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133223 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 23:50:28.135248 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133226 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 23:50:28.135248 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133228 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 23:50:28.135248 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133231 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 23:50:28.135248 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133233 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 23:50:28.135749 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133236 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 23:50:28.135749 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133239 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 23:50:28.135749 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133241 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 23:50:28.135749 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133244 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 23:50:28.135749 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133246 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 23:50:28.135749 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133249 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 23:50:28.135749 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133253 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 23:50:28.135749 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133255 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 23:50:28.135749 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133258 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 23:50:28.135749 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133260 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 23:50:28.135749 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133263 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 23:50:28.135749 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133266 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 23:50:28.135749 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133269 2562 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 23:50:28.135749 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133271 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 23:50:28.135749 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133274 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 23:50:28.135749 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133276 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 23:50:28.135749 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133279 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 23:50:28.135749 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133281 2562 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 23:50:28.135749 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133285 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 23:50:28.135749 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133288 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 23:50:28.136257 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133291 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 23:50:28.136257 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133293 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 23:50:28.136257 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133296 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 23:50:28.136257 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133299 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 23:50:28.136257 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133302 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 23:50:28.136257 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133304 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 23:50:28.136257 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133307 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 23:50:28.136257 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133309 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 23:50:28.136257 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133312 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 23:50:28.136257 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133315 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 23:50:28.136257 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133317 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 23:50:28.136257 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133320 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 23:50:28.136257 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133323 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 23:50:28.136257 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133328 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 23:50:28.136257 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133331 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 23:50:28.136257 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133333 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 23:50:28.136257 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133336 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 23:50:28.136257 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133341 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 23:50:28.136257 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133344 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 23:50:28.136737 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133348 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 23:50:28.136737 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.133351 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 23:50:28.136737 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.133952 2562 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 23:50:28.136737 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.133964 2562 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 23:50:28.136737 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.133972 2562 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 23:50:28.136737 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.133976 2562 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 23:50:28.136737 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.133981 2562 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 23:50:28.136737 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.133984 2562 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 23:50:28.136737 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.133989 2562 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 23:50:28.136737 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.133993 2562 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 23:50:28.136737 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.133996 2562 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 23:50:28.136737 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.133999 2562 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 23:50:28.136737 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134002 2562 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 23:50:28.136737 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134005 2562 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 23:50:28.136737 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134008 2562 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 23:50:28.136737 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134011 2562 flags.go:64] FLAG: --cgroup-root=""
Apr 16 23:50:28.136737 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134015 2562 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 23:50:28.136737 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134018 2562 flags.go:64] FLAG: --client-ca-file=""
Apr 16 23:50:28.136737 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134021 2562 flags.go:64] FLAG: --cloud-config=""
Apr 16 23:50:28.136737 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134024 2562 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 23:50:28.136737 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134027 2562 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 23:50:28.136737 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134031 2562 flags.go:64] FLAG: --cluster-domain=""
Apr 16 23:50:28.136737 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134034 2562 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 23:50:28.136737 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134037 2562 flags.go:64] FLAG: --config-dir=""
Apr 16 23:50:28.137343 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134040 2562 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 23:50:28.137343 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134043 2562 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 23:50:28.137343 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134047 2562 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 23:50:28.137343 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134050 2562 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 23:50:28.137343 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134053 2562 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 23:50:28.137343 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134056 2562 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 23:50:28.137343 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134059 2562 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 23:50:28.137343 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134062 2562 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 23:50:28.137343 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134065 2562 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 23:50:28.137343 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134068 2562 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 23:50:28.137343 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134071 2562 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 23:50:28.137343 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134075 2562 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 23:50:28.137343 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134078 2562 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 23:50:28.137343 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134081 2562 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 23:50:28.137343 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134084 2562 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 23:50:28.137343 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134087 2562 flags.go:64] FLAG: --enable-server="true"
Apr 16 23:50:28.137343 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134090 2562 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 23:50:28.137343 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134094 2562 flags.go:64] FLAG: --event-burst="100"
Apr 16 23:50:28.137343 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134097 2562 flags.go:64] FLAG: --event-qps="50"
Apr 16 23:50:28.137343 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134100 2562 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 23:50:28.137343 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134103 2562 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 23:50:28.137343 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134106 2562 flags.go:64] FLAG: --eviction-hard=""
Apr 16 23:50:28.137343 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134110 2562 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 23:50:28.137343 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134113 2562 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 23:50:28.137916 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134116 2562 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 23:50:28.137916 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134119 2562 flags.go:64] FLAG: --eviction-soft=""
Apr 16 23:50:28.137916 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134122 2562 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 23:50:28.137916 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134125 2562 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 23:50:28.137916 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134128 2562 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 23:50:28.137916 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134131 2562 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 23:50:28.137916 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134134 2562 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 23:50:28.137916 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134137 2562 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 23:50:28.137916 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134139 2562 flags.go:64] FLAG: --feature-gates=""
Apr 16 23:50:28.137916 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134143 2562 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 23:50:28.137916 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134146 2562 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 23:50:28.137916 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134149 2562 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 23:50:28.137916 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134152 2562 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 23:50:28.137916 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134156 2562 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 23:50:28.137916 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134159 2562 flags.go:64] FLAG: --help="false"
Apr 16 23:50:28.137916 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134161 2562 flags.go:64] FLAG: --hostname-override="ip-10-0-128-98.ec2.internal"
Apr 16 23:50:28.137916 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134164 2562 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 23:50:28.137916 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134167 2562 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 16 23:50:28.137916 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134170 2562 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 16 23:50:28.137916 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134173 2562 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 16 23:50:28.137916 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134177 2562 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 16 23:50:28.137916 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134180 2562 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 16 23:50:28.137916 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134183 2562 flags.go:64] FLAG: --image-service-endpoint=""
Apr 16 23:50:28.138485 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134185 2562 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 16 23:50:28.138485 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134205 2562 flags.go:64] FLAG: --kube-api-burst="100"
Apr 16 23:50:28.138485 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134210 2562 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 16 23:50:28.138485 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134215 2562 flags.go:64] FLAG: --kube-api-qps="50"
Apr 16 23:50:28.138485 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134219 2562 flags.go:64] FLAG: --kube-reserved=""
Apr 16 23:50:28.138485 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134222 2562 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 16 23:50:28.138485 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134225 2562 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 16 23:50:28.138485 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134228 2562 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 16 23:50:28.138485 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134231 2562 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 16 23:50:28.138485 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134236 2562 flags.go:64] FLAG: --lock-file=""
Apr 16 23:50:28.138485 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134239 2562 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 16 23:50:28.138485 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134242 2562 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 16 23:50:28.138485 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134245 2562 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 16 23:50:28.138485 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134251 2562 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 16 23:50:28.138485 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134253 2562 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 16 23:50:28.138485 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134256 2562 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 16 23:50:28.138485 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134259 2562 flags.go:64] FLAG: --logging-format="text"
Apr 16 23:50:28.138485 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134262 2562 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 16 23:50:28.138485 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134266 2562 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 16 23:50:28.138485 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134269 2562 flags.go:64] FLAG: --manifest-url=""
Apr 16 23:50:28.138485 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134271 2562 flags.go:64] FLAG: --manifest-url-header=""
Apr 16 23:50:28.138485 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134276 2562 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 16 23:50:28.138485 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134279 2562 flags.go:64] FLAG: --max-open-files="1000000"
Apr 16 23:50:28.138485 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134284 2562 flags.go:64] FLAG: --max-pods="110"
Apr 16 23:50:28.138485 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134287 2562 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 16 23:50:28.139112 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134290 2562 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 16 23:50:28.139112 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134294 2562 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 16 23:50:28.139112 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134297 2562 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 16 23:50:28.139112 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134300 2562 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 16 23:50:28.139112 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134303 2562 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 16 23:50:28.139112 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134306 2562 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 16 23:50:28.139112 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134313 2562 flags.go:64] FLAG: --node-status-max-images="50"
Apr 16 23:50:28.139112 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134316 2562 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 16 23:50:28.139112 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134319 2562 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 16 23:50:28.139112 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134322 2562 flags.go:64] FLAG: --pod-cidr=""
Apr 16 23:50:28.139112 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134325 2562 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 16 23:50:28.139112 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134330 2562 flags.go:64] FLAG: --pod-manifest-path=""
Apr 16 23:50:28.139112 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134333 2562 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 16 23:50:28.139112 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134337 2562 flags.go:64] FLAG: --pods-per-core="0"
Apr 16 23:50:28.139112 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134340 2562 flags.go:64] FLAG: --port="10250"
Apr 16 23:50:28.139112 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134343 2562 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 16 23:50:28.139112 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134347 2562 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0f728a9090009429c"
Apr 16 23:50:28.139112 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134350 2562 flags.go:64] FLAG: --qos-reserved=""
Apr 16 23:50:28.139112 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134353 2562 flags.go:64] FLAG: --read-only-port="10255"
Apr 16 23:50:28.139112 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134356 2562 flags.go:64] FLAG: --register-node="true"
Apr 16 23:50:28.139112 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134359 2562 flags.go:64] FLAG: --register-schedulable="true"
Apr 16 23:50:28.139112 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134362 2562 flags.go:64] FLAG: --register-with-taints=""
Apr 16 23:50:28.139112 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134365 2562 flags.go:64] FLAG: --registry-burst="10"
Apr 16 23:50:28.139112 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134368 2562 flags.go:64] FLAG: --registry-qps="5"
Apr 16 23:50:28.139733 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134371 2562 flags.go:64] FLAG: --reserved-cpus=""
Apr 16 23:50:28.139733 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134373 2562 flags.go:64] FLAG: --reserved-memory=""
Apr 16 23:50:28.139733 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134377 2562 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 16 23:50:28.139733 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134380 2562 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 16 23:50:28.139733 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134383 2562 flags.go:64] FLAG: --rotate-certificates="false"
Apr 16 23:50:28.139733 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134386 2562 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 16 23:50:28.139733 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134389 2562 flags.go:64] FLAG: --runonce="false"
Apr 16 23:50:28.139733 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134392 2562 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 16 23:50:28.139733 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134395 2562 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 16 23:50:28.139733 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134398 2562 flags.go:64] FLAG: --seccomp-default="false"
Apr 16 23:50:28.139733 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134401 2562 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 16 23:50:28.139733 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134404 2562 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 16 23:50:28.139733 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134407 2562 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 16 23:50:28.139733 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134414 2562 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 16 23:50:28.139733 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134417 2562 flags.go:64] FLAG: --storage-driver-password="root"
Apr 16 23:50:28.139733 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134420 2562 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 16 23:50:28.139733 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134423 2562 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 16 23:50:28.139733 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134425 2562 flags.go:64] FLAG: --storage-driver-user="root"
Apr 16 23:50:28.139733 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134428 2562 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 16 23:50:28.139733 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134431 2562 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 16 23:50:28.139733 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134434 2562 flags.go:64] FLAG: --system-cgroups=""
Apr 16 23:50:28.139733 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134437 2562 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 16 23:50:28.139733 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134443 2562 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 16 23:50:28.139733 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134445 2562 flags.go:64] FLAG: --tls-cert-file=""
Apr 16 23:50:28.139733 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134449 2562 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 16 23:50:28.140369 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134454 2562 flags.go:64] FLAG: --tls-min-version=""
Apr 16 23:50:28.140369 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134457 2562 flags.go:64] FLAG: --tls-private-key-file=""
Apr 16 23:50:28.140369 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134459 2562 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 16 23:50:28.140369 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134462 2562 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 16 23:50:28.140369 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134465 2562 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 16 23:50:28.140369 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134468 2562 flags.go:64] FLAG: --v="2"
Apr 16 23:50:28.140369 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134473 2562 flags.go:64] FLAG: --version="false"
Apr 16 23:50:28.140369 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134477 2562 flags.go:64] FLAG: --vmodule=""
Apr 16 23:50:28.140369 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134481 2562 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 16 23:50:28.140369 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134484 2562 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 16 23:50:28.140369 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134577 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 23:50:28.140369 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134581 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 23:50:28.140369 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134585 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 23:50:28.140369 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134588 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 23:50:28.140369 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134591 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 23:50:28.140369 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134594 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 23:50:28.140369 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134597 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 23:50:28.140369 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134600 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 23:50:28.140369 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134602 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 23:50:28.140369 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134605 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 23:50:28.140369 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134609 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 23:50:28.140369 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134612 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 23:50:28.140917 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134614 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 23:50:28.140917 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134617 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 23:50:28.140917 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134619 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 23:50:28.140917 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134622 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 23:50:28.140917 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134625 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 23:50:28.140917 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134628 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 23:50:28.140917 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134630 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 23:50:28.140917 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134633 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 23:50:28.140917 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134635 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 23:50:28.140917 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134640 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 23:50:28.140917 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134642 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 23:50:28.140917 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134645 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 23:50:28.140917 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134648 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 23:50:28.140917 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134651 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 23:50:28.140917 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134654 2562 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 23:50:28.140917 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134657 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 23:50:28.140917 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134659 2562 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 23:50:28.140917 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134662 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 23:50:28.140917 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134664 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 23:50:28.140917 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134668 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 23:50:28.141441 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134672 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 23:50:28.141441 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134674 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 23:50:28.141441 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134677 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 23:50:28.141441 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134680 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 23:50:28.141441 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134683 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 23:50:28.141441 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134687 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 23:50:28.141441 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134689 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 23:50:28.141441 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134692 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 23:50:28.141441 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134694 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 23:50:28.141441 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134697 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 23:50:28.141441 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134701 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 23:50:28.141441 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134704 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 23:50:28.141441 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134706 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 23:50:28.141441 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134709 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 23:50:28.141441 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134711 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 23:50:28.141441 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134714 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 23:50:28.141441 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134716 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 23:50:28.141441 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134719 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 23:50:28.141441 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134721 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 23:50:28.141441 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134723 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 23:50:28.141935 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134726 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 23:50:28.141935 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134730 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 23:50:28.141935 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134732 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 23:50:28.141935 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134735 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 23:50:28.141935 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134737 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 23:50:28.141935 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134740 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 23:50:28.141935 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134742 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 23:50:28.141935 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134745 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 23:50:28.141935 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134747 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 23:50:28.141935 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134749 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 23:50:28.141935 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134752 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 23:50:28.141935 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134755 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 23:50:28.141935 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134757 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 23:50:28.141935 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134760 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 23:50:28.141935 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134762 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 23:50:28.141935 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134765 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 23:50:28.141935 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134768 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 23:50:28.141935 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134770 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 23:50:28.141935 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134773 2562 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 23:50:28.142429 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134776 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 23:50:28.142429 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134778 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 23:50:28.142429 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134781 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 23:50:28.142429 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134784 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 23:50:28.142429 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134787 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 23:50:28.142429 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134790 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 23:50:28.142429 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134792 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 23:50:28.142429 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134795 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 23:50:28.142429 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134797 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 23:50:28.142429 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134800 2562 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 23:50:28.142429 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134802 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 23:50:28.142429 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134805 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 23:50:28.142429 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134808 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 23:50:28.142429 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134810 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 23:50:28.142429 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.134815 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 23:50:28.142429 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.134820 2562 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 23:50:28.142829 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.141985 2562 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 23:50:28.142829 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.142002 2562 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 23:50:28.142829 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142048 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 23:50:28.142829 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142054 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 23:50:28.142829 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142059 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 23:50:28.142829 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142062 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 23:50:28.142829 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142066 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 23:50:28.142829 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142069 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 23:50:28.142829 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142072 2562 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 23:50:28.142829 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142075 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 23:50:28.142829 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142078 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 23:50:28.142829 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142081 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 23:50:28.142829 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142083 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 23:50:28.142829 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142086 2562 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 23:50:28.142829 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142088 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 23:50:28.142829 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142091 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 23:50:28.142829 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142093 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 23:50:28.142829 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142096 2562 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 23:50:28.142829 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142099 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 23:50:28.142829 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142101 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 23:50:28.143389 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142104 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 23:50:28.143389 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142107 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 23:50:28.143389 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142110 2562 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 23:50:28.143389 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142112 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 23:50:28.143389 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142115 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 23:50:28.143389 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142117 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 23:50:28.143389 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142120 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 23:50:28.143389 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142123 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 23:50:28.143389 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142126 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 23:50:28.143389 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142128 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 23:50:28.143389 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142131 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 23:50:28.143389 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142133 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 23:50:28.143389 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142136 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 23:50:28.143389 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142140 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 23:50:28.143389 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142143 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 23:50:28.143389 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142146 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 23:50:28.143389 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142148 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 23:50:28.143389 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142151 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 23:50:28.143389 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142169 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 23:50:28.143389 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142173 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 23:50:28.143939 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142176 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 23:50:28.143939 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142179 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 23:50:28.143939 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142182 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 23:50:28.143939 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142185 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 23:50:28.143939 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142198 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 23:50:28.143939 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142202 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 23:50:28.143939 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142205 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 23:50:28.143939 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142208 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 23:50:28.143939 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142211 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 23:50:28.143939 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142214 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 23:50:28.143939 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142216 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 23:50:28.143939 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142219 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 23:50:28.143939 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142221 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 23:50:28.143939 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142224 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 23:50:28.143939 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142226 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 23:50:28.143939 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142229 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 23:50:28.143939 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142232 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 23:50:28.143939 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142235 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 23:50:28.143939 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142238 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 23:50:28.144420 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142240 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 23:50:28.144420 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142242 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 23:50:28.144420 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142245 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 23:50:28.144420 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142248 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 23:50:28.144420 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142250 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 23:50:28.144420 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142253 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 23:50:28.144420 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142256 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 23:50:28.144420 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142259 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 23:50:28.144420 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142262 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 23:50:28.144420 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142265 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 23:50:28.144420 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142267 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 23:50:28.144420 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142270 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 23:50:28.144420 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142273 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 23:50:28.144420 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142275 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 23:50:28.144420 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142278 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 23:50:28.144420 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142281 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 23:50:28.144420 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142283 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 23:50:28.144420 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142286 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 23:50:28.144420 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142288 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 23:50:28.144420 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142291 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 23:50:28.144925 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142293 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 23:50:28.144925 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142296 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 23:50:28.144925 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142299 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 23:50:28.144925 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142301 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 23:50:28.144925 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142304 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 23:50:28.144925 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142306 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 23:50:28.144925 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142309 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 23:50:28.144925 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142311 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 23:50:28.144925 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142314 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 23:50:28.144925 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.142319 2562 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 23:50:28.144925 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142419 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 23:50:28.144925 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142424 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 23:50:28.144925 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142427 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 23:50:28.144925 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142430 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 23:50:28.144925 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142433 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 23:50:28.144925 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142436 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 23:50:28.145359 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142438 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 23:50:28.145359 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142441 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 23:50:28.145359 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142444 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 23:50:28.145359 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142447 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 23:50:28.145359 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142450 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 23:50:28.145359 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142453 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 23:50:28.145359 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142456 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 23:50:28.145359 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142458 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 23:50:28.145359 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142461 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 23:50:28.145359 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142464 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 23:50:28.145359 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142468 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 23:50:28.145359 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142472 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 23:50:28.145359 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142475 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 23:50:28.145359 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142478 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 23:50:28.145359 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142480 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 23:50:28.145359 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142483 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 23:50:28.145359 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142486 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 23:50:28.145359 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142488 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 23:50:28.145359 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142491 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 23:50:28.145840 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142493 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 23:50:28.145840 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142496 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 23:50:28.145840 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142498 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 23:50:28.145840 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142501 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 23:50:28.145840 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142503 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 23:50:28.145840 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142506 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 23:50:28.145840 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142508 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 23:50:28.145840 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142511 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 23:50:28.145840 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142513 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 23:50:28.145840 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142516 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 23:50:28.145840 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142518 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 23:50:28.145840 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142521 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 23:50:28.145840 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142523 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 23:50:28.145840 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142526 2562 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 23:50:28.145840 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142529 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 23:50:28.145840 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142532 2562 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 23:50:28.145840 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142535 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 23:50:28.145840 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142537 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 23:50:28.145840 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142540 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 23:50:28.146317 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142543 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 23:50:28.146317 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142545 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 23:50:28.146317 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142548 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 23:50:28.146317 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142551 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 23:50:28.146317 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142553 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 23:50:28.146317 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142555 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 23:50:28.146317 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142558 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 23:50:28.146317 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142560 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 23:50:28.146317 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142563 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 23:50:28.146317 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142565 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 23:50:28.146317 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142568 2562 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 23:50:28.146317 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142570 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 23:50:28.146317 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142573 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 23:50:28.146317 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142575 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 23:50:28.146317 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142578 2562 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 23:50:28.146317 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142580 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 23:50:28.146317 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142582 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 23:50:28.146317 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142585 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 23:50:28.146317 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142587 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 23:50:28.146317 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142590 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 23:50:28.146816 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142592 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 23:50:28.146816 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142596 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 23:50:28.146816 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142599 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 23:50:28.146816 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142602 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 23:50:28.146816 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142604 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 23:50:28.146816 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142607 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 23:50:28.146816 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142609 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 23:50:28.146816 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142611 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 23:50:28.146816 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142614 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 23:50:28.146816 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142617 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 23:50:28.146816 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142619 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 23:50:28.146816 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142622 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 23:50:28.146816 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142624 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 23:50:28.146816 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142627 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 23:50:28.146816 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142630 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 23:50:28.146816 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142632 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 23:50:28.146816 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142635 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 23:50:28.146816 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142637 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 23:50:28.146816 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142639 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 23:50:28.146816 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142642 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 23:50:28.147306 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142644 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 23:50:28.147306 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:28.142647 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 23:50:28.147306 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.142652 2562 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 23:50:28.147306 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.143308 2562 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 23:50:28.147306 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.145258 2562 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 23:50:28.147306 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.146287 2562 server.go:1019] "Starting client certificate rotation"
Apr 16 23:50:28.147306 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.146376 2562 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 23:50:28.147306 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.146405 2562 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 23:50:28.171481 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.171464 2562 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 23:50:28.177029 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.176947 2562 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 23:50:28.189978 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.189956 2562 log.go:25] "Validated CRI v1 runtime API"
Apr 16 23:50:28.195779 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.195765 2562 log.go:25] "Validated CRI v1 image API"
Apr 16 23:50:28.197017 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.196997 2562 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 23:50:28.201431 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.201412 2562 fs.go:135] Filesystem UUIDs: map[27976ded-7c45-446c-9450-d2d2274d3c32:/dev/nvme0n1p3 5b23e400-7b77-4511-8b34-e73cb540d776:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 16 23:50:28.201514 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.201430 2562 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 23:50:28.201794 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.201769 2562 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 23:50:28.207450 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.207347 2562 manager.go:217] Machine: {Timestamp:2026-04-16 23:50:28.205690075 +0000 UTC m=+0.414454741 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3094074 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec23faf9c5dc2c420924eaa02c77ded4 SystemUUID:ec23faf9-c5dc-2c42-0924-eaa02c77ded4 BootID:6506f847-f893-4661-a797-e7d5ddfa285b Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:fd:e2:c1:e6:39 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:fd:e2:c1:e6:39 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:5e:25:f8:77:57:00 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 23:50:28.207450 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.207444 2562 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 23:50:28.207593 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.207511 2562 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 23:50:28.209928 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.209909 2562 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 23:50:28.210063 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.209930 2562 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-98.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 23:50:28.210105 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.210069 2562 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 23:50:28.210105 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.210078 2562 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 23:50:28.210105 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.210091 2562 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 23:50:28.211567 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.211557 2562 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 23:50:28.213261 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.213251 2562 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 23:50:28.213375 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.213366 2562 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 23:50:28.216151 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.216142 2562 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 23:50:28.216184 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.216158 2562 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 23:50:28.216184 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.216172 2562 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 23:50:28.216184 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.216184 2562 kubelet.go:397] "Adding apiserver pod source"
Apr 16 23:50:28.216317 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.216216 2562 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 23:50:28.217273 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.217262 2562 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 23:50:28.217309 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.217280 2562 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 23:50:28.220313 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.220299 2562 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 23:50:28.220560 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.220546 2562 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-k4gfz"
Apr 16 23:50:28.222061 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.222048 2562 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 23:50:28.223421 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.223410 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 23:50:28.223481 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.223427 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 23:50:28.223481 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.223434 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 23:50:28.223481 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.223439 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 23:50:28.223481 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.223445 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 23:50:28.223481 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.223450 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 23:50:28.223481 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.223455 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 23:50:28.223481 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.223461 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 23:50:28.223481 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.223468 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 23:50:28.223481 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.223474 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 23:50:28.223736 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.223490 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 23:50:28.223736 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.223499 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 23:50:28.226551 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.226533 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 23:50:28.226658 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.226596 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 23:50:28.227729 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.227709 2562 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-k4gfz"
Apr 16 23:50:28.230807 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.230789 2562 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 23:50:28.230903 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.230838 2562 server.go:1295] "Started kubelet"
Apr 16 23:50:28.230903 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.230852 2562 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-98.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 23:50:28.230997 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.230962 2562 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 23:50:28.231036 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.231011 2562 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 23:50:28.231215 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.231170 2562 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 23:50:28.231801 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:28.231754 2562 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-98.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 23:50:28.231801 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:28.231753 2562 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 23:50:28.231958 ip-10-0-128-98 systemd[1]: Started Kubernetes Kubelet.
Apr 16 23:50:28.232202 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.232172 2562 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 23:50:28.234011 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.233994 2562 server.go:317] "Adding debug handlers to kubelet server" Apr 16 23:50:28.237408 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.237393 2562 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 23:50:28.237483 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.237411 2562 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 23:50:28.238098 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.237959 2562 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 23:50:28.238098 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.237981 2562 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 23:50:28.238098 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.237984 2562 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 23:50:28.238300 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.238109 2562 reconstruct.go:97] "Volume reconstruction finished" Apr 16 23:50:28.238300 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.238119 2562 reconciler.go:26] "Reconciler: start to sync state" Apr 16 23:50:28.239907 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.239885 2562 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 23:50:28.240066 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:28.240046 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-98.ec2.internal\" not found" Apr 16 23:50:28.240185 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.240164 2562 factory.go:55] Registering systemd factory Apr 16 23:50:28.240286 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.240205 2562 factory.go:223] Registration of the systemd container factory successfully Apr 16 23:50:28.240884 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.240867 2562 factory.go:153] Registering CRI-O factory Apr 16 23:50:28.240884 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.240886 2562 factory.go:223] Registration of the crio container factory successfully Apr 16 23:50:28.241014 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.240934 2562 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 23:50:28.241014 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.240968 2562 factory.go:103] Registering Raw factory Apr 16 23:50:28.241014 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.240982 2562 manager.go:1196] Started watching for new ooms in manager Apr 16 23:50:28.242173 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.242158 2562 manager.go:319] Starting recovery of all containers Apr 16 23:50:28.243073 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:28.242667 2562 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-128-98.ec2.internal\" not found" node="ip-10-0-128-98.ec2.internal" Apr 16 23:50:28.253719 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.253594 2562 manager.go:324] Recovery completed Apr 16 23:50:28.258596 
ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.258582 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 23:50:28.261010 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.260993 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-98.ec2.internal" event="NodeHasSufficientMemory" Apr 16 23:50:28.261065 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.261028 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-98.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 23:50:28.261065 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.261045 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-98.ec2.internal" event="NodeHasSufficientPID" Apr 16 23:50:28.261532 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.261514 2562 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 23:50:28.261532 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.261527 2562 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 23:50:28.261667 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.261545 2562 state_mem.go:36] "Initialized new in-memory state store" Apr 16 23:50:28.263858 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.263844 2562 policy_none.go:49] "None policy: Start" Apr 16 23:50:28.263923 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.263863 2562 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 23:50:28.263923 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.263875 2562 state_mem.go:35] "Initializing new in-memory state store" Apr 16 23:50:28.295742 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.295729 2562 manager.go:341] "Starting Device Plugin manager" Apr 16 23:50:28.309136 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:28.295754 2562 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 23:50:28.309136 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.295763 2562 server.go:85] "Starting device plugin registration server" Apr 16 23:50:28.309136 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.295989 2562 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 23:50:28.309136 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.296001 2562 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 23:50:28.309136 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.296128 2562 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 23:50:28.309136 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.296217 2562 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 23:50:28.309136 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.296228 2562 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 23:50:28.309136 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:28.296806 2562 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 16 23:50:28.309136 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:28.296882 2562 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-98.ec2.internal\" not found" Apr 16 23:50:28.346946 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.346917 2562 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 23:50:28.348012 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.347995 2562 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 23:50:28.348095 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.348021 2562 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 23:50:28.348095 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.348040 2562 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 23:50:28.348095 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.348046 2562 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 23:50:28.348095 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:28.348077 2562 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 23:50:28.350743 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.350724 2562 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 23:50:28.396707 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.396655 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 23:50:28.397582 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.397568 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-98.ec2.internal" event="NodeHasSufficientMemory" Apr 16 23:50:28.397645 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.397596 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-98.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 23:50:28.397645 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.397609 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-98.ec2.internal" event="NodeHasSufficientPID" Apr 16 23:50:28.397645 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.397639 2562 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-98.ec2.internal" Apr 16 23:50:28.406426 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.406407 2562 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-98.ec2.internal" Apr 16 23:50:28.406480 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:28.406430 2562 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-98.ec2.internal\": node \"ip-10-0-128-98.ec2.internal\" not found" Apr 16 23:50:28.419877 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:28.419856 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-98.ec2.internal\" not found" Apr 16 23:50:28.448381 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.448362 2562 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-128-98.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-98.ec2.internal"] Apr 16 23:50:28.448450 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.448434 2562 kubelet_node_status.go:413] 
"Setting node annotation to enable volume controller attach/detach" Apr 16 23:50:28.449134 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.449120 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-98.ec2.internal" event="NodeHasSufficientMemory" Apr 16 23:50:28.449213 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.449145 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-98.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 23:50:28.449213 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.449154 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-98.ec2.internal" event="NodeHasSufficientPID" Apr 16 23:50:28.450501 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.450490 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 23:50:28.450623 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.450608 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-98.ec2.internal" Apr 16 23:50:28.450658 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.450634 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 23:50:28.451137 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.451117 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-98.ec2.internal" event="NodeHasSufficientMemory" Apr 16 23:50:28.451233 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.451146 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-98.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 23:50:28.451233 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.451157 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-98.ec2.internal" event="NodeHasSufficientPID" Apr 16 23:50:28.451233 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.451117 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-98.ec2.internal" event="NodeHasSufficientMemory" Apr 16 23:50:28.451233 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.451219 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-98.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 23:50:28.451233 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.451232 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-98.ec2.internal" event="NodeHasSufficientPID" Apr 16 23:50:28.452388 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.452372 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-98.ec2.internal" Apr 16 23:50:28.452434 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.452406 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 23:50:28.452995 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.452979 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-98.ec2.internal" event="NodeHasSufficientMemory" Apr 16 23:50:28.453060 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.453005 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-98.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 23:50:28.453060 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.453016 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-98.ec2.internal" event="NodeHasSufficientPID" Apr 16 23:50:28.467710 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:28.467691 2562 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-98.ec2.internal\" not found" node="ip-10-0-128-98.ec2.internal" Apr 16 23:50:28.471335 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:28.471320 2562 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-98.ec2.internal\" not found" node="ip-10-0-128-98.ec2.internal" Apr 16 23:50:28.520175 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:28.520156 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-98.ec2.internal\" not found" Apr 16 23:50:28.539359 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.539339 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1143ec080bc3618a70cd0c3fccc6942e-config\") pod \"kube-apiserver-proxy-ip-10-0-128-98.ec2.internal\" (UID: \"1143ec080bc3618a70cd0c3fccc6942e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-98.ec2.internal" Apr 16 23:50:28.620642 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:28.620619 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-98.ec2.internal\" not found" Apr 16 23:50:28.639962 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.639942 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1143ec080bc3618a70cd0c3fccc6942e-config\") pod \"kube-apiserver-proxy-ip-10-0-128-98.ec2.internal\" (UID: \"1143ec080bc3618a70cd0c3fccc6942e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-98.ec2.internal" Apr 16 23:50:28.640025 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.639985 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1143ec080bc3618a70cd0c3fccc6942e-config\") pod \"kube-apiserver-proxy-ip-10-0-128-98.ec2.internal\" (UID: \"1143ec080bc3618a70cd0c3fccc6942e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-98.ec2.internal" Apr 16 23:50:28.640072 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.640057 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/04daa51138142523c442144d3720f116-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-98.ec2.internal\" (UID: 
\"04daa51138142523c442144d3720f116\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-98.ec2.internal" Apr 16 23:50:28.640107 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.640082 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/04daa51138142523c442144d3720f116-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-98.ec2.internal\" (UID: \"04daa51138142523c442144d3720f116\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-98.ec2.internal" Apr 16 23:50:28.721376 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:28.721318 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-98.ec2.internal\" not found" Apr 16 23:50:28.740645 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.740626 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/04daa51138142523c442144d3720f116-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-98.ec2.internal\" (UID: \"04daa51138142523c442144d3720f116\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-98.ec2.internal" Apr 16 23:50:28.740708 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.740650 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/04daa51138142523c442144d3720f116-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-98.ec2.internal\" (UID: \"04daa51138142523c442144d3720f116\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-98.ec2.internal" Apr 16 23:50:28.740708 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.740676 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/04daa51138142523c442144d3720f116-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-98.ec2.internal\" (UID: \"04daa51138142523c442144d3720f116\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-98.ec2.internal" Apr 16 23:50:28.740770 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.740714 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/04daa51138142523c442144d3720f116-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-98.ec2.internal\" (UID: \"04daa51138142523c442144d3720f116\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-98.ec2.internal" Apr 16 23:50:28.769802 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.769784 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-98.ec2.internal" Apr 16 23:50:28.773228 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:28.773214 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-98.ec2.internal" Apr 16 23:50:28.821797 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:28.821770 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-98.ec2.internal\" not found" Apr 16 23:50:28.922316 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:28.922290 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-98.ec2.internal\" not found" Apr 16 23:50:29.022748 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:29.022726 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-98.ec2.internal\" not found" Apr 16 23:50:29.123296 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:29.123269 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-98.ec2.internal\" not found" Apr 16 23:50:29.145733 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:29.145716 2562 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 23:50:29.146237 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:29.145859 2562 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 23:50:29.146237 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:29.145904 2562 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 23:50:29.224266 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:29.224240 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-98.ec2.internal\" not found" Apr 16 23:50:29.231433 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:29.231394 2562 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 23:45:28 +0000 UTC" deadline="2027-10-19 09:12:06.826817756 +0000 UTC" Apr 16 23:50:29.231523 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:29.231436 2562 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13209h21m37.595388389s" Apr 16 23:50:29.234135 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:29.234106 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1143ec080bc3618a70cd0c3fccc6942e.slice/crio-3d79adebe13ea84fff8912b88abb0cfc8e356eb69b7c404b5c6663b75c0fde60 WatchSource:0}: Error finding container 3d79adebe13ea84fff8912b88abb0cfc8e356eb69b7c404b5c6663b75c0fde60: Status 404 returned error can't find the container with id 3d79adebe13ea84fff8912b88abb0cfc8e356eb69b7c404b5c6663b75c0fde60 Apr 16 23:50:29.234564 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:29.234543 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04daa51138142523c442144d3720f116.slice/crio-840edeed9f8ec657ca4c37b7bc7d2cb7662108ce38523c71aebaa018a2aeb3c7 WatchSource:0}: Error finding container 
840edeed9f8ec657ca4c37b7bc7d2cb7662108ce38523c71aebaa018a2aeb3c7: Status 404 returned error can't find the container with id 840edeed9f8ec657ca4c37b7bc7d2cb7662108ce38523c71aebaa018a2aeb3c7 Apr 16 23:50:29.237827 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:29.237809 2562 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 23:50:29.238861 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:29.238849 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 23:50:29.249238 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:29.249220 2562 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 23:50:29.267950 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:29.267907 2562 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-kb6bj" Apr 16 23:50:29.273020 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:29.272998 2562 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-kb6bj" Apr 16 23:50:29.285640 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:29.285627 2562 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 23:50:29.324777 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:29.324745 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-98.ec2.internal\" not found" Apr 16 23:50:29.351244 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:29.351202 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-98.ec2.internal" event={"ID":"04daa51138142523c442144d3720f116","Type":"ContainerStarted","Data":"840edeed9f8ec657ca4c37b7bc7d2cb7662108ce38523c71aebaa018a2aeb3c7"} Apr 16 23:50:29.351978 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:29.351954 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-98.ec2.internal" event={"ID":"1143ec080bc3618a70cd0c3fccc6942e","Type":"ContainerStarted","Data":"3d79adebe13ea84fff8912b88abb0cfc8e356eb69b7c404b5c6663b75c0fde60"} Apr 16 23:50:29.425150 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:29.425130 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-98.ec2.internal\" not found" Apr 16 23:50:29.435695 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:29.435678 2562 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 23:50:29.438156 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:29.438143 2562 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-98.ec2.internal" Apr 16 23:50:29.447915 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:29.447900 2562 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 23:50:29.448791 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:29.448779 2562 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-98.ec2.internal" Apr 16 23:50:29.457295 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:29.457282 2562 
warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 23:50:30.037640 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.037612 2562 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 23:50:30.126371 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.126339 2562 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 23:50:30.217348 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.217321 2562 apiserver.go:52] "Watching apiserver" Apr 16 23:50:30.223296 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.223272 2562 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 23:50:30.223739 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.223718 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-h2hdz","kube-system/kube-apiserver-proxy-ip-10-0-128-98.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wzxkh","openshift-multus/multus-additional-cni-plugins-kwxk9","openshift-multus/multus-mxms2","openshift-multus/network-metrics-daemon-f6rhw","openshift-network-diagnostics/network-check-target-k9xr6","openshift-cluster-node-tuning-operator/tuned-rdxqw","openshift-dns/node-resolver-mhvhh","openshift-image-registry/node-ca-hb7s9","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-98.ec2.internal","openshift-network-operator/iptables-alerter-jsjxw","openshift-ovn-kubernetes/ovnkube-node-frzl2"] Apr 16 23:50:30.225431 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.225404 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9xr6" Apr 16 23:50:30.225553 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:30.225491 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k9xr6" podUID="9df8e85f-0f0a-41bb-af0a-cae68056131b" Apr 16 23:50:30.226699 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.226680 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wzxkh" Apr 16 23:50:30.227920 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.227895 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kwxk9" Apr 16 23:50:30.228859 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.228838 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 23:50:30.228950 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.228872 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 23:50:30.228950 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.228944 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-zf2rz\"" Apr 16 23:50:30.229157 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.229139 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.229442 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.229339 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 23:50:30.229707 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.229688 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 23:50:30.229803 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.229694 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-fj65d\"" Apr 16 23:50:30.230028 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.230011 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 23:50:30.230110 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.230056 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 23:50:30.230159 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.230122 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 23:50:30.230159 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.230013 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 23:50:30.230801 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.230781 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 23:50:30.231347 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.231331 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-46v7t\"" Apr 16 23:50:30.231433 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.231406 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6rhw" Apr 16 23:50:30.231499 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:30.231469 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f6rhw" podUID="86301427-2e66-4c3a-ab22-55ef8ddc5580" Apr 16 23:50:30.231555 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.231529 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-h2hdz" Apr 16 23:50:30.232679 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.232657 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.233282 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.233261 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 23:50:30.233369 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.233344 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-wtjp4\"" Apr 16 23:50:30.233493 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.233474 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 23:50:30.233891 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.233873 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-mhvhh" Apr 16 23:50:30.234373 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.234355 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 23:50:30.234662 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.234638 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 23:50:30.234745 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.234665 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-pz5vg\"" Apr 16 23:50:30.235776 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.235751 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-gj2zl\"" Apr 16 23:50:30.236409 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.235894 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 23:50:30.236409 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.235930 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 23:50:30.236409 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.236060 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-hb7s9" Apr 16 23:50:30.237342 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.237324 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-jsjxw" Apr 16 23:50:30.238602 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.238343 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 23:50:30.238688 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.238614 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 23:50:30.239277 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.239257 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 23:50:30.240572 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.240551 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-9zcd5\"" Apr 16 23:50:30.240986 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.240965 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 23:50:30.241096 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.241033 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 23:50:30.241096 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.241060 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.242120 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.241361 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-lnb8h\"" Apr 16 23:50:30.242120 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.241962 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 23:50:30.243717 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.243699 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 23:50:30.243804 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.243792 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 23:50:30.243898 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.243880 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 23:50:30.244120 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.243882 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 23:50:30.244230 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.243964 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 23:50:30.244230 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.244156 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 23:50:30.245063 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.245044 2562 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-tc9gp\"" Apr 16 23:50:30.247540 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.247519 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f1f1c9a5-595b-45a3-b80e-6c39f43d9778-device-dir\") pod \"aws-ebs-csi-driver-node-wzxkh\" (UID: \"f1f1c9a5-595b-45a3-b80e-6c39f43d9778\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wzxkh" Apr 16 23:50:30.247634 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.247557 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpwvv\" (UniqueName: \"kubernetes.io/projected/94b51208-78d1-423e-a0fd-88a34e175744-kube-api-access-zpwvv\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.247634 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.247583 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-var-lib-kubelet\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.247634 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.247609 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz7gx\" (UniqueName: \"kubernetes.io/projected/e55c5371-047f-4464-8977-501dfa689dd1-kube-api-access-kz7gx\") pod \"node-resolver-mhvhh\" (UID: \"e55c5371-047f-4464-8977-501dfa689dd1\") " pod="openshift-dns/node-resolver-mhvhh" Apr 16 23:50:30.247634 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.247633 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-host-var-lib-kubelet\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.247849 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.247656 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-multus-cni-dir\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.247849 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.247680 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f1f1c9a5-595b-45a3-b80e-6c39f43d9778-socket-dir\") pod \"aws-ebs-csi-driver-node-wzxkh\" (UID: \"f1f1c9a5-595b-45a3-b80e-6c39f43d9778\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wzxkh" Apr 16 23:50:30.247849 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.247702 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shqzd\" (UniqueName: \"kubernetes.io/projected/e3f26c03-4fef-458e-8cf1-6d18222c3545-kube-api-access-shqzd\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.247849 ip-10-0-128-98 kubenswrapper[2562]: I0416 
23:50:30.247736 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86301427-2e66-4c3a-ab22-55ef8ddc5580-metrics-certs\") pod \"network-metrics-daemon-f6rhw\" (UID: \"86301427-2e66-4c3a-ab22-55ef8ddc5580\") " pod="openshift-multus/network-metrics-daemon-f6rhw" Apr 16 23:50:30.247849 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.247763 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/94b51208-78d1-423e-a0fd-88a34e175744-ovnkube-script-lib\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.247849 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.247782 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/987f3171-af5b-41d5-91e1-d31f667a8755-cnibin\") pod \"multus-additional-cni-plugins-kwxk9\" (UID: \"987f3171-af5b-41d5-91e1-d31f667a8755\") " pod="openshift-multus/multus-additional-cni-plugins-kwxk9" Apr 16 23:50:30.247849 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.247803 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/987f3171-af5b-41d5-91e1-d31f667a8755-cni-binary-copy\") pod \"multus-additional-cni-plugins-kwxk9\" (UID: \"987f3171-af5b-41d5-91e1-d31f667a8755\") " pod="openshift-multus/multus-additional-cni-plugins-kwxk9" Apr 16 23:50:30.247849 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.247828 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-lib-modules\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.247849 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.247849 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-host-kubelet\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.248255 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.247872 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-run-systemd\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.248255 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.247895 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/94b51208-78d1-423e-a0fd-88a34e175744-ovnkube-config\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.248255 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.247933 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/94b51208-78d1-423e-a0fd-88a34e175744-env-overrides\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.248255 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.247982 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/987f3171-af5b-41d5-91e1-d31f667a8755-system-cni-dir\") pod \"multus-additional-cni-plugins-kwxk9\" (UID: \"987f3171-af5b-41d5-91e1-d31f667a8755\") " pod="openshift-multus/multus-additional-cni-plugins-kwxk9" Apr 16 23:50:30.248255 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.248018 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-etc-sysconfig\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.248255 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.248066 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-run\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.248255 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.248092 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-tmp\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.248255 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.248115 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/22ac9fa0-d332-4e5c-99d7-e1ac90e53356-iptables-alerter-script\") pod \"iptables-alerter-jsjxw\" (UID: \"22ac9fa0-d332-4e5c-99d7-e1ac90e53356\") " pod="openshift-network-operator/iptables-alerter-jsjxw" Apr 16 23:50:30.248255 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.248145 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw77d\" (UniqueName: \"kubernetes.io/projected/22ac9fa0-d332-4e5c-99d7-e1ac90e53356-kube-api-access-mw77d\") pod \"iptables-alerter-jsjxw\" (UID: \"22ac9fa0-d332-4e5c-99d7-e1ac90e53356\") " pod="openshift-network-operator/iptables-alerter-jsjxw" Apr 16 23:50:30.248255 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.248159 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-host-run-netns\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.248255 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.248201 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-os-release\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " 
pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.248255 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.248221 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e3f26c03-4fef-458e-8cf1-6d18222c3545-cni-binary-copy\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.248255 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.248235 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-hostroot\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.248255 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.248249 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-etc-systemd\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.248844 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.248263 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bthtp\" (UniqueName: \"kubernetes.io/projected/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-kube-api-access-bthtp\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.248844 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.248300 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-systemd-units\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.248844 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.248337 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/987f3171-af5b-41d5-91e1-d31f667a8755-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kwxk9\" (UID: \"987f3171-af5b-41d5-91e1-d31f667a8755\") " pod="openshift-multus/multus-additional-cni-plugins-kwxk9" Apr 16 23:50:30.248844 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.248362 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/987f3171-af5b-41d5-91e1-d31f667a8755-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kwxk9\" (UID: \"987f3171-af5b-41d5-91e1-d31f667a8755\") " pod="openshift-multus/multus-additional-cni-plugins-kwxk9" Apr 16 23:50:30.248844 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.248402 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-sys\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.248844 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.248441 2562 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hxxv\" (UniqueName: \"kubernetes.io/projected/ac3d5de2-ce49-4327-89f8-f4642c3bc2f3-kube-api-access-9hxxv\") pod \"node-ca-hb7s9\" (UID: \"ac3d5de2-ce49-4327-89f8-f4642c3bc2f3\") " pod="openshift-image-registry/node-ca-hb7s9" Apr 16 23:50:30.248844 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.248463 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/22ac9fa0-d332-4e5c-99d7-e1ac90e53356-host-slash\") pod \"iptables-alerter-jsjxw\" (UID: \"22ac9fa0-d332-4e5c-99d7-e1ac90e53356\") " pod="openshift-network-operator/iptables-alerter-jsjxw" Apr 16 23:50:30.248844 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.248481 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-log-socket\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.248844 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.248506 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-host-cni-bin\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.248844 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.248529 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-multus-socket-dir-parent\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.248844 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.248552 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-host-run-netns\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.248844 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.248575 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-host-var-lib-cni-bin\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.248844 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.248598 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e6f655b8-f886-471e-9fd2-4685907346a7-konnectivity-ca\") pod \"konnectivity-agent-h2hdz\" (UID: \"e6f655b8-f886-471e-9fd2-4685907346a7\") " pod="kube-system/konnectivity-agent-h2hdz" Apr 16 23:50:30.248844 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.248621 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-system-cni-dir\") pod \"multus-mxms2\" (UID: 
\"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.248844 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.248644 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-cnibin\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.248844 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.248664 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-host-run-multus-certs\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.248844 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.248700 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-etc-kubernetes\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.249493 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.248728 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-etc-modprobe-d\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.249493 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.248749 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-etc-tuned\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.249493 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.248812 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e55c5371-047f-4464-8977-501dfa689dd1-tmp-dir\") pod \"node-resolver-mhvhh\" (UID: \"e55c5371-047f-4464-8977-501dfa689dd1\") " pod="openshift-dns/node-resolver-mhvhh" Apr 16 23:50:30.249493 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.248849 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ac3d5de2-ce49-4327-89f8-f4642c3bc2f3-host\") pod \"node-ca-hb7s9\" (UID: \"ac3d5de2-ce49-4327-89f8-f4642c3bc2f3\") " pod="openshift-image-registry/node-ca-hb7s9" Apr 16 23:50:30.249493 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.248873 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9g9l\" (UniqueName: \"kubernetes.io/projected/86301427-2e66-4c3a-ab22-55ef8ddc5580-kube-api-access-d9g9l\") pod \"network-metrics-daemon-f6rhw\" (UID: \"86301427-2e66-4c3a-ab22-55ef8ddc5580\") " pod="openshift-multus/network-metrics-daemon-f6rhw" Apr 16 23:50:30.249493 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.248895 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-host-slash\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.249493 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.248918 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-run-openvswitch\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.249493 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.248952 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-host-run-ovn-kubernetes\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.249493 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.248977 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-etc-kubernetes\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.249493 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.249005 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-etc-sysctl-d\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.249493 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.249029 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f1f1c9a5-595b-45a3-b80e-6c39f43d9778-registration-dir\") pod \"aws-ebs-csi-driver-node-wzxkh\" (UID: \"f1f1c9a5-595b-45a3-b80e-6c39f43d9778\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wzxkh" Apr 16 23:50:30.249493 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.249062 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-etc-openvswitch\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.249493 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.249085 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-run-ovn\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.249493 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.249108 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.249493 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.249132 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/987f3171-af5b-41d5-91e1-d31f667a8755-os-release\") pod \"multus-additional-cni-plugins-kwxk9\" (UID: \"987f3171-af5b-41d5-91e1-d31f667a8755\") " pod="openshift-multus/multus-additional-cni-plugins-kwxk9" Apr 16 23:50:30.249493 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.249171 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dzlh\" (UniqueName: \"kubernetes.io/projected/987f3171-af5b-41d5-91e1-d31f667a8755-kube-api-access-7dzlh\") pod \"multus-additional-cni-plugins-kwxk9\" (UID: \"987f3171-af5b-41d5-91e1-d31f667a8755\") " pod="openshift-multus/multus-additional-cni-plugins-kwxk9" Apr 16 23:50:30.250031 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.249217 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-host-var-lib-cni-multus\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.250031 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.249240 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e3f26c03-4fef-458e-8cf1-6d18222c3545-multus-daemon-config\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.250031 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.249256 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e55c5371-047f-4464-8977-501dfa689dd1-hosts-file\") pod \"node-resolver-mhvhh\" (UID: \"e55c5371-047f-4464-8977-501dfa689dd1\") " pod="openshift-dns/node-resolver-mhvhh" Apr 16 23:50:30.250031 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.249280 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f1f1c9a5-595b-45a3-b80e-6c39f43d9778-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wzxkh\" (UID: \"f1f1c9a5-595b-45a3-b80e-6c39f43d9778\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wzxkh" Apr 16 23:50:30.250031 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.249299 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f1f1c9a5-595b-45a3-b80e-6c39f43d9778-etc-selinux\") pod \"aws-ebs-csi-driver-node-wzxkh\" (UID: \"f1f1c9a5-595b-45a3-b80e-6c39f43d9778\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wzxkh" Apr 16 23:50:30.250031 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.249319 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-host-cni-netd\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.250031 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.249341 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/987f3171-af5b-41d5-91e1-d31f667a8755-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kwxk9\" (UID: \"987f3171-af5b-41d5-91e1-d31f667a8755\") " pod="openshift-multus/multus-additional-cni-plugins-kwxk9" Apr 16 23:50:30.250031 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.249362 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-multus-conf-dir\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.250031 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.249384 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-etc-sysctl-conf\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.250031 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.249405 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkvpw\" (UniqueName: \"kubernetes.io/projected/f1f1c9a5-595b-45a3-b80e-6c39f43d9778-kube-api-access-fkvpw\") pod \"aws-ebs-csi-driver-node-wzxkh\" (UID: \"f1f1c9a5-595b-45a3-b80e-6c39f43d9778\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wzxkh" Apr 16 23:50:30.250031 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.249419 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-var-lib-openvswitch\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.250031 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.249439 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/94b51208-78d1-423e-a0fd-88a34e175744-ovn-node-metrics-cert\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.250031 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.249457 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f1f1c9a5-595b-45a3-b80e-6c39f43d9778-sys-fs\") pod \"aws-ebs-csi-driver-node-wzxkh\" (UID: \"f1f1c9a5-595b-45a3-b80e-6c39f43d9778\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wzxkh" Apr 16 23:50:30.250031 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.249499 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-node-log\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.250031 ip-10-0-128-98 
kubenswrapper[2562]: I0416 23:50:30.249530 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqmck\" (UniqueName: \"kubernetes.io/projected/9df8e85f-0f0a-41bb-af0a-cae68056131b-kube-api-access-nqmck\") pod \"network-check-target-k9xr6\" (UID: \"9df8e85f-0f0a-41bb-af0a-cae68056131b\") " pod="openshift-network-diagnostics/network-check-target-k9xr6" Apr 16 23:50:30.250031 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.249554 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-host\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.250598 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.249577 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-host-run-k8s-cni-cncf-io\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.250598 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.249599 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ac3d5de2-ce49-4327-89f8-f4642c3bc2f3-serviceca\") pod \"node-ca-hb7s9\" (UID: \"ac3d5de2-ce49-4327-89f8-f4642c3bc2f3\") " pod="openshift-image-registry/node-ca-hb7s9" Apr 16 23:50:30.250598 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.249622 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e6f655b8-f886-471e-9fd2-4685907346a7-agent-certs\") pod \"konnectivity-agent-h2hdz\" (UID: \"e6f655b8-f886-471e-9fd2-4685907346a7\") " pod="kube-system/konnectivity-agent-h2hdz" Apr 16 23:50:30.273571 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.273538 2562 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 23:45:29 +0000 UTC" deadline="2027-10-31 07:29:04.245216724 +0000 UTC" Apr 16 23:50:30.273571 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.273564 2562 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13495h38m33.971655836s" Apr 16 23:50:30.338671 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.338634 2562 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 23:50:30.352344 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.349951 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-etc-sysctl-d\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.352344 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.349997 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f1f1c9a5-595b-45a3-b80e-6c39f43d9778-registration-dir\") pod \"aws-ebs-csi-driver-node-wzxkh\" (UID: \"f1f1c9a5-595b-45a3-b80e-6c39f43d9778\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wzxkh" Apr 16 23:50:30.352344 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.350030 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-etc-openvswitch\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.352344 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.350062 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-run-ovn\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.352344 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.350087 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.352344 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.350119 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/987f3171-af5b-41d5-91e1-d31f667a8755-os-release\") pod \"multus-additional-cni-plugins-kwxk9\" (UID: \"987f3171-af5b-41d5-91e1-d31f667a8755\") " pod="openshift-multus/multus-additional-cni-plugins-kwxk9" Apr 16 23:50:30.352344 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.350148 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7dzlh\" (UniqueName: \"kubernetes.io/projected/987f3171-af5b-41d5-91e1-d31f667a8755-kube-api-access-7dzlh\") pod \"multus-additional-cni-plugins-kwxk9\" (UID: \"987f3171-af5b-41d5-91e1-d31f667a8755\") " pod="openshift-multus/multus-additional-cni-plugins-kwxk9" Apr 16 23:50:30.352344 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.350179 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-host-var-lib-cni-multus\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.352344 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.350229 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e3f26c03-4fef-458e-8cf1-6d18222c3545-multus-daemon-config\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.352344 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.350252 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e55c5371-047f-4464-8977-501dfa689dd1-hosts-file\") pod \"node-resolver-mhvhh\" (UID: \"e55c5371-047f-4464-8977-501dfa689dd1\") " pod="openshift-dns/node-resolver-mhvhh" Apr 16 23:50:30.352344 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.350283 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/f1f1c9a5-595b-45a3-b80e-6c39f43d9778-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wzxkh\" (UID: \"f1f1c9a5-595b-45a3-b80e-6c39f43d9778\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wzxkh" Apr 16 23:50:30.352344 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.350314 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f1f1c9a5-595b-45a3-b80e-6c39f43d9778-etc-selinux\") pod \"aws-ebs-csi-driver-node-wzxkh\" (UID: \"f1f1c9a5-595b-45a3-b80e-6c39f43d9778\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wzxkh" Apr 16 23:50:30.352344 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.350342 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-host-cni-netd\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.352344 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.350373 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/987f3171-af5b-41d5-91e1-d31f667a8755-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kwxk9\" (UID: \"987f3171-af5b-41d5-91e1-d31f667a8755\") " pod="openshift-multus/multus-additional-cni-plugins-kwxk9" Apr 16 23:50:30.352344 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.350404 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-multus-conf-dir\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.352344 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.350429 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-etc-sysctl-conf\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.352344 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.350461 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fkvpw\" (UniqueName: \"kubernetes.io/projected/f1f1c9a5-595b-45a3-b80e-6c39f43d9778-kube-api-access-fkvpw\") pod \"aws-ebs-csi-driver-node-wzxkh\" (UID: \"f1f1c9a5-595b-45a3-b80e-6c39f43d9778\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wzxkh" Apr 16 23:50:30.353244 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.350495 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-var-lib-openvswitch\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.353244 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.350525 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/94b51208-78d1-423e-a0fd-88a34e175744-ovn-node-metrics-cert\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 
16 23:50:30.353244 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.350559 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f1f1c9a5-595b-45a3-b80e-6c39f43d9778-sys-fs\") pod \"aws-ebs-csi-driver-node-wzxkh\" (UID: \"f1f1c9a5-595b-45a3-b80e-6c39f43d9778\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wzxkh" Apr 16 23:50:30.353244 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.350590 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-node-log\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.353244 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.350617 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nqmck\" (UniqueName: \"kubernetes.io/projected/9df8e85f-0f0a-41bb-af0a-cae68056131b-kube-api-access-nqmck\") pod \"network-check-target-k9xr6\" (UID: \"9df8e85f-0f0a-41bb-af0a-cae68056131b\") " pod="openshift-network-diagnostics/network-check-target-k9xr6" Apr 16 23:50:30.353244 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.350648 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-host\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.353244 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.350676 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-host-run-k8s-cni-cncf-io\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.353244 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.350705 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ac3d5de2-ce49-4327-89f8-f4642c3bc2f3-serviceca\") pod \"node-ca-hb7s9\" (UID: \"ac3d5de2-ce49-4327-89f8-f4642c3bc2f3\") " pod="openshift-image-registry/node-ca-hb7s9" Apr 16 23:50:30.353244 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.350734 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e6f655b8-f886-471e-9fd2-4685907346a7-agent-certs\") pod \"konnectivity-agent-h2hdz\" (UID: \"e6f655b8-f886-471e-9fd2-4685907346a7\") " pod="kube-system/konnectivity-agent-h2hdz" Apr 16 23:50:30.353244 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.350759 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f1f1c9a5-595b-45a3-b80e-6c39f43d9778-device-dir\") pod \"aws-ebs-csi-driver-node-wzxkh\" (UID: \"f1f1c9a5-595b-45a3-b80e-6c39f43d9778\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wzxkh" Apr 16 23:50:30.353244 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.350790 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zpwvv\" (UniqueName: \"kubernetes.io/projected/94b51208-78d1-423e-a0fd-88a34e175744-kube-api-access-zpwvv\") pod \"ovnkube-node-frzl2\" (UID: 
\"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.353244 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.350820 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-var-lib-kubelet\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.353244 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.350852 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kz7gx\" (UniqueName: \"kubernetes.io/projected/e55c5371-047f-4464-8977-501dfa689dd1-kube-api-access-kz7gx\") pod \"node-resolver-mhvhh\" (UID: \"e55c5371-047f-4464-8977-501dfa689dd1\") " pod="openshift-dns/node-resolver-mhvhh" Apr 16 23:50:30.353244 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.350881 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-host-var-lib-kubelet\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.353244 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.350905 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-multus-cni-dir\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.353244 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.350936 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f1f1c9a5-595b-45a3-b80e-6c39f43d9778-socket-dir\") pod \"aws-ebs-csi-driver-node-wzxkh\" (UID: \"f1f1c9a5-595b-45a3-b80e-6c39f43d9778\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wzxkh" Apr 16 23:50:30.353244 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.350965 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-shqzd\" (UniqueName: \"kubernetes.io/projected/e3f26c03-4fef-458e-8cf1-6d18222c3545-kube-api-access-shqzd\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.354031 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.351001 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86301427-2e66-4c3a-ab22-55ef8ddc5580-metrics-certs\") pod \"network-metrics-daemon-f6rhw\" (UID: \"86301427-2e66-4c3a-ab22-55ef8ddc5580\") " pod="openshift-multus/network-metrics-daemon-f6rhw" Apr 16 23:50:30.354031 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.351039 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/94b51208-78d1-423e-a0fd-88a34e175744-ovnkube-script-lib\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.354031 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.351072 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/987f3171-af5b-41d5-91e1-d31f667a8755-cnibin\") pod \"multus-additional-cni-plugins-kwxk9\" (UID: \"987f3171-af5b-41d5-91e1-d31f667a8755\") " pod="openshift-multus/multus-additional-cni-plugins-kwxk9" Apr 16 23:50:30.354031 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.351104 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/987f3171-af5b-41d5-91e1-d31f667a8755-cni-binary-copy\") pod \"multus-additional-cni-plugins-kwxk9\" (UID: \"987f3171-af5b-41d5-91e1-d31f667a8755\") " pod="openshift-multus/multus-additional-cni-plugins-kwxk9" Apr 16 23:50:30.354031 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.351129 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-lib-modules\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.354031 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.351159 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-host-kubelet\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.354031 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.351205 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-run-systemd\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.354031 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.351238 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/94b51208-78d1-423e-a0fd-88a34e175744-ovnkube-config\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.354031 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.351269 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/94b51208-78d1-423e-a0fd-88a34e175744-env-overrides\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.354031 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.351297 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/987f3171-af5b-41d5-91e1-d31f667a8755-system-cni-dir\") pod \"multus-additional-cni-plugins-kwxk9\" (UID: \"987f3171-af5b-41d5-91e1-d31f667a8755\") " pod="openshift-multus/multus-additional-cni-plugins-kwxk9" Apr 16 23:50:30.354031 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.351330 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-etc-sysconfig\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.354031 ip-10-0-128-98 kubenswrapper[2562]: I0416 
23:50:30.351360 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-run\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.354031 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.351389 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-tmp\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.354031 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.351421 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/22ac9fa0-d332-4e5c-99d7-e1ac90e53356-iptables-alerter-script\") pod \"iptables-alerter-jsjxw\" (UID: \"22ac9fa0-d332-4e5c-99d7-e1ac90e53356\") " pod="openshift-network-operator/iptables-alerter-jsjxw" Apr 16 23:50:30.354031 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.351455 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mw77d\" (UniqueName: \"kubernetes.io/projected/22ac9fa0-d332-4e5c-99d7-e1ac90e53356-kube-api-access-mw77d\") pod \"iptables-alerter-jsjxw\" (UID: \"22ac9fa0-d332-4e5c-99d7-e1ac90e53356\") " pod="openshift-network-operator/iptables-alerter-jsjxw" Apr 16 23:50:30.354031 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.351485 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-host-run-netns\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.354031 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.351515 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-os-release\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.354844 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.351544 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e3f26c03-4fef-458e-8cf1-6d18222c3545-cni-binary-copy\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.354844 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.351575 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-hostroot\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.354844 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.351605 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-etc-systemd\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.354844 ip-10-0-128-98 kubenswrapper[2562]: I0416 
23:50:30.351632 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bthtp\" (UniqueName: \"kubernetes.io/projected/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-kube-api-access-bthtp\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.354844 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.351677 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-systemd-units\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.354844 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.351710 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/987f3171-af5b-41d5-91e1-d31f667a8755-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kwxk9\" (UID: \"987f3171-af5b-41d5-91e1-d31f667a8755\") " pod="openshift-multus/multus-additional-cni-plugins-kwxk9" Apr 16 23:50:30.354844 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.351745 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/987f3171-af5b-41d5-91e1-d31f667a8755-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kwxk9\" (UID: \"987f3171-af5b-41d5-91e1-d31f667a8755\") " pod="openshift-multus/multus-additional-cni-plugins-kwxk9" Apr 16 23:50:30.354844 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.351773 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-sys\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.354844 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.351801 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9hxxv\" (UniqueName: \"kubernetes.io/projected/ac3d5de2-ce49-4327-89f8-f4642c3bc2f3-kube-api-access-9hxxv\") pod \"node-ca-hb7s9\" (UID: \"ac3d5de2-ce49-4327-89f8-f4642c3bc2f3\") " pod="openshift-image-registry/node-ca-hb7s9" Apr 16 23:50:30.354844 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.351830 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/22ac9fa0-d332-4e5c-99d7-e1ac90e53356-host-slash\") pod \"iptables-alerter-jsjxw\" (UID: \"22ac9fa0-d332-4e5c-99d7-e1ac90e53356\") " pod="openshift-network-operator/iptables-alerter-jsjxw" Apr 16 23:50:30.354844 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.351859 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-log-socket\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.354844 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.351889 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-host-cni-bin\") pod \"ovnkube-node-frzl2\" (UID: 
\"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.354844 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.351993 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-multus-socket-dir-parent\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.354844 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.352003 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-host-cni-bin\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.354844 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.352034 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-host-run-netns\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.354844 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.352067 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-host-var-lib-cni-bin\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.354844 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.352097 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e6f655b8-f886-471e-9fd2-4685907346a7-konnectivity-ca\") pod \"konnectivity-agent-h2hdz\" (UID: \"e6f655b8-f886-471e-9fd2-4685907346a7\") " pod="kube-system/konnectivity-agent-h2hdz" Apr 16 23:50:30.355622 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.352128 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-system-cni-dir\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.355622 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.352143 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-etc-sysctl-d\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.355622 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.352167 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-cnibin\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.355622 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.352222 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-host-run-multus-certs\") pod \"multus-mxms2\" (UID: 
\"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.355622 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.352257 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-etc-kubernetes\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.355622 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.352290 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-etc-modprobe-d\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.355622 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.352382 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-etc-tuned\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.355622 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.352430 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e55c5371-047f-4464-8977-501dfa689dd1-tmp-dir\") pod \"node-resolver-mhvhh\" (UID: \"e55c5371-047f-4464-8977-501dfa689dd1\") " pod="openshift-dns/node-resolver-mhvhh" Apr 16 23:50:30.355622 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.352467 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ac3d5de2-ce49-4327-89f8-f4642c3bc2f3-host\") pod \"node-ca-hb7s9\" (UID: \"ac3d5de2-ce49-4327-89f8-f4642c3bc2f3\") " pod="openshift-image-registry/node-ca-hb7s9" Apr 16 23:50:30.355622 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.352499 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-etc-sysctl-conf\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.355622 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.352499 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.355622 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.352553 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f1f1c9a5-595b-45a3-b80e-6c39f43d9778-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wzxkh\" (UID: \"f1f1c9a5-595b-45a3-b80e-6c39f43d9778\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wzxkh" Apr 16 23:50:30.355622 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.352580 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/987f3171-af5b-41d5-91e1-d31f667a8755-os-release\") pod 
\"multus-additional-cni-plugins-kwxk9\" (UID: \"987f3171-af5b-41d5-91e1-d31f667a8755\") " pod="openshift-multus/multus-additional-cni-plugins-kwxk9" Apr 16 23:50:30.355622 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.352621 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-host-cni-netd\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.355622 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.352657 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-etc-modprobe-d\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.355622 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.352700 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f1f1c9a5-595b-45a3-b80e-6c39f43d9778-socket-dir\") pod \"aws-ebs-csi-driver-node-wzxkh\" (UID: \"f1f1c9a5-595b-45a3-b80e-6c39f43d9778\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wzxkh" Apr 16 23:50:30.355622 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.352718 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-host-var-lib-cni-multus\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.355622 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.352753 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-multus-conf-dir\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.356457 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.352502 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d9g9l\" (UniqueName: \"kubernetes.io/projected/86301427-2e66-4c3a-ab22-55ef8ddc5580-kube-api-access-d9g9l\") pod \"network-metrics-daemon-f6rhw\" (UID: \"86301427-2e66-4c3a-ab22-55ef8ddc5580\") " pod="openshift-multus/network-metrics-daemon-f6rhw" Apr 16 23:50:30.356457 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.352811 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-host-slash\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.356457 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.352899 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-lib-modules\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.356457 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.352966 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-host-kubelet\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.356457 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.353103 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e3f26c03-4fef-458e-8cf1-6d18222c3545-multus-daemon-config\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.356457 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.353210 2562 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 23:50:30.356457 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.353262 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-run-systemd\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.356457 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.353364 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-host-var-lib-kubelet\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.356457 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.353380 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/94b51208-78d1-423e-a0fd-88a34e175744-ovnkube-script-lib\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.356457 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.353410 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-var-lib-kubelet\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.356457 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.352868 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-run-openvswitch\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.356457 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.353444 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f1f1c9a5-595b-45a3-b80e-6c39f43d9778-device-dir\") pod \"aws-ebs-csi-driver-node-wzxkh\" (UID: \"f1f1c9a5-595b-45a3-b80e-6c39f43d9778\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wzxkh" Apr 16 23:50:30.356457 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.353451 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-host-slash\") pod \"ovnkube-node-frzl2\" 
(UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.356457 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.353462 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-host-run-ovn-kubernetes\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.356457 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.353471 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-run-openvswitch\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.356457 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.353520 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-host-run-ovn-kubernetes\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.356457 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.353541 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/22ac9fa0-d332-4e5c-99d7-e1ac90e53356-iptables-alerter-script\") pod \"iptables-alerter-jsjxw\" (UID: \"22ac9fa0-d332-4e5c-99d7-e1ac90e53356\") " pod="openshift-network-operator/iptables-alerter-jsjxw" Apr 16 23:50:30.356457 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.353559 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e55c5371-047f-4464-8977-501dfa689dd1-hosts-file\") pod \"node-resolver-mhvhh\" (UID: \"e55c5371-047f-4464-8977-501dfa689dd1\") " pod="openshift-dns/node-resolver-mhvhh" Apr 16 23:50:30.357228 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.353593 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-multus-socket-dir-parent\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.357228 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.353613 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-host-run-k8s-cni-cncf-io\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.357228 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.353649 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-host-run-netns\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.357228 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.353697 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-etc-kubernetes\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.357228 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.353813 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-etc-kubernetes\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.357228 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.353837 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/94b51208-78d1-423e-a0fd-88a34e175744-ovnkube-config\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.357228 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.353866 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-host-var-lib-cni-bin\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.357228 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.353873 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-run\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.357228 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.353950 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/987f3171-af5b-41d5-91e1-d31f667a8755-system-cni-dir\") pod \"multus-additional-cni-plugins-kwxk9\" (UID: \"987f3171-af5b-41d5-91e1-d31f667a8755\") " pod="openshift-multus/multus-additional-cni-plugins-kwxk9" Apr 16 23:50:30.357228 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.354061 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-etc-sysconfig\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.357228 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.354070 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ac3d5de2-ce49-4327-89f8-f4642c3bc2f3-serviceca\") pod \"node-ca-hb7s9\" (UID: \"ac3d5de2-ce49-4327-89f8-f4642c3bc2f3\") " pod="openshift-image-registry/node-ca-hb7s9" Apr 16 23:50:30.357228 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.354293 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/987f3171-af5b-41d5-91e1-d31f667a8755-cnibin\") pod \"multus-additional-cni-plugins-kwxk9\" (UID: \"987f3171-af5b-41d5-91e1-d31f667a8755\") " pod="openshift-multus/multus-additional-cni-plugins-kwxk9" Apr 16 23:50:30.357228 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.354467 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-node-log\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.357228 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.354770 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-host-run-netns\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.357228 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.354848 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-os-release\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.357228 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.354849 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/987f3171-af5b-41d5-91e1-d31f667a8755-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kwxk9\" (UID: \"987f3171-af5b-41d5-91e1-d31f667a8755\") " pod="openshift-multus/multus-additional-cni-plugins-kwxk9" Apr 16 23:50:30.357228 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.354960 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-sys\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.357228 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.352224 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f1f1c9a5-595b-45a3-b80e-6c39f43d9778-registration-dir\") pod \"aws-ebs-csi-driver-node-wzxkh\" (UID: \"f1f1c9a5-595b-45a3-b80e-6c39f43d9778\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wzxkh" Apr 16 23:50:30.357956 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.355142 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-etc-kubernetes\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.357956 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.355159 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-cnibin\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.357956 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.355211 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-etc-openvswitch\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.357956 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.355221 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/987f3171-af5b-41d5-91e1-d31f667a8755-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kwxk9\" (UID: \"987f3171-af5b-41d5-91e1-d31f667a8755\") " pod="openshift-multus/multus-additional-cni-plugins-kwxk9" Apr 16 23:50:30.357956 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.355235 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-run-ovn\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.357956 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:30.352554 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:50:30.357956 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.355274 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-systemd-units\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.357956 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.355283 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-hostroot\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.357956 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:30.355351 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86301427-2e66-4c3a-ab22-55ef8ddc5580-metrics-certs podName:86301427-2e66-4c3a-ab22-55ef8ddc5580 nodeName:}" failed. No retries permitted until 2026-04-16 23:50:30.855298655 +0000 UTC m=+3.064063320 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86301427-2e66-4c3a-ab22-55ef8ddc5580-metrics-certs") pod "network-metrics-daemon-f6rhw" (UID: "86301427-2e66-4c3a-ab22-55ef8ddc5580") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:50:30.357956 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.355409 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f1f1c9a5-595b-45a3-b80e-6c39f43d9778-sys-fs\") pod \"aws-ebs-csi-driver-node-wzxkh\" (UID: \"f1f1c9a5-595b-45a3-b80e-6c39f43d9778\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wzxkh" Apr 16 23:50:30.357956 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.355453 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-host-run-multus-certs\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.357956 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.355478 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e6f655b8-f886-471e-9fd2-4685907346a7-konnectivity-ca\") pod \"konnectivity-agent-h2hdz\" (UID: \"e6f655b8-f886-471e-9fd2-4685907346a7\") " pod="kube-system/konnectivity-agent-h2hdz" Apr 16 23:50:30.357956 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.355495 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f1f1c9a5-595b-45a3-b80e-6c39f43d9778-etc-selinux\") pod \"aws-ebs-csi-driver-node-wzxkh\" (UID: \"f1f1c9a5-595b-45a3-b80e-6c39f43d9778\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wzxkh" Apr 16 23:50:30.357956 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.355563 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-etc-systemd\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.357956 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.355628 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e55c5371-047f-4464-8977-501dfa689dd1-tmp-dir\") pod \"node-resolver-mhvhh\" (UID: \"e55c5371-047f-4464-8977-501dfa689dd1\") " pod="openshift-dns/node-resolver-mhvhh" Apr 16 23:50:30.357956 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.355870 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-var-lib-openvswitch\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.357956 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.355920 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e3f26c03-4fef-458e-8cf1-6d18222c3545-cni-binary-copy\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.357956 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.355930 2562 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/22ac9fa0-d332-4e5c-99d7-e1ac90e53356-host-slash\") pod \"iptables-alerter-jsjxw\" (UID: \"22ac9fa0-d332-4e5c-99d7-e1ac90e53356\") " pod="openshift-network-operator/iptables-alerter-jsjxw" Apr 16 23:50:30.358516 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.355986 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-host\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.358516 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.356024 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/987f3171-af5b-41d5-91e1-d31f667a8755-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kwxk9\" (UID: \"987f3171-af5b-41d5-91e1-d31f667a8755\") " pod="openshift-multus/multus-additional-cni-plugins-kwxk9" Apr 16 23:50:30.358516 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.356074 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/94b51208-78d1-423e-a0fd-88a34e175744-log-socket\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.358516 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.356092 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ac3d5de2-ce49-4327-89f8-f4642c3bc2f3-host\") pod \"node-ca-hb7s9\" (UID: \"ac3d5de2-ce49-4327-89f8-f4642c3bc2f3\") " pod="openshift-image-registry/node-ca-hb7s9" Apr 16 23:50:30.358516 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.356282 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/987f3171-af5b-41d5-91e1-d31f667a8755-cni-binary-copy\") pod \"multus-additional-cni-plugins-kwxk9\" (UID: \"987f3171-af5b-41d5-91e1-d31f667a8755\") " pod="openshift-multus/multus-additional-cni-plugins-kwxk9" Apr 16 23:50:30.358516 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.356461 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/94b51208-78d1-423e-a0fd-88a34e175744-env-overrides\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.358516 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.356518 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-system-cni-dir\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.358516 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.357069 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e3f26c03-4fef-458e-8cf1-6d18222c3545-multus-cni-dir\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.358516 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.358005 2562 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/94b51208-78d1-423e-a0fd-88a34e175744-ovn-node-metrics-cert\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.358867 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.358674 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-etc-tuned\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.358867 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.358832 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e6f655b8-f886-471e-9fd2-4685907346a7-agent-certs\") pod \"konnectivity-agent-h2hdz\" (UID: \"e6f655b8-f886-471e-9fd2-4685907346a7\") " pod="kube-system/konnectivity-agent-h2hdz" Apr 16 23:50:30.358867 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.358863 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-tmp\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.359940 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:30.359919 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 23:50:30.360054 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:30.359944 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 23:50:30.360054 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:30.359959 2562 projected.go:194] Error preparing data for projected volume kube-api-access-nqmck for pod openshift-network-diagnostics/network-check-target-k9xr6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:50:30.360185 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:30.360148 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9df8e85f-0f0a-41bb-af0a-cae68056131b-kube-api-access-nqmck podName:9df8e85f-0f0a-41bb-af0a-cae68056131b nodeName:}" failed. No retries permitted until 2026-04-16 23:50:30.860118877 +0000 UTC m=+3.068883548 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-nqmck" (UniqueName: "kubernetes.io/projected/9df8e85f-0f0a-41bb-af0a-cae68056131b-kube-api-access-nqmck") pod "network-check-target-k9xr6" (UID: "9df8e85f-0f0a-41bb-af0a-cae68056131b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:50:30.362317 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.362267 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpwvv\" (UniqueName: \"kubernetes.io/projected/94b51208-78d1-423e-a0fd-88a34e175744-kube-api-access-zpwvv\") pod \"ovnkube-node-frzl2\" (UID: \"94b51208-78d1-423e-a0fd-88a34e175744\") " pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.362717 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.362693 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9g9l\" (UniqueName: \"kubernetes.io/projected/86301427-2e66-4c3a-ab22-55ef8ddc5580-kube-api-access-d9g9l\") pod \"network-metrics-daemon-f6rhw\" (UID: \"86301427-2e66-4c3a-ab22-55ef8ddc5580\") " pod="openshift-multus/network-metrics-daemon-f6rhw" Apr 16 23:50:30.363275 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.363253 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-shqzd\" (UniqueName: \"kubernetes.io/projected/e3f26c03-4fef-458e-8cf1-6d18222c3545-kube-api-access-shqzd\") pod \"multus-mxms2\" (UID: \"e3f26c03-4fef-458e-8cf1-6d18222c3545\") " pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.364338 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.364295 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw77d\" (UniqueName: \"kubernetes.io/projected/22ac9fa0-d332-4e5c-99d7-e1ac90e53356-kube-api-access-mw77d\") pod \"iptables-alerter-jsjxw\" (UID: \"22ac9fa0-d332-4e5c-99d7-e1ac90e53356\") " pod="openshift-network-operator/iptables-alerter-jsjxw" Apr 16 23:50:30.364338 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.364300 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkvpw\" (UniqueName: \"kubernetes.io/projected/f1f1c9a5-595b-45a3-b80e-6c39f43d9778-kube-api-access-fkvpw\") pod \"aws-ebs-csi-driver-node-wzxkh\" (UID: \"f1f1c9a5-595b-45a3-b80e-6c39f43d9778\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wzxkh" Apr 16 23:50:30.364624 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.364592 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz7gx\" (UniqueName: \"kubernetes.io/projected/e55c5371-047f-4464-8977-501dfa689dd1-kube-api-access-kz7gx\") pod \"node-resolver-mhvhh\" (UID: \"e55c5371-047f-4464-8977-501dfa689dd1\") " pod="openshift-dns/node-resolver-mhvhh" Apr 16 23:50:30.364870 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.364850 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bthtp\" (UniqueName: \"kubernetes.io/projected/75cbba73-e06b-42f6-ad6c-4d0216b0faf6-kube-api-access-bthtp\") pod \"tuned-rdxqw\" (UID: \"75cbba73-e06b-42f6-ad6c-4d0216b0faf6\") " pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.365679 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.365657 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dzlh\" (UniqueName: 
\"kubernetes.io/projected/987f3171-af5b-41d5-91e1-d31f667a8755-kube-api-access-7dzlh\") pod \"multus-additional-cni-plugins-kwxk9\" (UID: \"987f3171-af5b-41d5-91e1-d31f667a8755\") " pod="openshift-multus/multus-additional-cni-plugins-kwxk9" Apr 16 23:50:30.366672 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.366651 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hxxv\" (UniqueName: \"kubernetes.io/projected/ac3d5de2-ce49-4327-89f8-f4642c3bc2f3-kube-api-access-9hxxv\") pod \"node-ca-hb7s9\" (UID: \"ac3d5de2-ce49-4327-89f8-f4642c3bc2f3\") " pod="openshift-image-registry/node-ca-hb7s9" Apr 16 23:50:30.540760 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.540674 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wzxkh" Apr 16 23:50:30.548407 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.548389 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kwxk9" Apr 16 23:50:30.557068 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.557050 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mxms2" Apr 16 23:50:30.562575 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.562556 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-h2hdz" Apr 16 23:50:30.570099 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.570081 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" Apr 16 23:50:30.577677 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.577656 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-mhvhh" Apr 16 23:50:30.584180 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.584159 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-hb7s9" Apr 16 23:50:30.592652 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.592636 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-jsjxw" Apr 16 23:50:30.597250 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.597176 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:30.825813 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:30.825788 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75cbba73_e06b_42f6_ad6c_4d0216b0faf6.slice/crio-c5d1374b1064aedb6efd6c1d3b23f80cb5f66174d88bac6b6dd385c1dad84d3c WatchSource:0}: Error finding container c5d1374b1064aedb6efd6c1d3b23f80cb5f66174d88bac6b6dd385c1dad84d3c: Status 404 returned error can't find the container with id c5d1374b1064aedb6efd6c1d3b23f80cb5f66174d88bac6b6dd385c1dad84d3c Apr 16 23:50:30.830870 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:30.830841 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod987f3171_af5b_41d5_91e1_d31f667a8755.slice/crio-d7aefc9adfce4a8c73286a47a163bedbe1576aa329d5066afffd2359cd8b13e0 WatchSource:0}: Error finding container d7aefc9adfce4a8c73286a47a163bedbe1576aa329d5066afffd2359cd8b13e0: Status 404 returned error can't find the container with id d7aefc9adfce4a8c73286a47a163bedbe1576aa329d5066afffd2359cd8b13e0 Apr 16 23:50:30.833214 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:30.833179 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode55c5371_047f_4464_8977_501dfa689dd1.slice/crio-2ab5a5a1fd222074b82e9b940e86c9cc85e5aceccd02d5e6e944257b51e26e2c WatchSource:0}: Error finding container 2ab5a5a1fd222074b82e9b940e86c9cc85e5aceccd02d5e6e944257b51e26e2c: Status 404 returned error can't find the container with id 2ab5a5a1fd222074b82e9b940e86c9cc85e5aceccd02d5e6e944257b51e26e2c Apr 16 23:50:30.833894 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:30.833872 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1f1c9a5_595b_45a3_b80e_6c39f43d9778.slice/crio-a50a0c68984fb6d7e32a8c3a7eb1113a1c96e04d38b4b38c15f79a3e52f7b587 WatchSource:0}: Error finding container a50a0c68984fb6d7e32a8c3a7eb1113a1c96e04d38b4b38c15f79a3e52f7b587: Status 404 returned error can't find the container with id a50a0c68984fb6d7e32a8c3a7eb1113a1c96e04d38b4b38c15f79a3e52f7b587 Apr 16 23:50:30.834802 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:30.834776 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6f655b8_f886_471e_9fd2_4685907346a7.slice/crio-fa2fd699df8b3520c6dcd2e8483c0ea1ec13e378202c371f4b9076f007faff6e WatchSource:0}: Error finding container fa2fd699df8b3520c6dcd2e8483c0ea1ec13e378202c371f4b9076f007faff6e: Status 404 returned error can't find the container with id fa2fd699df8b3520c6dcd2e8483c0ea1ec13e378202c371f4b9076f007faff6e Apr 16 23:50:30.836682 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:30.836652 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac3d5de2_ce49_4327_89f8_f4642c3bc2f3.slice/crio-839dfeca753daa3ae78f1055608e77eba208b7bd6ff6f16d977af60c7723ef2d WatchSource:0}: Error finding container 839dfeca753daa3ae78f1055608e77eba208b7bd6ff6f16d977af60c7723ef2d: Status 404 returned error can't find the container with id 839dfeca753daa3ae78f1055608e77eba208b7bd6ff6f16d977af60c7723ef2d Apr 16 23:50:30.837363 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:30.837341 2562 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3f26c03_4fef_458e_8cf1_6d18222c3545.slice/crio-bb921ea294ab60f05afa571c29d48637fef7d8a4c0b77a57a7c73607e4aba695 WatchSource:0}: Error finding container bb921ea294ab60f05afa571c29d48637fef7d8a4c0b77a57a7c73607e4aba695: Status 404 returned error can't find the container with id bb921ea294ab60f05afa571c29d48637fef7d8a4c0b77a57a7c73607e4aba695 Apr 16 23:50:30.839102 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:30.839062 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94b51208_78d1_423e_a0fd_88a34e175744.slice/crio-ec94de1c4013a487b21d6efeec66ef36d8b0da68fced53d118831c211995d73b WatchSource:0}: Error finding container ec94de1c4013a487b21d6efeec66ef36d8b0da68fced53d118831c211995d73b: Status 404 returned error can't find the container with id ec94de1c4013a487b21d6efeec66ef36d8b0da68fced53d118831c211995d73b Apr 16 23:50:30.839698 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:50:30.839677 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22ac9fa0_d332_4e5c_99d7_e1ac90e53356.slice/crio-141d54e7c9504b0b618128a316579810c8d3a6c94706a786f5fa81c266df34a2 WatchSource:0}: Error finding container 141d54e7c9504b0b618128a316579810c8d3a6c94706a786f5fa81c266df34a2: Status 404 returned error can't find the container with id 141d54e7c9504b0b618128a316579810c8d3a6c94706a786f5fa81c266df34a2 Apr 16 23:50:30.858221 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.858067 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86301427-2e66-4c3a-ab22-55ef8ddc5580-metrics-certs\") pod \"network-metrics-daemon-f6rhw\" (UID: \"86301427-2e66-4c3a-ab22-55ef8ddc5580\") " pod="openshift-multus/network-metrics-daemon-f6rhw" Apr 16 23:50:30.858304 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:30.858208 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:50:30.858304 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:30.858274 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86301427-2e66-4c3a-ab22-55ef8ddc5580-metrics-certs podName:86301427-2e66-4c3a-ab22-55ef8ddc5580 nodeName:}" failed. No retries permitted until 2026-04-16 23:50:31.858261076 +0000 UTC m=+4.067025724 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86301427-2e66-4c3a-ab22-55ef8ddc5580-metrics-certs") pod "network-metrics-daemon-f6rhw" (UID: "86301427-2e66-4c3a-ab22-55ef8ddc5580") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:50:30.959142 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:30.959118 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nqmck\" (UniqueName: \"kubernetes.io/projected/9df8e85f-0f0a-41bb-af0a-cae68056131b-kube-api-access-nqmck\") pod \"network-check-target-k9xr6\" (UID: \"9df8e85f-0f0a-41bb-af0a-cae68056131b\") " pod="openshift-network-diagnostics/network-check-target-k9xr6" Apr 16 23:50:30.959262 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:30.959249 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 23:50:30.959323 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:30.959265 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 23:50:30.959323 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:30.959273 2562 projected.go:194] Error preparing data for projected volume kube-api-access-nqmck for pod openshift-network-diagnostics/network-check-target-k9xr6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:50:30.959323 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:30.959316 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9df8e85f-0f0a-41bb-af0a-cae68056131b-kube-api-access-nqmck podName:9df8e85f-0f0a-41bb-af0a-cae68056131b nodeName:}" failed. No retries permitted until 2026-04-16 23:50:31.959298692 +0000 UTC m=+4.168063340 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-nqmck" (UniqueName: "kubernetes.io/projected/9df8e85f-0f0a-41bb-af0a-cae68056131b-kube-api-access-nqmck") pod "network-check-target-k9xr6" (UID: "9df8e85f-0f0a-41bb-af0a-cae68056131b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:50:31.276897 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:31.276794 2562 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 23:45:29 +0000 UTC" deadline="2027-10-31 12:16:23.789845089 +0000 UTC" Apr 16 23:50:31.276897 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:31.276832 2562 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13500h25m52.513016762s" Apr 16 23:50:31.350509 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:31.348771 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6rhw" Apr 16 23:50:31.350509 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:31.348921 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f6rhw" podUID="86301427-2e66-4c3a-ab22-55ef8ddc5580" Apr 16 23:50:31.359052 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:31.359025 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mhvhh" event={"ID":"e55c5371-047f-4464-8977-501dfa689dd1","Type":"ContainerStarted","Data":"2ab5a5a1fd222074b82e9b940e86c9cc85e5aceccd02d5e6e944257b51e26e2c"} Apr 16 23:50:31.363924 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:31.363896 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-98.ec2.internal" event={"ID":"1143ec080bc3618a70cd0c3fccc6942e","Type":"ContainerStarted","Data":"07ffd314c84279a52384e0164926f63885adc0fdad26384dd151e1cf8918eea7"} Apr 16 23:50:31.375608 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:31.375554 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mxms2" event={"ID":"e3f26c03-4fef-458e-8cf1-6d18222c3545","Type":"ContainerStarted","Data":"bb921ea294ab60f05afa571c29d48637fef7d8a4c0b77a57a7c73607e4aba695"} Apr 16 23:50:31.395042 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:31.394949 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hb7s9" event={"ID":"ac3d5de2-ce49-4327-89f8-f4642c3bc2f3","Type":"ContainerStarted","Data":"839dfeca753daa3ae78f1055608e77eba208b7bd6ff6f16d977af60c7723ef2d"} Apr 16 23:50:31.405588 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:31.405561 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kwxk9" event={"ID":"987f3171-af5b-41d5-91e1-d31f667a8755","Type":"ContainerStarted","Data":"d7aefc9adfce4a8c73286a47a163bedbe1576aa329d5066afffd2359cd8b13e0"} Apr 16 23:50:31.412183 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:31.412159 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" event={"ID":"75cbba73-e06b-42f6-ad6c-4d0216b0faf6","Type":"ContainerStarted","Data":"c5d1374b1064aedb6efd6c1d3b23f80cb5f66174d88bac6b6dd385c1dad84d3c"} Apr 16 23:50:31.416989 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:31.416965 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-jsjxw" event={"ID":"22ac9fa0-d332-4e5c-99d7-e1ac90e53356","Type":"ContainerStarted","Data":"141d54e7c9504b0b618128a316579810c8d3a6c94706a786f5fa81c266df34a2"} Apr 16 23:50:31.425238 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:31.425215 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" event={"ID":"94b51208-78d1-423e-a0fd-88a34e175744","Type":"ContainerStarted","Data":"ec94de1c4013a487b21d6efeec66ef36d8b0da68fced53d118831c211995d73b"} Apr 16 23:50:31.426935 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:31.426908 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-h2hdz" event={"ID":"e6f655b8-f886-471e-9fd2-4685907346a7","Type":"ContainerStarted","Data":"fa2fd699df8b3520c6dcd2e8483c0ea1ec13e378202c371f4b9076f007faff6e"} Apr 16 23:50:31.439908 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:31.439885 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wzxkh" event={"ID":"f1f1c9a5-595b-45a3-b80e-6c39f43d9778","Type":"ContainerStarted","Data":"a50a0c68984fb6d7e32a8c3a7eb1113a1c96e04d38b4b38c15f79a3e52f7b587"} Apr 16 23:50:31.866551 ip-10-0-128-98 kubenswrapper[2562]: 
I0416 23:50:31.866515 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86301427-2e66-4c3a-ab22-55ef8ddc5580-metrics-certs\") pod \"network-metrics-daemon-f6rhw\" (UID: \"86301427-2e66-4c3a-ab22-55ef8ddc5580\") " pod="openshift-multus/network-metrics-daemon-f6rhw" Apr 16 23:50:31.866680 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:31.866668 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:50:31.866747 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:31.866730 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86301427-2e66-4c3a-ab22-55ef8ddc5580-metrics-certs podName:86301427-2e66-4c3a-ab22-55ef8ddc5580 nodeName:}" failed. No retries permitted until 2026-04-16 23:50:33.866710775 +0000 UTC m=+6.075475435 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86301427-2e66-4c3a-ab22-55ef8ddc5580-metrics-certs") pod "network-metrics-daemon-f6rhw" (UID: "86301427-2e66-4c3a-ab22-55ef8ddc5580") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:50:31.967668 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:31.967633 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nqmck\" (UniqueName: \"kubernetes.io/projected/9df8e85f-0f0a-41bb-af0a-cae68056131b-kube-api-access-nqmck\") pod \"network-check-target-k9xr6\" (UID: \"9df8e85f-0f0a-41bb-af0a-cae68056131b\") " pod="openshift-network-diagnostics/network-check-target-k9xr6" Apr 16 23:50:31.967882 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:31.967806 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 23:50:31.967882 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:31.967832 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 23:50:31.967882 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:31.967846 2562 projected.go:194] Error preparing data for projected volume kube-api-access-nqmck for pod openshift-network-diagnostics/network-check-target-k9xr6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:50:31.968036 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:31.967909 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9df8e85f-0f0a-41bb-af0a-cae68056131b-kube-api-access-nqmck podName:9df8e85f-0f0a-41bb-af0a-cae68056131b nodeName:}" failed. No retries permitted until 2026-04-16 23:50:33.967889885 +0000 UTC m=+6.176654550 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-nqmck" (UniqueName: "kubernetes.io/projected/9df8e85f-0f0a-41bb-af0a-cae68056131b-kube-api-access-nqmck") pod "network-check-target-k9xr6" (UID: "9df8e85f-0f0a-41bb-af0a-cae68056131b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:50:32.350869 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:32.350840 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9xr6" Apr 16 23:50:32.351340 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:32.350962 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k9xr6" podUID="9df8e85f-0f0a-41bb-af0a-cae68056131b" Apr 16 23:50:32.468894 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:32.467936 2562 generic.go:358] "Generic (PLEG): container finished" podID="04daa51138142523c442144d3720f116" containerID="6072968efa6d723271dc4b7cc673ce516a7ed692bfdfb02fd4c99de1d6dd87fb" exitCode=0 Apr 16 23:50:32.468894 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:32.468851 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-98.ec2.internal" event={"ID":"04daa51138142523c442144d3720f116","Type":"ContainerDied","Data":"6072968efa6d723271dc4b7cc673ce516a7ed692bfdfb02fd4c99de1d6dd87fb"} Apr 16 23:50:32.481614 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:32.480489 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-98.ec2.internal" podStartSLOduration=3.48047374 podStartE2EDuration="3.48047374s" podCreationTimestamp="2026-04-16 23:50:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:50:31.375153286 +0000 UTC m=+3.583917957" watchObservedRunningTime="2026-04-16 23:50:32.48047374 +0000 UTC m=+4.689238410" Apr 16 23:50:33.350225 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:33.349270 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6rhw" Apr 16 23:50:33.350225 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:33.349411 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f6rhw" podUID="86301427-2e66-4c3a-ab22-55ef8ddc5580" Apr 16 23:50:33.475128 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:33.475080 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-98.ec2.internal" event={"ID":"04daa51138142523c442144d3720f116","Type":"ContainerStarted","Data":"7773c0fbe546361f4286598a76bd583a3a8dd431c39e0f7fb6f002bd8e31c3ee"} Apr 16 23:50:33.883636 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:33.883599 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86301427-2e66-4c3a-ab22-55ef8ddc5580-metrics-certs\") pod \"network-metrics-daemon-f6rhw\" (UID: \"86301427-2e66-4c3a-ab22-55ef8ddc5580\") " pod="openshift-multus/network-metrics-daemon-f6rhw" Apr 16 23:50:33.883850 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:33.883762 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:50:33.883850 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:33.883832 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86301427-2e66-4c3a-ab22-55ef8ddc5580-metrics-certs podName:86301427-2e66-4c3a-ab22-55ef8ddc5580 nodeName:}" failed. No retries permitted until 2026-04-16 23:50:37.883813636 +0000 UTC m=+10.092578285 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86301427-2e66-4c3a-ab22-55ef8ddc5580-metrics-certs") pod "network-metrics-daemon-f6rhw" (UID: "86301427-2e66-4c3a-ab22-55ef8ddc5580") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:50:33.983980 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:33.983935 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nqmck\" (UniqueName: \"kubernetes.io/projected/9df8e85f-0f0a-41bb-af0a-cae68056131b-kube-api-access-nqmck\") pod \"network-check-target-k9xr6\" (UID: \"9df8e85f-0f0a-41bb-af0a-cae68056131b\") " pod="openshift-network-diagnostics/network-check-target-k9xr6" Apr 16 23:50:33.984765 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:33.984164 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 23:50:33.984765 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:33.984290 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 23:50:33.984765 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:33.984307 2562 projected.go:194] Error preparing data for projected volume kube-api-access-nqmck for pod openshift-network-diagnostics/network-check-target-k9xr6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:50:33.984765 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:33.984375 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9df8e85f-0f0a-41bb-af0a-cae68056131b-kube-api-access-nqmck podName:9df8e85f-0f0a-41bb-af0a-cae68056131b nodeName:}" failed. No retries permitted until 2026-04-16 23:50:37.98435404 +0000 UTC m=+10.193118690 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-nqmck" (UniqueName: "kubernetes.io/projected/9df8e85f-0f0a-41bb-af0a-cae68056131b-kube-api-access-nqmck") pod "network-check-target-k9xr6" (UID: "9df8e85f-0f0a-41bb-af0a-cae68056131b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:50:34.349411 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:34.349291 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9xr6" Apr 16 23:50:34.349562 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:34.349423 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k9xr6" podUID="9df8e85f-0f0a-41bb-af0a-cae68056131b" Apr 16 23:50:35.349279 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:35.349242 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6rhw" Apr 16 23:50:35.349748 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:35.349381 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6rhw" podUID="86301427-2e66-4c3a-ab22-55ef8ddc5580" Apr 16 23:50:36.349279 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:36.349240 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9xr6" Apr 16 23:50:36.349888 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:36.349366 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k9xr6" podUID="9df8e85f-0f0a-41bb-af0a-cae68056131b" Apr 16 23:50:37.349580 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:37.349073 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6rhw" Apr 16 23:50:37.349580 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:37.349231 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f6rhw" podUID="86301427-2e66-4c3a-ab22-55ef8ddc5580" Apr 16 23:50:37.913355 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:37.913322 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86301427-2e66-4c3a-ab22-55ef8ddc5580-metrics-certs\") pod \"network-metrics-daemon-f6rhw\" (UID: \"86301427-2e66-4c3a-ab22-55ef8ddc5580\") " pod="openshift-multus/network-metrics-daemon-f6rhw" Apr 16 23:50:37.913544 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:37.913473 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:50:37.913544 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:37.913538 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86301427-2e66-4c3a-ab22-55ef8ddc5580-metrics-certs podName:86301427-2e66-4c3a-ab22-55ef8ddc5580 nodeName:}" failed. No retries permitted until 2026-04-16 23:50:45.913521309 +0000 UTC m=+18.122285968 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86301427-2e66-4c3a-ab22-55ef8ddc5580-metrics-certs") pod "network-metrics-daemon-f6rhw" (UID: "86301427-2e66-4c3a-ab22-55ef8ddc5580") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:50:38.014515 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:38.014483 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nqmck\" (UniqueName: \"kubernetes.io/projected/9df8e85f-0f0a-41bb-af0a-cae68056131b-kube-api-access-nqmck\") pod \"network-check-target-k9xr6\" (UID: \"9df8e85f-0f0a-41bb-af0a-cae68056131b\") " pod="openshift-network-diagnostics/network-check-target-k9xr6" Apr 16 23:50:38.014683 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:38.014664 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 23:50:38.014758 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:38.014689 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 23:50:38.014758 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:38.014706 2562 projected.go:194] Error preparing data for projected volume kube-api-access-nqmck for pod openshift-network-diagnostics/network-check-target-k9xr6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:50:38.014856 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:38.014768 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9df8e85f-0f0a-41bb-af0a-cae68056131b-kube-api-access-nqmck podName:9df8e85f-0f0a-41bb-af0a-cae68056131b nodeName:}" failed. No retries permitted until 2026-04-16 23:50:46.014749017 +0000 UTC m=+18.223513682 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-nqmck" (UniqueName: "kubernetes.io/projected/9df8e85f-0f0a-41bb-af0a-cae68056131b-kube-api-access-nqmck") pod "network-check-target-k9xr6" (UID: "9df8e85f-0f0a-41bb-af0a-cae68056131b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:50:38.352024 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:38.351952 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9xr6" Apr 16 23:50:38.352484 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:38.352055 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k9xr6" podUID="9df8e85f-0f0a-41bb-af0a-cae68056131b" Apr 16 23:50:39.349111 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:39.349064 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6rhw" Apr 16 23:50:39.349293 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:39.349224 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6rhw" podUID="86301427-2e66-4c3a-ab22-55ef8ddc5580" Apr 16 23:50:40.348682 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:40.348645 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9xr6" Apr 16 23:50:40.349108 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:40.348755 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k9xr6" podUID="9df8e85f-0f0a-41bb-af0a-cae68056131b" Apr 16 23:50:41.348892 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:41.348861 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6rhw" Apr 16 23:50:41.349354 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:41.348979 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6rhw" podUID="86301427-2e66-4c3a-ab22-55ef8ddc5580" Apr 16 23:50:42.348551 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:42.348517 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9xr6" Apr 16 23:50:42.348721 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:42.348636 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k9xr6" podUID="9df8e85f-0f0a-41bb-af0a-cae68056131b" Apr 16 23:50:43.348901 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:43.348874 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6rhw" Apr 16 23:50:43.349286 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:43.348972 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6rhw" podUID="86301427-2e66-4c3a-ab22-55ef8ddc5580" Apr 16 23:50:44.351639 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:44.351610 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9xr6" Apr 16 23:50:44.352072 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:44.351733 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k9xr6" podUID="9df8e85f-0f0a-41bb-af0a-cae68056131b" Apr 16 23:50:45.348646 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:45.348609 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6rhw" Apr 16 23:50:45.348876 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:45.348735 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f6rhw" podUID="86301427-2e66-4c3a-ab22-55ef8ddc5580" Apr 16 23:50:45.974403 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:45.974340 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86301427-2e66-4c3a-ab22-55ef8ddc5580-metrics-certs\") pod \"network-metrics-daemon-f6rhw\" (UID: \"86301427-2e66-4c3a-ab22-55ef8ddc5580\") " pod="openshift-multus/network-metrics-daemon-f6rhw" Apr 16 23:50:45.974794 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:45.974500 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:50:45.974794 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:45.974570 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86301427-2e66-4c3a-ab22-55ef8ddc5580-metrics-certs podName:86301427-2e66-4c3a-ab22-55ef8ddc5580 nodeName:}" failed. No retries permitted until 2026-04-16 23:51:01.974552005 +0000 UTC m=+34.183316658 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86301427-2e66-4c3a-ab22-55ef8ddc5580-metrics-certs") pod "network-metrics-daemon-f6rhw" (UID: "86301427-2e66-4c3a-ab22-55ef8ddc5580") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:50:46.075478 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:46.075442 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nqmck\" (UniqueName: \"kubernetes.io/projected/9df8e85f-0f0a-41bb-af0a-cae68056131b-kube-api-access-nqmck\") pod \"network-check-target-k9xr6\" (UID: \"9df8e85f-0f0a-41bb-af0a-cae68056131b\") " pod="openshift-network-diagnostics/network-check-target-k9xr6" Apr 16 23:50:46.075648 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:46.075588 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 23:50:46.075648 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:46.075609 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 23:50:46.075648 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:46.075621 2562 projected.go:194] Error preparing data for projected volume kube-api-access-nqmck for pod openshift-network-diagnostics/network-check-target-k9xr6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:50:46.075804 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:46.075683 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9df8e85f-0f0a-41bb-af0a-cae68056131b-kube-api-access-nqmck podName:9df8e85f-0f0a-41bb-af0a-cae68056131b nodeName:}" failed. No retries permitted until 2026-04-16 23:51:02.075665346 +0000 UTC m=+34.284430014 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-nqmck" (UniqueName: "kubernetes.io/projected/9df8e85f-0f0a-41bb-af0a-cae68056131b-kube-api-access-nqmck") pod "network-check-target-k9xr6" (UID: "9df8e85f-0f0a-41bb-af0a-cae68056131b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:50:46.349308 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:46.349234 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9xr6" Apr 16 23:50:46.349450 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:46.349356 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k9xr6" podUID="9df8e85f-0f0a-41bb-af0a-cae68056131b" Apr 16 23:50:47.348859 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:47.348825 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6rhw" Apr 16 23:50:47.349308 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:47.348951 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6rhw" podUID="86301427-2e66-4c3a-ab22-55ef8ddc5580" Apr 16 23:50:48.352097 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:48.351795 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9xr6" Apr 16 23:50:48.352817 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:48.352182 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-k9xr6" podUID="9df8e85f-0f0a-41bb-af0a-cae68056131b" Apr 16 23:50:48.502116 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:48.502035 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" event={"ID":"94b51208-78d1-423e-a0fd-88a34e175744","Type":"ContainerStarted","Data":"21e3b7a6ec36a056def1d808ff88473836ffb484b1685b37975d9a99450d0f45"} Apr 16 23:50:48.502116 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:48.502077 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" event={"ID":"94b51208-78d1-423e-a0fd-88a34e175744","Type":"ContainerStarted","Data":"4ef01b86900d76d095bc570f52890c7a061e4e3a7fbd037f4afe1ebdcf10bf9d"} Apr 16 23:50:48.502116 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:48.502092 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" event={"ID":"94b51208-78d1-423e-a0fd-88a34e175744","Type":"ContainerStarted","Data":"2d04b7bb5bc3eae97b8251e1e35fc96e265a816e147c086b1bc6b6fdde60b39d"} Apr 16 23:50:48.502116 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:48.502104 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" event={"ID":"94b51208-78d1-423e-a0fd-88a34e175744","Type":"ContainerStarted","Data":"302f9082a6345a644e8c2fe71dc7b074720b3a8b5112bab6d45928a8be6dae7a"} Apr 16 23:50:48.502116 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:48.502116 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" event={"ID":"94b51208-78d1-423e-a0fd-88a34e175744","Type":"ContainerStarted","Data":"6f9aed4820fd55d7c1c7ef48f2ccaa97e99d6dad9e1ef2384ea7f7c4326ad904"} Apr 16 23:50:48.502433 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:48.502128 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" event={"ID":"94b51208-78d1-423e-a0fd-88a34e175744","Type":"ContainerStarted","Data":"402743ce645f65bd36d6d85fa64fef860bca2c1c2a0a9e696a72a0b5661e627b"} Apr 16 23:50:48.503298 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:48.503257 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-h2hdz" event={"ID":"e6f655b8-f886-471e-9fd2-4685907346a7","Type":"ContainerStarted","Data":"7547e9c91ac27ff6e8e1b62a880aa8343ee260ff4ce3e6be7cd2f9b704af62f2"} Apr 16 23:50:48.504538 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:48.504513 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wzxkh" event={"ID":"f1f1c9a5-595b-45a3-b80e-6c39f43d9778","Type":"ContainerStarted","Data":"4f7c5b2cec7b0ffbe907c00f944b340500809c30aebcbe1d54aab91fe1ebf4be"} Apr 16 23:50:48.505680 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:48.505650 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mhvhh" event={"ID":"e55c5371-047f-4464-8977-501dfa689dd1","Type":"ContainerStarted","Data":"d31b77faa56533eaa300134aa93e22838dc6669817ea9c86a04320a2794bb19e"} Apr 16 23:50:48.506957 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:48.506920 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mxms2" event={"ID":"e3f26c03-4fef-458e-8cf1-6d18222c3545","Type":"ContainerStarted","Data":"74e78abc5028ed187c19bbff9b5a187b3fd4e45506f0132c1c32f91c96bfdc6c"} Apr 16 23:50:48.508408 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:48.508377 
2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hb7s9" event={"ID":"ac3d5de2-ce49-4327-89f8-f4642c3bc2f3","Type":"ContainerStarted","Data":"823643f5a1f007679ec7a5b2fba01a2b4bc64fe1bddc634ddd993daf9a2f097d"} Apr 16 23:50:48.509797 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:48.509776 2562 generic.go:358] "Generic (PLEG): container finished" podID="987f3171-af5b-41d5-91e1-d31f667a8755" containerID="bb67cb8f941228d5a2cbe8388ddc9b32790c5c90cfd5d47346d9bc5d4837ead3" exitCode=0 Apr 16 23:50:48.509882 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:48.509829 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kwxk9" event={"ID":"987f3171-af5b-41d5-91e1-d31f667a8755","Type":"ContainerDied","Data":"bb67cb8f941228d5a2cbe8388ddc9b32790c5c90cfd5d47346d9bc5d4837ead3"} Apr 16 23:50:48.511256 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:48.511155 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" event={"ID":"75cbba73-e06b-42f6-ad6c-4d0216b0faf6","Type":"ContainerStarted","Data":"88b496800c798030e765ec64fc407ff5b8d6884db83ceadc0df853ec509837ea"} Apr 16 23:50:48.516167 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:48.516133 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-98.ec2.internal" podStartSLOduration=19.516122886 podStartE2EDuration="19.516122886s" podCreationTimestamp="2026-04-16 23:50:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:50:33.487243319 +0000 UTC m=+5.696007990" watchObservedRunningTime="2026-04-16 23:50:48.516122886 +0000 UTC m=+20.724887557" Apr 16 23:50:48.516424 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:48.516394 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-h2hdz" podStartSLOduration=3.631038406 podStartE2EDuration="20.516388689s" podCreationTimestamp="2026-04-16 23:50:28 +0000 UTC" firstStartedPulling="2026-04-16 23:50:30.836526701 +0000 UTC m=+3.045291349" lastFinishedPulling="2026-04-16 23:50:47.721876968 +0000 UTC m=+19.930641632" observedRunningTime="2026-04-16 23:50:48.515928387 +0000 UTC m=+20.724693057" watchObservedRunningTime="2026-04-16 23:50:48.516388689 +0000 UTC m=+20.725153359" Apr 16 23:50:48.539244 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:48.539177 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-mhvhh" podStartSLOduration=3.652161623 podStartE2EDuration="20.539166024s" podCreationTimestamp="2026-04-16 23:50:28 +0000 UTC" firstStartedPulling="2026-04-16 23:50:30.834937812 +0000 UTC m=+3.043702462" lastFinishedPulling="2026-04-16 23:50:47.721942216 +0000 UTC m=+19.930706863" observedRunningTime="2026-04-16 23:50:48.527371501 +0000 UTC m=+20.736136171" watchObservedRunningTime="2026-04-16 23:50:48.539166024 +0000 UTC m=+20.747930696" Apr 16 23:50:48.539358 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:48.539281 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-hb7s9" podStartSLOduration=11.625327378 podStartE2EDuration="20.539274618s" podCreationTimestamp="2026-04-16 23:50:28 +0000 UTC" firstStartedPulling="2026-04-16 23:50:30.838628982 +0000 UTC m=+3.047393636" lastFinishedPulling="2026-04-16 
23:50:39.752576223 +0000 UTC m=+11.961340876" observedRunningTime="2026-04-16 23:50:48.538806227 +0000 UTC m=+20.747570897" watchObservedRunningTime="2026-04-16 23:50:48.539274618 +0000 UTC m=+20.748039289" Apr 16 23:50:48.577822 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:48.577771 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-mxms2" podStartSLOduration=3.683389903 podStartE2EDuration="20.577756602s" podCreationTimestamp="2026-04-16 23:50:28 +0000 UTC" firstStartedPulling="2026-04-16 23:50:30.839264036 +0000 UTC m=+3.048028684" lastFinishedPulling="2026-04-16 23:50:47.733630722 +0000 UTC m=+19.942395383" observedRunningTime="2026-04-16 23:50:48.576701601 +0000 UTC m=+20.785466272" watchObservedRunningTime="2026-04-16 23:50:48.577756602 +0000 UTC m=+20.786521274" Apr 16 23:50:48.591793 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:48.591757 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-rdxqw" podStartSLOduration=3.68684601 podStartE2EDuration="20.59174757s" podCreationTimestamp="2026-04-16 23:50:28 +0000 UTC" firstStartedPulling="2026-04-16 23:50:30.827823744 +0000 UTC m=+3.036588393" lastFinishedPulling="2026-04-16 23:50:47.732725303 +0000 UTC m=+19.941489953" observedRunningTime="2026-04-16 23:50:48.591311628 +0000 UTC m=+20.800076299" watchObservedRunningTime="2026-04-16 23:50:48.59174757 +0000 UTC m=+20.800512240" Apr 16 23:50:48.813110 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:48.813088 2562 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 23:50:49.307203 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:49.307084 2562 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T23:50:48.813105689Z","UUID":"1353a838-7bd5-4c91-818f-f2afb8cd89bf","Handler":null,"Name":"","Endpoint":""} Apr 16 23:50:49.309251 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:49.309226 2562 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 23:50:49.309358 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:49.309259 2562 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 23:50:49.348304 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:49.348278 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6rhw" Apr 16 23:50:49.348438 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:49.348409 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f6rhw" podUID="86301427-2e66-4c3a-ab22-55ef8ddc5580" Apr 16 23:50:49.515529 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:49.515493 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-jsjxw" event={"ID":"22ac9fa0-d332-4e5c-99d7-e1ac90e53356","Type":"ContainerStarted","Data":"4c93139b544a086affc365cba85f2ed23c97bb35949b073d48005f91761f219c"} Apr 16 23:50:49.517323 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:49.517292 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wzxkh" event={"ID":"f1f1c9a5-595b-45a3-b80e-6c39f43d9778","Type":"ContainerStarted","Data":"465911370d42bb74db74187fd38403cd7af7fb650c5595bdfcb06d04179b1bd7"} Apr 16 23:50:50.348653 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:50.348631 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9xr6" Apr 16 23:50:50.348919 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:50.348751 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k9xr6" podUID="9df8e85f-0f0a-41bb-af0a-cae68056131b" Apr 16 23:50:50.522894 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:50.522680 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" event={"ID":"94b51208-78d1-423e-a0fd-88a34e175744","Type":"ContainerStarted","Data":"8a630c5c5c88ccac1bbc2fd5088bfcacfe637f4a06bcee07c18977682773db04"} Apr 16 23:50:50.524932 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:50.524903 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wzxkh" event={"ID":"f1f1c9a5-595b-45a3-b80e-6c39f43d9778","Type":"ContainerStarted","Data":"058fe859dfe9633a892fe933c06682bf4c3e474a16f0f91a09a669a9c766eb33"} Apr 16 23:50:50.539868 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:50.539825 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-jsjxw" podStartSLOduration=5.660295202 podStartE2EDuration="22.539809754s" podCreationTimestamp="2026-04-16 23:50:28 +0000 UTC" firstStartedPulling="2026-04-16 23:50:30.842346462 +0000 UTC m=+3.051111110" lastFinishedPulling="2026-04-16 23:50:47.721860999 +0000 UTC m=+19.930625662" observedRunningTime="2026-04-16 23:50:49.527682054 +0000 UTC m=+21.736446720" watchObservedRunningTime="2026-04-16 23:50:50.539809754 +0000 UTC m=+22.748574425" Apr 16 23:50:50.540158 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:50.540136 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wzxkh" podStartSLOduration=3.809902521 podStartE2EDuration="22.540130155s" podCreationTimestamp="2026-04-16 23:50:28 +0000 UTC" firstStartedPulling="2026-04-16 23:50:30.835604416 +0000 UTC m=+3.044369098" lastFinishedPulling="2026-04-16 23:50:49.565832084 +0000 UTC m=+21.774596732" observedRunningTime="2026-04-16 23:50:50.539631357 +0000 UTC m=+22.748396038" watchObservedRunningTime="2026-04-16 23:50:50.540130155 +0000 UTC m=+22.748894825" Apr 16 23:50:51.348641 ip-10-0-128-98 
kubenswrapper[2562]: I0416 23:50:51.348606 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6rhw" Apr 16 23:50:51.348805 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:51.348732 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6rhw" podUID="86301427-2e66-4c3a-ab22-55ef8ddc5580" Apr 16 23:50:52.308567 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:52.308539 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-h2hdz" Apr 16 23:50:52.309297 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:52.309284 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-h2hdz" Apr 16 23:50:52.351006 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:52.350979 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9xr6" Apr 16 23:50:52.351142 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:52.351071 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k9xr6" podUID="9df8e85f-0f0a-41bb-af0a-cae68056131b" Apr 16 23:50:52.533290 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:52.533095 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" event={"ID":"94b51208-78d1-423e-a0fd-88a34e175744","Type":"ContainerStarted","Data":"5d44773c74df40b763e91540c846f3b9ca0efecab08759c3e83146f1f639144f"} Apr 16 23:50:52.561717 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:52.561341 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" podStartSLOduration=7.590510635 podStartE2EDuration="24.561323555s" podCreationTimestamp="2026-04-16 23:50:28 +0000 UTC" firstStartedPulling="2026-04-16 23:50:30.840624834 +0000 UTC m=+3.049389486" lastFinishedPulling="2026-04-16 23:50:47.811437751 +0000 UTC m=+20.020202406" observedRunningTime="2026-04-16 23:50:52.556739445 +0000 UTC m=+24.765504110" watchObservedRunningTime="2026-04-16 23:50:52.561323555 +0000 UTC m=+24.770088226" Apr 16 23:50:53.349336 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:53.349165 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6rhw" Apr 16 23:50:53.349899 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:53.349402 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f6rhw" podUID="86301427-2e66-4c3a-ab22-55ef8ddc5580" Apr 16 23:50:53.535642 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:53.535616 2562 generic.go:358] "Generic (PLEG): container finished" podID="987f3171-af5b-41d5-91e1-d31f667a8755" containerID="6b97afa77ddcedbdce3a4411366101f3d4779476355d7abb53c39405a7324bc0" exitCode=0 Apr 16 23:50:53.535773 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:53.535706 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kwxk9" event={"ID":"987f3171-af5b-41d5-91e1-d31f667a8755","Type":"ContainerDied","Data":"6b97afa77ddcedbdce3a4411366101f3d4779476355d7abb53c39405a7324bc0"} Apr 16 23:50:53.535913 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:53.535896 2562 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 23:50:53.536828 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:53.536282 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:53.536828 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:53.536305 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:53.549942 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:53.549922 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:53.550032 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:53.550002 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:54.348636 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:54.348610 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9xr6" Apr 16 23:50:54.348741 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:54.348716 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k9xr6" podUID="9df8e85f-0f0a-41bb-af0a-cae68056131b" Apr 16 23:50:54.411290 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:54.411262 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-k9xr6"] Apr 16 23:50:54.413830 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:54.413797 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-f6rhw"] Apr 16 23:50:54.413927 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:54.413901 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6rhw" Apr 16 23:50:54.414032 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:54.414014 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f6rhw" podUID="86301427-2e66-4c3a-ab22-55ef8ddc5580" Apr 16 23:50:54.539898 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:54.539815 2562 generic.go:358] "Generic (PLEG): container finished" podID="987f3171-af5b-41d5-91e1-d31f667a8755" containerID="cfd7da84054a29d36699d4a8452effbe18d336ae8b31331eb611b354b19fd24a" exitCode=0 Apr 16 23:50:54.540033 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:54.539899 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kwxk9" event={"ID":"987f3171-af5b-41d5-91e1-d31f667a8755","Type":"ContainerDied","Data":"cfd7da84054a29d36699d4a8452effbe18d336ae8b31331eb611b354b19fd24a"} Apr 16 23:50:54.540091 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:54.540075 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9xr6" Apr 16 23:50:54.540208 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:54.540179 2562 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 23:50:54.540298 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:54.540177 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k9xr6" podUID="9df8e85f-0f0a-41bb-af0a-cae68056131b" Apr 16 23:50:55.545688 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:55.545526 2562 generic.go:358] "Generic (PLEG): container finished" podID="987f3171-af5b-41d5-91e1-d31f667a8755" containerID="40c5c3e55fa3724a4df588c6e2b4f80e0b86fb12fc526f23dd2f186bb66565b9" exitCode=0 Apr 16 23:50:55.546045 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:55.545609 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kwxk9" event={"ID":"987f3171-af5b-41d5-91e1-d31f667a8755","Type":"ContainerDied","Data":"40c5c3e55fa3724a4df588c6e2b4f80e0b86fb12fc526f23dd2f186bb66565b9"} Apr 16 23:50:55.546045 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:55.545865 2562 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 23:50:56.348276 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:56.348250 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6rhw" Apr 16 23:50:56.348439 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:56.348249 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9xr6" Apr 16 23:50:56.348439 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:56.348378 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f6rhw" podUID="86301427-2e66-4c3a-ab22-55ef8ddc5580" Apr 16 23:50:56.348439 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:56.348422 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k9xr6" podUID="9df8e85f-0f0a-41bb-af0a-cae68056131b" Apr 16 23:50:58.351156 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:58.350642 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9xr6" Apr 16 23:50:58.351156 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:58.350774 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k9xr6" podUID="9df8e85f-0f0a-41bb-af0a-cae68056131b" Apr 16 23:50:58.351775 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:58.351217 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6rhw" Apr 16 23:50:58.351775 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:50:58.351376 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f6rhw" podUID="86301427-2e66-4c3a-ab22-55ef8ddc5580" Apr 16 23:50:58.765888 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:58.765858 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" Apr 16 23:50:58.766107 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:58.766094 2562 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 23:50:58.781173 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:58.781115 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" podUID="94b51208-78d1-423e-a0fd-88a34e175744" containerName="ovnkube-controller" probeResult="failure" output="" Apr 16 23:50:58.784978 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:58.784955 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-h2hdz" Apr 16 23:50:58.785107 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:58.785092 2562 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 23:50:58.785604 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:58.785588 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-h2hdz" Apr 16 23:50:58.790307 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:50:58.790283 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-frzl2" podUID="94b51208-78d1-423e-a0fd-88a34e175744" containerName="ovnkube-controller" probeResult="failure" output="" Apr 16 23:51:00.349208 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:00.349160 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6rhw" Apr 16 23:51:00.349739 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:00.349218 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9xr6" Apr 16 23:51:00.349739 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:51:00.349303 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6rhw" podUID="86301427-2e66-4c3a-ab22-55ef8ddc5580" Apr 16 23:51:00.349739 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:51:00.349417 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-k9xr6" podUID="9df8e85f-0f0a-41bb-af0a-cae68056131b" Apr 16 23:51:00.608881 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:00.608811 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-98.ec2.internal" event="NodeReady" Apr 16 23:51:00.609051 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:00.608964 2562 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 23:51:00.645940 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:00.645906 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-dxzbw"] Apr 16 23:51:00.670852 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:00.670823 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-495hb"] Apr 16 23:51:00.671008 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:00.670993 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dxzbw" Apr 16 23:51:00.673078 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:00.673051 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-jh2wm\"" Apr 16 23:51:00.673222 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:00.673098 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 23:51:00.673222 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:00.673180 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 23:51:00.689586 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:00.689562 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dxzbw"] Apr 16 23:51:00.689708 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:00.689590 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-495hb"] Apr 16 23:51:00.689708 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:00.689691 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-495hb" Apr 16 23:51:00.693441 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:00.691688 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-6grx4\"" Apr 16 23:51:00.693441 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:00.691944 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 23:51:00.693441 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:00.692204 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 23:51:00.693441 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:00.692457 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 23:51:00.788140 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:00.788106 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzzzn\" (UniqueName: \"kubernetes.io/projected/6d46b010-8236-429d-af09-8fb8c4618d50-kube-api-access-hzzzn\") pod \"ingress-canary-495hb\" (UID: \"6d46b010-8236-429d-af09-8fb8c4618d50\") " pod="openshift-ingress-canary/ingress-canary-495hb" Apr 16 23:51:00.788309 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:00.788164 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d46b010-8236-429d-af09-8fb8c4618d50-cert\") pod \"ingress-canary-495hb\" (UID: \"6d46b010-8236-429d-af09-8fb8c4618d50\") " pod="openshift-ingress-canary/ingress-canary-495hb" Apr 16 23:51:00.788309 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:00.788207 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtgwz\" (UniqueName: \"kubernetes.io/projected/66c9f5e2-4d93-4cf6-96de-a4ade7175b77-kube-api-access-gtgwz\") pod \"dns-default-dxzbw\" (UID: \"66c9f5e2-4d93-4cf6-96de-a4ade7175b77\") " pod="openshift-dns/dns-default-dxzbw" Apr 16 23:51:00.788309 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:00.788273 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/66c9f5e2-4d93-4cf6-96de-a4ade7175b77-config-volume\") pod \"dns-default-dxzbw\" (UID: \"66c9f5e2-4d93-4cf6-96de-a4ade7175b77\") " pod="openshift-dns/dns-default-dxzbw" Apr 16 23:51:00.788464 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:00.788346 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/66c9f5e2-4d93-4cf6-96de-a4ade7175b77-metrics-tls\") pod \"dns-default-dxzbw\" (UID: \"66c9f5e2-4d93-4cf6-96de-a4ade7175b77\") " pod="openshift-dns/dns-default-dxzbw" Apr 16 23:51:00.788464 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:00.788376 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/66c9f5e2-4d93-4cf6-96de-a4ade7175b77-tmp-dir\") pod \"dns-default-dxzbw\" (UID: \"66c9f5e2-4d93-4cf6-96de-a4ade7175b77\") " pod="openshift-dns/dns-default-dxzbw" Apr 16 23:51:00.888862 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:00.888778 2562 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/66c9f5e2-4d93-4cf6-96de-a4ade7175b77-metrics-tls\") pod \"dns-default-dxzbw\" (UID: \"66c9f5e2-4d93-4cf6-96de-a4ade7175b77\") " pod="openshift-dns/dns-default-dxzbw" Apr 16 23:51:00.888862 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:00.888820 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/66c9f5e2-4d93-4cf6-96de-a4ade7175b77-tmp-dir\") pod \"dns-default-dxzbw\" (UID: \"66c9f5e2-4d93-4cf6-96de-a4ade7175b77\") " pod="openshift-dns/dns-default-dxzbw" Apr 16 23:51:00.888862 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:00.888849 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hzzzn\" (UniqueName: \"kubernetes.io/projected/6d46b010-8236-429d-af09-8fb8c4618d50-kube-api-access-hzzzn\") pod \"ingress-canary-495hb\" (UID: \"6d46b010-8236-429d-af09-8fb8c4618d50\") " pod="openshift-ingress-canary/ingress-canary-495hb" Apr 16 23:51:00.889119 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:00.888881 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d46b010-8236-429d-af09-8fb8c4618d50-cert\") pod \"ingress-canary-495hb\" (UID: \"6d46b010-8236-429d-af09-8fb8c4618d50\") " pod="openshift-ingress-canary/ingress-canary-495hb" Apr 16 23:51:00.889119 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:00.888903 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gtgwz\" (UniqueName: \"kubernetes.io/projected/66c9f5e2-4d93-4cf6-96de-a4ade7175b77-kube-api-access-gtgwz\") pod \"dns-default-dxzbw\" (UID: \"66c9f5e2-4d93-4cf6-96de-a4ade7175b77\") " pod="openshift-dns/dns-default-dxzbw" Apr 16 23:51:00.889119 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:51:00.888910 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 23:51:00.889119 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:00.888940 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/66c9f5e2-4d93-4cf6-96de-a4ade7175b77-config-volume\") pod \"dns-default-dxzbw\" (UID: \"66c9f5e2-4d93-4cf6-96de-a4ade7175b77\") " pod="openshift-dns/dns-default-dxzbw" Apr 16 23:51:00.889119 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:51:00.888983 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66c9f5e2-4d93-4cf6-96de-a4ade7175b77-metrics-tls podName:66c9f5e2-4d93-4cf6-96de-a4ade7175b77 nodeName:}" failed. No retries permitted until 2026-04-16 23:51:01.388961417 +0000 UTC m=+33.597726065 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/66c9f5e2-4d93-4cf6-96de-a4ade7175b77-metrics-tls") pod "dns-default-dxzbw" (UID: "66c9f5e2-4d93-4cf6-96de-a4ade7175b77") : secret "dns-default-metrics-tls" not found Apr 16 23:51:00.889119 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:51:00.888997 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 23:51:00.889119 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:51:00.889049 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d46b010-8236-429d-af09-8fb8c4618d50-cert podName:6d46b010-8236-429d-af09-8fb8c4618d50 nodeName:}" failed. 
No retries permitted until 2026-04-16 23:51:01.38903238 +0000 UTC m=+33.597797038 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d46b010-8236-429d-af09-8fb8c4618d50-cert") pod "ingress-canary-495hb" (UID: "6d46b010-8236-429d-af09-8fb8c4618d50") : secret "canary-serving-cert" not found Apr 16 23:51:00.889485 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:00.889243 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/66c9f5e2-4d93-4cf6-96de-a4ade7175b77-tmp-dir\") pod \"dns-default-dxzbw\" (UID: \"66c9f5e2-4d93-4cf6-96de-a4ade7175b77\") " pod="openshift-dns/dns-default-dxzbw" Apr 16 23:51:00.889546 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:00.889493 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/66c9f5e2-4d93-4cf6-96de-a4ade7175b77-config-volume\") pod \"dns-default-dxzbw\" (UID: \"66c9f5e2-4d93-4cf6-96de-a4ade7175b77\") " pod="openshift-dns/dns-default-dxzbw" Apr 16 23:51:00.899794 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:00.899744 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtgwz\" (UniqueName: \"kubernetes.io/projected/66c9f5e2-4d93-4cf6-96de-a4ade7175b77-kube-api-access-gtgwz\") pod \"dns-default-dxzbw\" (UID: \"66c9f5e2-4d93-4cf6-96de-a4ade7175b77\") " pod="openshift-dns/dns-default-dxzbw" Apr 16 23:51:00.899794 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:00.899785 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzzzn\" (UniqueName: \"kubernetes.io/projected/6d46b010-8236-429d-af09-8fb8c4618d50-kube-api-access-hzzzn\") pod \"ingress-canary-495hb\" (UID: \"6d46b010-8236-429d-af09-8fb8c4618d50\") " pod="openshift-ingress-canary/ingress-canary-495hb" Apr 16 23:51:01.392955 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:01.392920 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/66c9f5e2-4d93-4cf6-96de-a4ade7175b77-metrics-tls\") pod \"dns-default-dxzbw\" (UID: \"66c9f5e2-4d93-4cf6-96de-a4ade7175b77\") " pod="openshift-dns/dns-default-dxzbw" Apr 16 23:51:01.392955 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:01.392964 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d46b010-8236-429d-af09-8fb8c4618d50-cert\") pod \"ingress-canary-495hb\" (UID: \"6d46b010-8236-429d-af09-8fb8c4618d50\") " pod="openshift-ingress-canary/ingress-canary-495hb" Apr 16 23:51:01.393783 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:51:01.393073 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 23:51:01.393783 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:51:01.393119 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 23:51:01.393783 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:51:01.393136 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66c9f5e2-4d93-4cf6-96de-a4ade7175b77-metrics-tls podName:66c9f5e2-4d93-4cf6-96de-a4ade7175b77 nodeName:}" failed. No retries permitted until 2026-04-16 23:51:02.393119989 +0000 UTC m=+34.601884637 (durationBeforeRetry 1s). 
Apr 16 23:51:01.393783 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:51:01.393176 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d46b010-8236-429d-af09-8fb8c4618d50-cert podName:6d46b010-8236-429d-af09-8fb8c4618d50 nodeName:}" failed. No retries permitted until 2026-04-16 23:51:02.393164046 +0000 UTC m=+34.601928694 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d46b010-8236-429d-af09-8fb8c4618d50-cert") pod "ingress-canary-495hb" (UID: "6d46b010-8236-429d-af09-8fb8c4618d50") : secret "canary-serving-cert" not found
Apr 16 23:51:01.997225 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:01.997177 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86301427-2e66-4c3a-ab22-55ef8ddc5580-metrics-certs\") pod \"network-metrics-daemon-f6rhw\" (UID: \"86301427-2e66-4c3a-ab22-55ef8ddc5580\") " pod="openshift-multus/network-metrics-daemon-f6rhw"
Apr 16 23:51:01.997409 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:51:01.997318 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 23:51:01.997409 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:51:01.997381 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86301427-2e66-4c3a-ab22-55ef8ddc5580-metrics-certs podName:86301427-2e66-4c3a-ab22-55ef8ddc5580 nodeName:}" failed. No retries permitted until 2026-04-16 23:51:33.997361047 +0000 UTC m=+66.206125695 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86301427-2e66-4c3a-ab22-55ef8ddc5580-metrics-certs") pod "network-metrics-daemon-f6rhw" (UID: "86301427-2e66-4c3a-ab22-55ef8ddc5580") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 23:51:02.098457 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:02.098424 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nqmck\" (UniqueName: \"kubernetes.io/projected/9df8e85f-0f0a-41bb-af0a-cae68056131b-kube-api-access-nqmck\") pod \"network-check-target-k9xr6\" (UID: \"9df8e85f-0f0a-41bb-af0a-cae68056131b\") " pod="openshift-network-diagnostics/network-check-target-k9xr6"
Apr 16 23:51:02.098599 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:51:02.098539 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 23:51:02.098599 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:51:02.098552 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 23:51:02.098599 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:51:02.098561 2562 projected.go:194] Error preparing data for projected volume kube-api-access-nqmck for pod openshift-network-diagnostics/network-check-target-k9xr6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 23:51:02.098694 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:51:02.098607 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9df8e85f-0f0a-41bb-af0a-cae68056131b-kube-api-access-nqmck podName:9df8e85f-0f0a-41bb-af0a-cae68056131b nodeName:}" failed. No retries permitted until 2026-04-16 23:51:34.098595114 +0000 UTC m=+66.307359761 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-nqmck" (UniqueName: "kubernetes.io/projected/9df8e85f-0f0a-41bb-af0a-cae68056131b-kube-api-access-nqmck") pod "network-check-target-k9xr6" (UID: "9df8e85f-0f0a-41bb-af0a-cae68056131b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 23:51:02.348492 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:02.348417 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6rhw"
Apr 16 23:51:02.348645 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:02.348417 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9xr6"
Apr 16 23:51:02.352330 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:02.352304 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 23:51:02.352330 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:02.352319 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 23:51:02.352564 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:02.352349 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 23:51:02.352564 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:02.352350 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xjj7v\""
Apr 16 23:51:02.352691 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:02.352634 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-x54vt\""
Apr 16 23:51:02.400891 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:02.400862 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/66c9f5e2-4d93-4cf6-96de-a4ade7175b77-metrics-tls\") pod \"dns-default-dxzbw\" (UID: \"66c9f5e2-4d93-4cf6-96de-a4ade7175b77\") " pod="openshift-dns/dns-default-dxzbw"
Apr 16 23:51:02.401228 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:02.400912 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d46b010-8236-429d-af09-8fb8c4618d50-cert\") pod \"ingress-canary-495hb\" (UID: \"6d46b010-8236-429d-af09-8fb8c4618d50\") " pod="openshift-ingress-canary/ingress-canary-495hb"
Apr 16 23:51:02.401228 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:51:02.400969 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 23:51:02.401228 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:51:02.401000 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 23:51:02.401228 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:51:02.401019 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66c9f5e2-4d93-4cf6-96de-a4ade7175b77-metrics-tls podName:66c9f5e2-4d93-4cf6-96de-a4ade7175b77 nodeName:}" failed. No retries permitted until 2026-04-16 23:51:04.401005715 +0000 UTC m=+36.609770364 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/66c9f5e2-4d93-4cf6-96de-a4ade7175b77-metrics-tls") pod "dns-default-dxzbw" (UID: "66c9f5e2-4d93-4cf6-96de-a4ade7175b77") : secret "dns-default-metrics-tls" not found
Apr 16 23:51:02.401228 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:51:02.401044 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d46b010-8236-429d-af09-8fb8c4618d50-cert podName:6d46b010-8236-429d-af09-8fb8c4618d50 nodeName:}" failed. No retries permitted until 2026-04-16 23:51:04.401032885 +0000 UTC m=+36.609797532 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d46b010-8236-429d-af09-8fb8c4618d50-cert") pod "ingress-canary-495hb" (UID: "6d46b010-8236-429d-af09-8fb8c4618d50") : secret "canary-serving-cert" not found
Apr 16 23:51:02.561915 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:02.561878 2562 generic.go:358] "Generic (PLEG): container finished" podID="987f3171-af5b-41d5-91e1-d31f667a8755" containerID="04dafc8e0eb5af701363006970e7c58e8552b1d722aaa6d1b361d0e0bbbd400b" exitCode=0
Apr 16 23:51:02.561915 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:02.561917 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kwxk9" event={"ID":"987f3171-af5b-41d5-91e1-d31f667a8755","Type":"ContainerDied","Data":"04dafc8e0eb5af701363006970e7c58e8552b1d722aaa6d1b361d0e0bbbd400b"}
Apr 16 23:51:03.566541 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:03.566377 2562 generic.go:358] "Generic (PLEG): container finished" podID="987f3171-af5b-41d5-91e1-d31f667a8755" containerID="09e28d4ea4bc73de09f4db2dbd8a79817e1333ded93b84f35d09400f7af71bab" exitCode=0
Apr 16 23:51:03.566541 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:03.566450 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kwxk9" event={"ID":"987f3171-af5b-41d5-91e1-d31f667a8755","Type":"ContainerDied","Data":"09e28d4ea4bc73de09f4db2dbd8a79817e1333ded93b84f35d09400f7af71bab"}
Apr 16 23:51:04.415358 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:04.415272 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/66c9f5e2-4d93-4cf6-96de-a4ade7175b77-metrics-tls\") pod \"dns-default-dxzbw\" (UID: \"66c9f5e2-4d93-4cf6-96de-a4ade7175b77\") " pod="openshift-dns/dns-default-dxzbw"
Apr 16 23:51:04.415358 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:04.415333 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d46b010-8236-429d-af09-8fb8c4618d50-cert\") pod \"ingress-canary-495hb\" (UID: \"6d46b010-8236-429d-af09-8fb8c4618d50\") " pod="openshift-ingress-canary/ingress-canary-495hb"
Apr 16 23:51:04.415526 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:51:04.415421 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 23:51:04.415526 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:51:04.415477 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66c9f5e2-4d93-4cf6-96de-a4ade7175b77-metrics-tls podName:66c9f5e2-4d93-4cf6-96de-a4ade7175b77 nodeName:}" failed. No retries permitted until 2026-04-16 23:51:08.415462688 +0000 UTC m=+40.624227342 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/66c9f5e2-4d93-4cf6-96de-a4ade7175b77-metrics-tls") pod "dns-default-dxzbw" (UID: "66c9f5e2-4d93-4cf6-96de-a4ade7175b77") : secret "dns-default-metrics-tls" not found
Apr 16 23:51:04.415526 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:51:04.415426 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 23:51:04.415659 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:51:04.415535 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d46b010-8236-429d-af09-8fb8c4618d50-cert podName:6d46b010-8236-429d-af09-8fb8c4618d50 nodeName:}" failed. No retries permitted until 2026-04-16 23:51:08.415524498 +0000 UTC m=+40.624289152 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d46b010-8236-429d-af09-8fb8c4618d50-cert") pod "ingress-canary-495hb" (UID: "6d46b010-8236-429d-af09-8fb8c4618d50") : secret "canary-serving-cert" not found
Apr 16 23:51:04.570988 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:04.570956 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kwxk9" event={"ID":"987f3171-af5b-41d5-91e1-d31f667a8755","Type":"ContainerStarted","Data":"3f9bda5e15c77853125e8618a39ac1f5b7d54ca24e5005c1bd24bad3f0b4bbb9"}
Apr 16 23:51:04.590052 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:04.590009 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-kwxk9" podStartSLOduration=5.851509175 podStartE2EDuration="36.58999538s" podCreationTimestamp="2026-04-16 23:50:28 +0000 UTC" firstStartedPulling="2026-04-16 23:50:30.832462311 +0000 UTC m=+3.041226959" lastFinishedPulling="2026-04-16 23:51:01.570948516 +0000 UTC m=+33.779713164" observedRunningTime="2026-04-16 23:51:04.589115305 +0000 UTC m=+36.797879976" watchObservedRunningTime="2026-04-16 23:51:04.58999538 +0000 UTC m=+36.798760032"
Apr 16 23:51:08.441710 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:08.441675 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/66c9f5e2-4d93-4cf6-96de-a4ade7175b77-metrics-tls\") pod \"dns-default-dxzbw\" (UID: \"66c9f5e2-4d93-4cf6-96de-a4ade7175b77\") " pod="openshift-dns/dns-default-dxzbw"
Apr 16 23:51:08.442117 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:08.441727 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d46b010-8236-429d-af09-8fb8c4618d50-cert\") pod \"ingress-canary-495hb\" (UID: \"6d46b010-8236-429d-af09-8fb8c4618d50\") " pod="openshift-ingress-canary/ingress-canary-495hb"
Apr 16 23:51:08.442117 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:51:08.441837 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 23:51:08.442117 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:51:08.441918 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66c9f5e2-4d93-4cf6-96de-a4ade7175b77-metrics-tls podName:66c9f5e2-4d93-4cf6-96de-a4ade7175b77 nodeName:}" failed. No retries permitted until 2026-04-16 23:51:16.441900335 +0000 UTC m=+48.650664988 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/66c9f5e2-4d93-4cf6-96de-a4ade7175b77-metrics-tls") pod "dns-default-dxzbw" (UID: "66c9f5e2-4d93-4cf6-96de-a4ade7175b77") : secret "dns-default-metrics-tls" not found
Apr 16 23:51:08.442117 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:51:08.441842 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 23:51:08.442117 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:51:08.441995 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d46b010-8236-429d-af09-8fb8c4618d50-cert podName:6d46b010-8236-429d-af09-8fb8c4618d50 nodeName:}" failed. No retries permitted until 2026-04-16 23:51:16.441977893 +0000 UTC m=+48.650742543 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d46b010-8236-429d-af09-8fb8c4618d50-cert") pod "ingress-canary-495hb" (UID: "6d46b010-8236-429d-af09-8fb8c4618d50") : secret "canary-serving-cert" not found
Apr 16 23:51:16.495718 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:16.495682 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/66c9f5e2-4d93-4cf6-96de-a4ade7175b77-metrics-tls\") pod \"dns-default-dxzbw\" (UID: \"66c9f5e2-4d93-4cf6-96de-a4ade7175b77\") " pod="openshift-dns/dns-default-dxzbw"
Apr 16 23:51:16.496071 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:16.495724 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d46b010-8236-429d-af09-8fb8c4618d50-cert\") pod \"ingress-canary-495hb\" (UID: \"6d46b010-8236-429d-af09-8fb8c4618d50\") " pod="openshift-ingress-canary/ingress-canary-495hb"
Apr 16 23:51:16.496071 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:51:16.495808 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 23:51:16.496071 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:51:16.495811 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 23:51:16.496071 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:51:16.495857 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d46b010-8236-429d-af09-8fb8c4618d50-cert podName:6d46b010-8236-429d-af09-8fb8c4618d50 nodeName:}" failed. No retries permitted until 2026-04-16 23:51:32.49584382 +0000 UTC m=+64.704608468 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d46b010-8236-429d-af09-8fb8c4618d50-cert") pod "ingress-canary-495hb" (UID: "6d46b010-8236-429d-af09-8fb8c4618d50") : secret "canary-serving-cert" not found
Apr 16 23:51:16.496071 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:51:16.495870 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66c9f5e2-4d93-4cf6-96de-a4ade7175b77-metrics-tls podName:66c9f5e2-4d93-4cf6-96de-a4ade7175b77 nodeName:}" failed. No retries permitted until 2026-04-16 23:51:32.495864267 +0000 UTC m=+64.704628915 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/66c9f5e2-4d93-4cf6-96de-a4ade7175b77-metrics-tls") pod "dns-default-dxzbw" (UID: "66c9f5e2-4d93-4cf6-96de-a4ade7175b77") : secret "dns-default-metrics-tls" not found
Apr 16 23:51:28.791057 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:28.791029 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-frzl2"
Apr 16 23:51:32.596619 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:32.596581 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/66c9f5e2-4d93-4cf6-96de-a4ade7175b77-metrics-tls\") pod \"dns-default-dxzbw\" (UID: \"66c9f5e2-4d93-4cf6-96de-a4ade7175b77\") " pod="openshift-dns/dns-default-dxzbw"
Apr 16 23:51:32.596989 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:32.596632 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d46b010-8236-429d-af09-8fb8c4618d50-cert\") pod \"ingress-canary-495hb\" (UID: \"6d46b010-8236-429d-af09-8fb8c4618d50\") " pod="openshift-ingress-canary/ingress-canary-495hb"
Apr 16 23:51:32.596989 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:51:32.596718 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 23:51:32.596989 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:51:32.596736 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 23:51:32.596989 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:51:32.596789 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66c9f5e2-4d93-4cf6-96de-a4ade7175b77-metrics-tls podName:66c9f5e2-4d93-4cf6-96de-a4ade7175b77 nodeName:}" failed. No retries permitted until 2026-04-16 23:52:04.596772712 +0000 UTC m=+96.805537360 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/66c9f5e2-4d93-4cf6-96de-a4ade7175b77-metrics-tls") pod "dns-default-dxzbw" (UID: "66c9f5e2-4d93-4cf6-96de-a4ade7175b77") : secret "dns-default-metrics-tls" not found
Apr 16 23:51:32.596989 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:51:32.596804 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d46b010-8236-429d-af09-8fb8c4618d50-cert podName:6d46b010-8236-429d-af09-8fb8c4618d50 nodeName:}" failed. No retries permitted until 2026-04-16 23:52:04.596797032 +0000 UTC m=+96.805561680 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d46b010-8236-429d-af09-8fb8c4618d50-cert") pod "ingress-canary-495hb" (UID: "6d46b010-8236-429d-af09-8fb8c4618d50") : secret "canary-serving-cert" not found
Apr 16 23:51:34.004972 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:34.004942 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86301427-2e66-4c3a-ab22-55ef8ddc5580-metrics-certs\") pod \"network-metrics-daemon-f6rhw\" (UID: \"86301427-2e66-4c3a-ab22-55ef8ddc5580\") " pod="openshift-multus/network-metrics-daemon-f6rhw"
Apr 16 23:51:34.007203 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:34.007174 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 23:51:34.015903 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:51:34.015888 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 23:51:34.015984 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:51:34.015937 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86301427-2e66-4c3a-ab22-55ef8ddc5580-metrics-certs podName:86301427-2e66-4c3a-ab22-55ef8ddc5580 nodeName:}" failed. No retries permitted until 2026-04-16 23:52:38.015923254 +0000 UTC m=+130.224687902 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86301427-2e66-4c3a-ab22-55ef8ddc5580-metrics-certs") pod "network-metrics-daemon-f6rhw" (UID: "86301427-2e66-4c3a-ab22-55ef8ddc5580") : secret "metrics-daemon-secret" not found
Apr 16 23:51:34.106035 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:34.105992 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nqmck\" (UniqueName: \"kubernetes.io/projected/9df8e85f-0f0a-41bb-af0a-cae68056131b-kube-api-access-nqmck\") pod \"network-check-target-k9xr6\" (UID: \"9df8e85f-0f0a-41bb-af0a-cae68056131b\") " pod="openshift-network-diagnostics/network-check-target-k9xr6"
Apr 16 23:51:34.108132 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:34.108116 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 23:51:34.118507 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:34.118488 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 23:51:34.129943 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:34.129917 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqmck\" (UniqueName: \"kubernetes.io/projected/9df8e85f-0f0a-41bb-af0a-cae68056131b-kube-api-access-nqmck\") pod \"network-check-target-k9xr6\" (UID: \"9df8e85f-0f0a-41bb-af0a-cae68056131b\") " pod="openshift-network-diagnostics/network-check-target-k9xr6"
Apr 16 23:51:34.166044 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:34.166021 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-x54vt\""
Apr 16 23:51:34.174736 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:34.174719 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9xr6"
Apr 16 23:51:34.290822 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:34.290760 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-k9xr6"]
Apr 16 23:51:34.294539 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:51:34.294511 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9df8e85f_0f0a_41bb_af0a_cae68056131b.slice/crio-c24113e27e57181aa949a4609b438c7430ae9d2ec4200d1075498dd3cdec6d23 WatchSource:0}: Error finding container c24113e27e57181aa949a4609b438c7430ae9d2ec4200d1075498dd3cdec6d23: Status 404 returned error can't find the container with id c24113e27e57181aa949a4609b438c7430ae9d2ec4200d1075498dd3cdec6d23
Apr 16 23:51:34.625472 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:34.625395 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-k9xr6" event={"ID":"9df8e85f-0f0a-41bb-af0a-cae68056131b","Type":"ContainerStarted","Data":"c24113e27e57181aa949a4609b438c7430ae9d2ec4200d1075498dd3cdec6d23"}
Apr 16 23:51:37.632589 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:37.632553 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-k9xr6" event={"ID":"9df8e85f-0f0a-41bb-af0a-cae68056131b","Type":"ContainerStarted","Data":"e4da3db7149b0990609cb19bbedc85b86a9895e4f5d3892af1192d422fa0815e"}
Apr 16 23:51:37.632979 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:37.632658 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-k9xr6"
Apr 16 23:51:37.646694 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:51:37.646655 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-k9xr6" podStartSLOduration=67.109733158 podStartE2EDuration="1m9.64664558s" podCreationTimestamp="2026-04-16 23:50:28 +0000 UTC" firstStartedPulling="2026-04-16 23:51:34.296592555 +0000 UTC m=+66.505357203" lastFinishedPulling="2026-04-16 23:51:36.833504973 +0000 UTC m=+69.042269625" observedRunningTime="2026-04-16 23:51:37.646521346 +0000 UTC m=+69.855286016" watchObservedRunningTime="2026-04-16 23:51:37.64664558 +0000 UTC m=+69.855410228"
Apr 16 23:52:04.691747 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:04.691714 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/66c9f5e2-4d93-4cf6-96de-a4ade7175b77-metrics-tls\") pod \"dns-default-dxzbw\" (UID: \"66c9f5e2-4d93-4cf6-96de-a4ade7175b77\") " pod="openshift-dns/dns-default-dxzbw"
Apr 16 23:52:04.692255 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:04.691757 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d46b010-8236-429d-af09-8fb8c4618d50-cert\") pod \"ingress-canary-495hb\" (UID: \"6d46b010-8236-429d-af09-8fb8c4618d50\") " pod="openshift-ingress-canary/ingress-canary-495hb"
Apr 16 23:52:04.692255 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:04.691846 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 23:52:04.692255 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:04.691864 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls:
secret "dns-default-metrics-tls" not found Apr 16 23:52:04.692255 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:04.691893 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d46b010-8236-429d-af09-8fb8c4618d50-cert podName:6d46b010-8236-429d-af09-8fb8c4618d50 nodeName:}" failed. No retries permitted until 2026-04-16 23:53:08.691880528 +0000 UTC m=+160.900645175 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d46b010-8236-429d-af09-8fb8c4618d50-cert") pod "ingress-canary-495hb" (UID: "6d46b010-8236-429d-af09-8fb8c4618d50") : secret "canary-serving-cert" not found Apr 16 23:52:04.692255 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:04.691932 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66c9f5e2-4d93-4cf6-96de-a4ade7175b77-metrics-tls podName:66c9f5e2-4d93-4cf6-96de-a4ade7175b77 nodeName:}" failed. No retries permitted until 2026-04-16 23:53:08.691914884 +0000 UTC m=+160.900679533 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/66c9f5e2-4d93-4cf6-96de-a4ade7175b77-metrics-tls") pod "dns-default-dxzbw" (UID: "66c9f5e2-4d93-4cf6-96de-a4ade7175b77") : secret "dns-default-metrics-tls" not found Apr 16 23:52:08.636797 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:08.636768 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-k9xr6" Apr 16 23:52:16.134696 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.134663 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqd7n"] Apr 16 23:52:16.137355 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.137340 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqd7n" Apr 16 23:52:16.139492 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.139463 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 16 23:52:16.139492 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.139475 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 16 23:52:16.140054 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.140038 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 16 23:52:16.140122 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.140041 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-mwjqw\"" Apr 16 23:52:16.145613 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.145592 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqd7n"] Apr 16 23:52:16.166827 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.166806 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9cms\" (UniqueName: \"kubernetes.io/projected/71a84a47-d4d7-45d4-ab7c-e7e052f9d655-kube-api-access-s9cms\") pod \"cluster-samples-operator-6dc5bdb6b4-gqd7n\" (UID: \"71a84a47-d4d7-45d4-ab7c-e7e052f9d655\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqd7n" Apr 16 23:52:16.166940 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.166877 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/71a84a47-d4d7-45d4-ab7c-e7e052f9d655-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gqd7n\" (UID: \"71a84a47-d4d7-45d4-ab7c-e7e052f9d655\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqd7n" Apr 16 23:52:16.236462 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.236440 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-prmqz"] Apr 16 23:52:16.239010 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.238976 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-prmqz" Apr 16 23:52:16.239767 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.239654 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-mrpzx"] Apr 16 23:52:16.241136 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.241099 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 23:52:16.241261 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.241158 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 23:52:16.241261 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.241232 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 23:52:16.241261 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.241248 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 23:52:16.241582 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.241567 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-cmpnc\"" Apr 16 23:52:16.242392 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.242376 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7bf9785878-zjgxw"] Apr 16 23:52:16.242500 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.242487 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-mrpzx" Apr 16 23:52:16.245227 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.244573 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 16 23:52:16.245523 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.245503 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7bf9785878-zjgxw" Apr 16 23:52:16.245690 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.245668 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 16 23:52:16.245973 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.245769 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 16 23:52:16.245973 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.245916 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-r7k6f\"" Apr 16 23:52:16.245973 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.245606 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 16 23:52:16.249044 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.249022 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 23:52:16.249655 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.249634 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 23:52:16.250110 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.250092 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-nf568\"" Apr 16 23:52:16.250216 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.250110 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 23:52:16.250700 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.250680 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 23:52:16.252666 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.252631 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-prmqz"] Apr 16 23:52:16.253685 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.253663 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-mrpzx"] Apr 16 23:52:16.254423 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.254401 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 23:52:16.255650 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.255633 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 23:52:16.258037 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.257868 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7bf9785878-zjgxw"] Apr 16 23:52:16.267135 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.267116 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5fddaa09-016b-490d-a9ab-9d50ff167b22-trusted-ca\") pod \"console-operator-9d4b6777b-mrpzx\" (UID: \"5fddaa09-016b-490d-a9ab-9d50ff167b22\") " pod="openshift-console-operator/console-operator-9d4b6777b-mrpzx" Apr 16 23:52:16.267249 ip-10-0-128-98 
kubenswrapper[2562]: I0416 23:52:16.267140 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/88e7efd3-590c-4e09-8fdb-d9252264d3ad-ca-trust-extracted\") pod \"image-registry-7bf9785878-zjgxw\" (UID: \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\") " pod="openshift-image-registry/image-registry-7bf9785878-zjgxw" Apr 16 23:52:16.267249 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.267158 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/88e7efd3-590c-4e09-8fdb-d9252264d3ad-bound-sa-token\") pod \"image-registry-7bf9785878-zjgxw\" (UID: \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\") " pod="openshift-image-registry/image-registry-7bf9785878-zjgxw" Apr 16 23:52:16.267249 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.267173 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/e44cc99f-6cf4-4c00-ac07-e08e682c6db6-snapshots\") pod \"insights-operator-585dfdc468-prmqz\" (UID: \"e44cc99f-6cf4-4c00-ac07-e08e682c6db6\") " pod="openshift-insights/insights-operator-585dfdc468-prmqz" Apr 16 23:52:16.267249 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.267208 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e44cc99f-6cf4-4c00-ac07-e08e682c6db6-service-ca-bundle\") pod \"insights-operator-585dfdc468-prmqz\" (UID: \"e44cc99f-6cf4-4c00-ac07-e08e682c6db6\") " pod="openshift-insights/insights-operator-585dfdc468-prmqz" Apr 16 23:52:16.267249 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.267235 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/88e7efd3-590c-4e09-8fdb-d9252264d3ad-registry-certificates\") pod \"image-registry-7bf9785878-zjgxw\" (UID: \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\") " pod="openshift-image-registry/image-registry-7bf9785878-zjgxw" Apr 16 23:52:16.267485 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.267274 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9cms\" (UniqueName: \"kubernetes.io/projected/71a84a47-d4d7-45d4-ab7c-e7e052f9d655-kube-api-access-s9cms\") pod \"cluster-samples-operator-6dc5bdb6b4-gqd7n\" (UID: \"71a84a47-d4d7-45d4-ab7c-e7e052f9d655\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqd7n" Apr 16 23:52:16.267485 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.267294 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/88e7efd3-590c-4e09-8fdb-d9252264d3ad-image-registry-private-configuration\") pod \"image-registry-7bf9785878-zjgxw\" (UID: \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\") " pod="openshift-image-registry/image-registry-7bf9785878-zjgxw" Apr 16 23:52:16.267485 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.267310 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/88e7efd3-590c-4e09-8fdb-d9252264d3ad-installation-pull-secrets\") pod \"image-registry-7bf9785878-zjgxw\" (UID: 
\"88e7efd3-590c-4e09-8fdb-d9252264d3ad\") " pod="openshift-image-registry/image-registry-7bf9785878-zjgxw" Apr 16 23:52:16.267485 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.267325 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e44cc99f-6cf4-4c00-ac07-e08e682c6db6-serving-cert\") pod \"insights-operator-585dfdc468-prmqz\" (UID: \"e44cc99f-6cf4-4c00-ac07-e08e682c6db6\") " pod="openshift-insights/insights-operator-585dfdc468-prmqz" Apr 16 23:52:16.267485 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.267340 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/88e7efd3-590c-4e09-8fdb-d9252264d3ad-registry-tls\") pod \"image-registry-7bf9785878-zjgxw\" (UID: \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\") " pod="openshift-image-registry/image-registry-7bf9785878-zjgxw" Apr 16 23:52:16.267485 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.267354 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j6lg\" (UniqueName: \"kubernetes.io/projected/e44cc99f-6cf4-4c00-ac07-e08e682c6db6-kube-api-access-2j6lg\") pod \"insights-operator-585dfdc468-prmqz\" (UID: \"e44cc99f-6cf4-4c00-ac07-e08e682c6db6\") " pod="openshift-insights/insights-operator-585dfdc468-prmqz" Apr 16 23:52:16.267485 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.267402 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5fddaa09-016b-490d-a9ab-9d50ff167b22-serving-cert\") pod \"console-operator-9d4b6777b-mrpzx\" (UID: \"5fddaa09-016b-490d-a9ab-9d50ff167b22\") " pod="openshift-console-operator/console-operator-9d4b6777b-mrpzx" Apr 16 23:52:16.267485 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.267449 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e44cc99f-6cf4-4c00-ac07-e08e682c6db6-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-prmqz\" (UID: \"e44cc99f-6cf4-4c00-ac07-e08e682c6db6\") " pod="openshift-insights/insights-operator-585dfdc468-prmqz" Apr 16 23:52:16.267485 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.267490 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fddaa09-016b-490d-a9ab-9d50ff167b22-config\") pod \"console-operator-9d4b6777b-mrpzx\" (UID: \"5fddaa09-016b-490d-a9ab-9d50ff167b22\") " pod="openshift-console-operator/console-operator-9d4b6777b-mrpzx" Apr 16 23:52:16.267763 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.267523 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/71a84a47-d4d7-45d4-ab7c-e7e052f9d655-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gqd7n\" (UID: \"71a84a47-d4d7-45d4-ab7c-e7e052f9d655\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqd7n" Apr 16 23:52:16.267763 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.267541 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-457l9\" (UniqueName: 
\"kubernetes.io/projected/5fddaa09-016b-490d-a9ab-9d50ff167b22-kube-api-access-457l9\") pod \"console-operator-9d4b6777b-mrpzx\" (UID: \"5fddaa09-016b-490d-a9ab-9d50ff167b22\") " pod="openshift-console-operator/console-operator-9d4b6777b-mrpzx" Apr 16 23:52:16.267763 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.267556 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9zvj\" (UniqueName: \"kubernetes.io/projected/88e7efd3-590c-4e09-8fdb-d9252264d3ad-kube-api-access-b9zvj\") pod \"image-registry-7bf9785878-zjgxw\" (UID: \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\") " pod="openshift-image-registry/image-registry-7bf9785878-zjgxw" Apr 16 23:52:16.267763 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.267590 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/88e7efd3-590c-4e09-8fdb-d9252264d3ad-trusted-ca\") pod \"image-registry-7bf9785878-zjgxw\" (UID: \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\") " pod="openshift-image-registry/image-registry-7bf9785878-zjgxw" Apr 16 23:52:16.267763 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:16.267646 2562 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 23:52:16.267763 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:16.267697 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71a84a47-d4d7-45d4-ab7c-e7e052f9d655-samples-operator-tls podName:71a84a47-d4d7-45d4-ab7c-e7e052f9d655 nodeName:}" failed. No retries permitted until 2026-04-16 23:52:16.767681061 +0000 UTC m=+108.976445710 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/71a84a47-d4d7-45d4-ab7c-e7e052f9d655-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gqd7n" (UID: "71a84a47-d4d7-45d4-ab7c-e7e052f9d655") : secret "samples-operator-tls" not found Apr 16 23:52:16.267763 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.267713 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e44cc99f-6cf4-4c00-ac07-e08e682c6db6-tmp\") pod \"insights-operator-585dfdc468-prmqz\" (UID: \"e44cc99f-6cf4-4c00-ac07-e08e682c6db6\") " pod="openshift-insights/insights-operator-585dfdc468-prmqz" Apr 16 23:52:16.278984 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.278963 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9cms\" (UniqueName: \"kubernetes.io/projected/71a84a47-d4d7-45d4-ab7c-e7e052f9d655-kube-api-access-s9cms\") pod \"cluster-samples-operator-6dc5bdb6b4-gqd7n\" (UID: \"71a84a47-d4d7-45d4-ab7c-e7e052f9d655\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqd7n" Apr 16 23:52:16.337558 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.337528 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-c7mwx"] Apr 16 23:52:16.340374 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.340360 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-c7mwx" Apr 16 23:52:16.342378 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.342362 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 23:52:16.342599 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.342586 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-77gvh\"" Apr 16 23:52:16.342654 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.342606 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 23:52:16.342693 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.342606 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 23:52:16.342894 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.342879 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 23:52:16.350990 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.350974 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-c7mwx"] Apr 16 23:52:16.368340 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.368314 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzzzs\" (UniqueName: \"kubernetes.io/projected/c2e5d2fe-09a0-4110-9c80-2ae937a2a115-kube-api-access-qzzzs\") pod \"kube-storage-version-migrator-operator-6769c5d45-c7mwx\" (UID: \"c2e5d2fe-09a0-4110-9c80-2ae937a2a115\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-c7mwx" Apr 16 23:52:16.368435 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.368346 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/88e7efd3-590c-4e09-8fdb-d9252264d3ad-registry-tls\") pod \"image-registry-7bf9785878-zjgxw\" (UID: \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\") " pod="openshift-image-registry/image-registry-7bf9785878-zjgxw" Apr 16 23:52:16.368435 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.368362 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2j6lg\" (UniqueName: \"kubernetes.io/projected/e44cc99f-6cf4-4c00-ac07-e08e682c6db6-kube-api-access-2j6lg\") pod \"insights-operator-585dfdc468-prmqz\" (UID: \"e44cc99f-6cf4-4c00-ac07-e08e682c6db6\") " pod="openshift-insights/insights-operator-585dfdc468-prmqz" Apr 16 23:52:16.368435 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.368382 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5fddaa09-016b-490d-a9ab-9d50ff167b22-serving-cert\") pod \"console-operator-9d4b6777b-mrpzx\" (UID: \"5fddaa09-016b-490d-a9ab-9d50ff167b22\") " pod="openshift-console-operator/console-operator-9d4b6777b-mrpzx" Apr 16 23:52:16.368435 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.368399 2562 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e44cc99f-6cf4-4c00-ac07-e08e682c6db6-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-prmqz\" (UID: \"e44cc99f-6cf4-4c00-ac07-e08e682c6db6\") " pod="openshift-insights/insights-operator-585dfdc468-prmqz" Apr 16 23:52:16.368610 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.368438 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fddaa09-016b-490d-a9ab-9d50ff167b22-config\") pod \"console-operator-9d4b6777b-mrpzx\" (UID: \"5fddaa09-016b-490d-a9ab-9d50ff167b22\") " pod="openshift-console-operator/console-operator-9d4b6777b-mrpzx" Apr 16 23:52:16.368610 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:16.368481 2562 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 23:52:16.368610 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.368492 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-457l9\" (UniqueName: \"kubernetes.io/projected/5fddaa09-016b-490d-a9ab-9d50ff167b22-kube-api-access-457l9\") pod \"console-operator-9d4b6777b-mrpzx\" (UID: \"5fddaa09-016b-490d-a9ab-9d50ff167b22\") " pod="openshift-console-operator/console-operator-9d4b6777b-mrpzx" Apr 16 23:52:16.368610 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:16.368500 2562 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bf9785878-zjgxw: secret "image-registry-tls" not found Apr 16 23:52:16.368610 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:16.368568 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/88e7efd3-590c-4e09-8fdb-d9252264d3ad-registry-tls podName:88e7efd3-590c-4e09-8fdb-d9252264d3ad nodeName:}" failed. No retries permitted until 2026-04-16 23:52:16.868550867 +0000 UTC m=+109.077315515 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/88e7efd3-590c-4e09-8fdb-d9252264d3ad-registry-tls") pod "image-registry-7bf9785878-zjgxw" (UID: "88e7efd3-590c-4e09-8fdb-d9252264d3ad") : secret "image-registry-tls" not found Apr 16 23:52:16.368839 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.368649 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b9zvj\" (UniqueName: \"kubernetes.io/projected/88e7efd3-590c-4e09-8fdb-d9252264d3ad-kube-api-access-b9zvj\") pod \"image-registry-7bf9785878-zjgxw\" (UID: \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\") " pod="openshift-image-registry/image-registry-7bf9785878-zjgxw" Apr 16 23:52:16.368839 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.368703 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/88e7efd3-590c-4e09-8fdb-d9252264d3ad-trusted-ca\") pod \"image-registry-7bf9785878-zjgxw\" (UID: \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\") " pod="openshift-image-registry/image-registry-7bf9785878-zjgxw" Apr 16 23:52:16.368839 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.368758 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e44cc99f-6cf4-4c00-ac07-e08e682c6db6-tmp\") pod \"insights-operator-585dfdc468-prmqz\" (UID: \"e44cc99f-6cf4-4c00-ac07-e08e682c6db6\") " pod="openshift-insights/insights-operator-585dfdc468-prmqz" Apr 16 23:52:16.368839 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.368786 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5fddaa09-016b-490d-a9ab-9d50ff167b22-trusted-ca\") pod \"console-operator-9d4b6777b-mrpzx\" (UID: \"5fddaa09-016b-490d-a9ab-9d50ff167b22\") " pod="openshift-console-operator/console-operator-9d4b6777b-mrpzx" Apr 16 23:52:16.369035 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.368915 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/88e7efd3-590c-4e09-8fdb-d9252264d3ad-ca-trust-extracted\") pod \"image-registry-7bf9785878-zjgxw\" (UID: \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\") " pod="openshift-image-registry/image-registry-7bf9785878-zjgxw" Apr 16 23:52:16.369035 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.368952 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/88e7efd3-590c-4e09-8fdb-d9252264d3ad-bound-sa-token\") pod \"image-registry-7bf9785878-zjgxw\" (UID: \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\") " pod="openshift-image-registry/image-registry-7bf9785878-zjgxw" Apr 16 23:52:16.369035 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.368981 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/e44cc99f-6cf4-4c00-ac07-e08e682c6db6-snapshots\") pod \"insights-operator-585dfdc468-prmqz\" (UID: \"e44cc99f-6cf4-4c00-ac07-e08e682c6db6\") " pod="openshift-insights/insights-operator-585dfdc468-prmqz" Apr 16 23:52:16.369035 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.369007 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e44cc99f-6cf4-4c00-ac07-e08e682c6db6-service-ca-bundle\") pod 
\"insights-operator-585dfdc468-prmqz\" (UID: \"e44cc99f-6cf4-4c00-ac07-e08e682c6db6\") " pod="openshift-insights/insights-operator-585dfdc468-prmqz" Apr 16 23:52:16.369254 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.369038 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/88e7efd3-590c-4e09-8fdb-d9252264d3ad-registry-certificates\") pod \"image-registry-7bf9785878-zjgxw\" (UID: \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\") " pod="openshift-image-registry/image-registry-7bf9785878-zjgxw" Apr 16 23:52:16.369254 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.369070 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2e5d2fe-09a0-4110-9c80-2ae937a2a115-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-c7mwx\" (UID: \"c2e5d2fe-09a0-4110-9c80-2ae937a2a115\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-c7mwx" Apr 16 23:52:16.369254 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.369096 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2e5d2fe-09a0-4110-9c80-2ae937a2a115-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-c7mwx\" (UID: \"c2e5d2fe-09a0-4110-9c80-2ae937a2a115\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-c7mwx" Apr 16 23:52:16.369254 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.369143 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/88e7efd3-590c-4e09-8fdb-d9252264d3ad-image-registry-private-configuration\") pod \"image-registry-7bf9785878-zjgxw\" (UID: \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\") " pod="openshift-image-registry/image-registry-7bf9785878-zjgxw" Apr 16 23:52:16.369254 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.369172 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/88e7efd3-590c-4e09-8fdb-d9252264d3ad-installation-pull-secrets\") pod \"image-registry-7bf9785878-zjgxw\" (UID: \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\") " pod="openshift-image-registry/image-registry-7bf9785878-zjgxw" Apr 16 23:52:16.369254 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.369225 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e44cc99f-6cf4-4c00-ac07-e08e682c6db6-serving-cert\") pod \"insights-operator-585dfdc468-prmqz\" (UID: \"e44cc99f-6cf4-4c00-ac07-e08e682c6db6\") " pod="openshift-insights/insights-operator-585dfdc468-prmqz" Apr 16 23:52:16.369590 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.369554 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e44cc99f-6cf4-4c00-ac07-e08e682c6db6-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-prmqz\" (UID: \"e44cc99f-6cf4-4c00-ac07-e08e682c6db6\") " pod="openshift-insights/insights-operator-585dfdc468-prmqz" Apr 16 23:52:16.369676 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.369654 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/88e7efd3-590c-4e09-8fdb-d9252264d3ad-trusted-ca\") pod \"image-registry-7bf9785878-zjgxw\" (UID: \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\") " pod="openshift-image-registry/image-registry-7bf9785878-zjgxw" Apr 16 23:52:16.369811 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.369657 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/88e7efd3-590c-4e09-8fdb-d9252264d3ad-ca-trust-extracted\") pod \"image-registry-7bf9785878-zjgxw\" (UID: \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\") " pod="openshift-image-registry/image-registry-7bf9785878-zjgxw" Apr 16 23:52:16.369913 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.369142 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e44cc99f-6cf4-4c00-ac07-e08e682c6db6-tmp\") pod \"insights-operator-585dfdc468-prmqz\" (UID: \"e44cc99f-6cf4-4c00-ac07-e08e682c6db6\") " pod="openshift-insights/insights-operator-585dfdc468-prmqz" Apr 16 23:52:16.369999 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.369669 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5fddaa09-016b-490d-a9ab-9d50ff167b22-trusted-ca\") pod \"console-operator-9d4b6777b-mrpzx\" (UID: \"5fddaa09-016b-490d-a9ab-9d50ff167b22\") " pod="openshift-console-operator/console-operator-9d4b6777b-mrpzx" Apr 16 23:52:16.370080 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.370056 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e44cc99f-6cf4-4c00-ac07-e08e682c6db6-service-ca-bundle\") pod \"insights-operator-585dfdc468-prmqz\" (UID: \"e44cc99f-6cf4-4c00-ac07-e08e682c6db6\") " pod="openshift-insights/insights-operator-585dfdc468-prmqz" Apr 16 23:52:16.370236 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.370045 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/e44cc99f-6cf4-4c00-ac07-e08e682c6db6-snapshots\") pod \"insights-operator-585dfdc468-prmqz\" (UID: \"e44cc99f-6cf4-4c00-ac07-e08e682c6db6\") " pod="openshift-insights/insights-operator-585dfdc468-prmqz" Apr 16 23:52:16.370236 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.370082 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/88e7efd3-590c-4e09-8fdb-d9252264d3ad-registry-certificates\") pod \"image-registry-7bf9785878-zjgxw\" (UID: \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\") " pod="openshift-image-registry/image-registry-7bf9785878-zjgxw" Apr 16 23:52:16.370407 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.370286 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fddaa09-016b-490d-a9ab-9d50ff167b22-config\") pod \"console-operator-9d4b6777b-mrpzx\" (UID: \"5fddaa09-016b-490d-a9ab-9d50ff167b22\") " pod="openshift-console-operator/console-operator-9d4b6777b-mrpzx" Apr 16 23:52:16.371176 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.371149 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5fddaa09-016b-490d-a9ab-9d50ff167b22-serving-cert\") pod \"console-operator-9d4b6777b-mrpzx\" (UID: \"5fddaa09-016b-490d-a9ab-9d50ff167b22\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-mrpzx" Apr 16 23:52:16.371679 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.371657 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e44cc99f-6cf4-4c00-ac07-e08e682c6db6-serving-cert\") pod \"insights-operator-585dfdc468-prmqz\" (UID: \"e44cc99f-6cf4-4c00-ac07-e08e682c6db6\") " pod="openshift-insights/insights-operator-585dfdc468-prmqz" Apr 16 23:52:16.371841 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.371825 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/88e7efd3-590c-4e09-8fdb-d9252264d3ad-image-registry-private-configuration\") pod \"image-registry-7bf9785878-zjgxw\" (UID: \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\") " pod="openshift-image-registry/image-registry-7bf9785878-zjgxw" Apr 16 23:52:16.371968 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.371954 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/88e7efd3-590c-4e09-8fdb-d9252264d3ad-installation-pull-secrets\") pod \"image-registry-7bf9785878-zjgxw\" (UID: \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\") " pod="openshift-image-registry/image-registry-7bf9785878-zjgxw" Apr 16 23:52:16.376952 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.376929 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j6lg\" (UniqueName: \"kubernetes.io/projected/e44cc99f-6cf4-4c00-ac07-e08e682c6db6-kube-api-access-2j6lg\") pod \"insights-operator-585dfdc468-prmqz\" (UID: \"e44cc99f-6cf4-4c00-ac07-e08e682c6db6\") " pod="openshift-insights/insights-operator-585dfdc468-prmqz" Apr 16 23:52:16.377607 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.377580 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9zvj\" (UniqueName: \"kubernetes.io/projected/88e7efd3-590c-4e09-8fdb-d9252264d3ad-kube-api-access-b9zvj\") pod \"image-registry-7bf9785878-zjgxw\" (UID: \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\") " pod="openshift-image-registry/image-registry-7bf9785878-zjgxw" Apr 16 23:52:16.378134 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.378111 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-457l9\" (UniqueName: \"kubernetes.io/projected/5fddaa09-016b-490d-a9ab-9d50ff167b22-kube-api-access-457l9\") pod \"console-operator-9d4b6777b-mrpzx\" (UID: \"5fddaa09-016b-490d-a9ab-9d50ff167b22\") " pod="openshift-console-operator/console-operator-9d4b6777b-mrpzx" Apr 16 23:52:16.378329 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.378308 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/88e7efd3-590c-4e09-8fdb-d9252264d3ad-bound-sa-token\") pod \"image-registry-7bf9785878-zjgxw\" (UID: \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\") " pod="openshift-image-registry/image-registry-7bf9785878-zjgxw" Apr 16 23:52:16.469604 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.469583 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qzzzs\" (UniqueName: \"kubernetes.io/projected/c2e5d2fe-09a0-4110-9c80-2ae937a2a115-kube-api-access-qzzzs\") pod \"kube-storage-version-migrator-operator-6769c5d45-c7mwx\" (UID: \"c2e5d2fe-09a0-4110-9c80-2ae937a2a115\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-c7mwx" Apr 16 23:52:16.469689 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.469676 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2e5d2fe-09a0-4110-9c80-2ae937a2a115-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-c7mwx\" (UID: \"c2e5d2fe-09a0-4110-9c80-2ae937a2a115\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-c7mwx" Apr 16 23:52:16.469727 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.469701 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2e5d2fe-09a0-4110-9c80-2ae937a2a115-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-c7mwx\" (UID: \"c2e5d2fe-09a0-4110-9c80-2ae937a2a115\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-c7mwx" Apr 16 23:52:16.470209 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.470175 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2e5d2fe-09a0-4110-9c80-2ae937a2a115-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-c7mwx\" (UID: \"c2e5d2fe-09a0-4110-9c80-2ae937a2a115\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-c7mwx" Apr 16 23:52:16.471552 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.471531 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2e5d2fe-09a0-4110-9c80-2ae937a2a115-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-c7mwx\" (UID: \"c2e5d2fe-09a0-4110-9c80-2ae937a2a115\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-c7mwx" Apr 16 23:52:16.475851 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.475830 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzzzs\" (UniqueName: \"kubernetes.io/projected/c2e5d2fe-09a0-4110-9c80-2ae937a2a115-kube-api-access-qzzzs\") pod \"kube-storage-version-migrator-operator-6769c5d45-c7mwx\" (UID: \"c2e5d2fe-09a0-4110-9c80-2ae937a2a115\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-c7mwx" Apr 16 23:52:16.553430 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.553411 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-prmqz" Apr 16 23:52:16.560946 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.560930 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-mrpzx" Apr 16 23:52:16.649482 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.649455 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-c7mwx" Apr 16 23:52:16.670581 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.670558 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-prmqz"] Apr 16 23:52:16.673789 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:52:16.673763 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode44cc99f_6cf4_4c00_ac07_e08e682c6db6.slice/crio-13571cbb2b3c98132beef55f6c5486f9c39beb87c9feffb383164464dac44739 WatchSource:0}: Error finding container 13571cbb2b3c98132beef55f6c5486f9c39beb87c9feffb383164464dac44739: Status 404 returned error can't find the container with id 13571cbb2b3c98132beef55f6c5486f9c39beb87c9feffb383164464dac44739 Apr 16 23:52:16.681820 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.681796 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-mrpzx"] Apr 16 23:52:16.685842 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:52:16.685806 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fddaa09_016b_490d_a9ab_9d50ff167b22.slice/crio-b539a5fea598d96eab75247a59bc96e6673065a8d1c413a6db585e0dfb2ab455 WatchSource:0}: Error finding container b539a5fea598d96eab75247a59bc96e6673065a8d1c413a6db585e0dfb2ab455: Status 404 returned error can't find the container with id b539a5fea598d96eab75247a59bc96e6673065a8d1c413a6db585e0dfb2ab455 Apr 16 23:52:16.704928 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.704894 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-prmqz" event={"ID":"e44cc99f-6cf4-4c00-ac07-e08e682c6db6","Type":"ContainerStarted","Data":"13571cbb2b3c98132beef55f6c5486f9c39beb87c9feffb383164464dac44739"} Apr 16 23:52:16.707537 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.707497 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-mrpzx" event={"ID":"5fddaa09-016b-490d-a9ab-9d50ff167b22","Type":"ContainerStarted","Data":"b539a5fea598d96eab75247a59bc96e6673065a8d1c413a6db585e0dfb2ab455"} Apr 16 23:52:16.761032 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.761000 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-c7mwx"] Apr 16 23:52:16.765178 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:52:16.765149 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2e5d2fe_09a0_4110_9c80_2ae937a2a115.slice/crio-d2ff6050f2f0cf2c219016df334e1871e69856bd3755130d34de01f87f46d6a7 WatchSource:0}: Error finding container d2ff6050f2f0cf2c219016df334e1871e69856bd3755130d34de01f87f46d6a7: Status 404 returned error can't find the container with id d2ff6050f2f0cf2c219016df334e1871e69856bd3755130d34de01f87f46d6a7 Apr 16 23:52:16.772022 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.771999 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/71a84a47-d4d7-45d4-ab7c-e7e052f9d655-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gqd7n\" (UID: \"71a84a47-d4d7-45d4-ab7c-e7e052f9d655\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqd7n" Apr 16 23:52:16.772112 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:16.772094 2562 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 23:52:16.772168 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:16.772139 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71a84a47-d4d7-45d4-ab7c-e7e052f9d655-samples-operator-tls podName:71a84a47-d4d7-45d4-ab7c-e7e052f9d655 nodeName:}" failed. No retries permitted until 2026-04-16 23:52:17.772126732 +0000 UTC m=+109.980891380 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/71a84a47-d4d7-45d4-ab7c-e7e052f9d655-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gqd7n" (UID: "71a84a47-d4d7-45d4-ab7c-e7e052f9d655") : secret "samples-operator-tls" not found Apr 16 23:52:16.872964 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:16.872936 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/88e7efd3-590c-4e09-8fdb-d9252264d3ad-registry-tls\") pod \"image-registry-7bf9785878-zjgxw\" (UID: \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\") " pod="openshift-image-registry/image-registry-7bf9785878-zjgxw" Apr 16 23:52:16.873087 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:16.873070 2562 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 23:52:16.873127 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:16.873089 2562 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bf9785878-zjgxw: secret "image-registry-tls" not found Apr 16 23:52:16.873158 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:16.873131 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/88e7efd3-590c-4e09-8fdb-d9252264d3ad-registry-tls podName:88e7efd3-590c-4e09-8fdb-d9252264d3ad nodeName:}" failed. No retries permitted until 2026-04-16 23:52:17.873117946 +0000 UTC m=+110.081882598 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/88e7efd3-590c-4e09-8fdb-d9252264d3ad-registry-tls") pod "image-registry-7bf9785878-zjgxw" (UID: "88e7efd3-590c-4e09-8fdb-d9252264d3ad") : secret "image-registry-tls" not found Apr 16 23:52:17.711071 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:17.711011 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-c7mwx" event={"ID":"c2e5d2fe-09a0-4110-9c80-2ae937a2a115","Type":"ContainerStarted","Data":"d2ff6050f2f0cf2c219016df334e1871e69856bd3755130d34de01f87f46d6a7"} Apr 16 23:52:17.779202 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:17.779129 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/71a84a47-d4d7-45d4-ab7c-e7e052f9d655-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gqd7n\" (UID: \"71a84a47-d4d7-45d4-ab7c-e7e052f9d655\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqd7n" Apr 16 23:52:17.779373 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:17.779353 2562 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 23:52:17.779439 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:17.779419 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71a84a47-d4d7-45d4-ab7c-e7e052f9d655-samples-operator-tls podName:71a84a47-d4d7-45d4-ab7c-e7e052f9d655 nodeName:}" failed. No retries permitted until 2026-04-16 23:52:19.779397426 +0000 UTC m=+111.988162087 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/71a84a47-d4d7-45d4-ab7c-e7e052f9d655-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gqd7n" (UID: "71a84a47-d4d7-45d4-ab7c-e7e052f9d655") : secret "samples-operator-tls" not found Apr 16 23:52:17.880163 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:17.880038 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/88e7efd3-590c-4e09-8fdb-d9252264d3ad-registry-tls\") pod \"image-registry-7bf9785878-zjgxw\" (UID: \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\") " pod="openshift-image-registry/image-registry-7bf9785878-zjgxw" Apr 16 23:52:17.880352 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:17.880185 2562 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 23:52:17.880352 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:17.880217 2562 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bf9785878-zjgxw: secret "image-registry-tls" not found Apr 16 23:52:17.880352 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:17.880278 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/88e7efd3-590c-4e09-8fdb-d9252264d3ad-registry-tls podName:88e7efd3-590c-4e09-8fdb-d9252264d3ad nodeName:}" failed. No retries permitted until 2026-04-16 23:52:19.880260042 +0000 UTC m=+112.089024709 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/88e7efd3-590c-4e09-8fdb-d9252264d3ad-registry-tls") pod "image-registry-7bf9785878-zjgxw" (UID: "88e7efd3-590c-4e09-8fdb-d9252264d3ad") : secret "image-registry-tls" not found Apr 16 23:52:19.716118 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:19.716088 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-prmqz" event={"ID":"e44cc99f-6cf4-4c00-ac07-e08e682c6db6","Type":"ContainerStarted","Data":"0e9d29d995a8760eedd390ea19d4785cbbe2d54e7e6839e9f43174cc11ce2dc4"} Apr 16 23:52:19.717616 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:19.717597 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrpzx_5fddaa09-016b-490d-a9ab-9d50ff167b22/console-operator/0.log" Apr 16 23:52:19.717701 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:19.717635 2562 generic.go:358] "Generic (PLEG): container finished" podID="5fddaa09-016b-490d-a9ab-9d50ff167b22" containerID="42a6a34cdf283a8fb2d8595ab4d92fce54e0850e807879fd3dc6f96773d97c58" exitCode=255 Apr 16 23:52:19.717701 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:19.717665 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-mrpzx" event={"ID":"5fddaa09-016b-490d-a9ab-9d50ff167b22","Type":"ContainerDied","Data":"42a6a34cdf283a8fb2d8595ab4d92fce54e0850e807879fd3dc6f96773d97c58"} Apr 16 23:52:19.717893 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:19.717878 2562 scope.go:117] "RemoveContainer" containerID="42a6a34cdf283a8fb2d8595ab4d92fce54e0850e807879fd3dc6f96773d97c58" Apr 16 23:52:19.719092 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:19.719071 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-c7mwx" event={"ID":"c2e5d2fe-09a0-4110-9c80-2ae937a2a115","Type":"ContainerStarted","Data":"348a43c729ccd19a91cff613350f4324c3b647629677cb253d20ecca50dba10a"} Apr 16 23:52:19.730481 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:19.730445 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-prmqz" podStartSLOduration=0.869985673 podStartE2EDuration="3.730433918s" podCreationTimestamp="2026-04-16 23:52:16 +0000 UTC" firstStartedPulling="2026-04-16 23:52:16.675570025 +0000 UTC m=+108.884334673" lastFinishedPulling="2026-04-16 23:52:19.536018265 +0000 UTC m=+111.744782918" observedRunningTime="2026-04-16 23:52:19.729367956 +0000 UTC m=+111.938132628" watchObservedRunningTime="2026-04-16 23:52:19.730433918 +0000 UTC m=+111.939198588" Apr 16 23:52:19.741967 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:19.741933 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-c7mwx" podStartSLOduration=0.968135557 podStartE2EDuration="3.741922942s" podCreationTimestamp="2026-04-16 23:52:16 +0000 UTC" firstStartedPulling="2026-04-16 23:52:16.767055643 +0000 UTC m=+108.975820291" lastFinishedPulling="2026-04-16 23:52:19.540843013 +0000 UTC m=+111.749607676" observedRunningTime="2026-04-16 23:52:19.741494392 +0000 UTC m=+111.950259060" watchObservedRunningTime="2026-04-16 23:52:19.741922942 +0000 UTC m=+111.950687612" Apr 16 23:52:19.800607 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:19.800580 2562 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/71a84a47-d4d7-45d4-ab7c-e7e052f9d655-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gqd7n\" (UID: \"71a84a47-d4d7-45d4-ab7c-e7e052f9d655\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqd7n" Apr 16 23:52:19.800926 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:19.800905 2562 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 23:52:19.801016 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:19.800973 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71a84a47-d4d7-45d4-ab7c-e7e052f9d655-samples-operator-tls podName:71a84a47-d4d7-45d4-ab7c-e7e052f9d655 nodeName:}" failed. No retries permitted until 2026-04-16 23:52:23.800953843 +0000 UTC m=+116.009718505 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/71a84a47-d4d7-45d4-ab7c-e7e052f9d655-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gqd7n" (UID: "71a84a47-d4d7-45d4-ab7c-e7e052f9d655") : secret "samples-operator-tls" not found Apr 16 23:52:19.901816 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:19.901783 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/88e7efd3-590c-4e09-8fdb-d9252264d3ad-registry-tls\") pod \"image-registry-7bf9785878-zjgxw\" (UID: \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\") " pod="openshift-image-registry/image-registry-7bf9785878-zjgxw" Apr 16 23:52:19.901923 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:19.901873 2562 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 23:52:19.901923 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:19.901883 2562 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bf9785878-zjgxw: secret "image-registry-tls" not found Apr 16 23:52:19.902030 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:19.901925 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/88e7efd3-590c-4e09-8fdb-d9252264d3ad-registry-tls podName:88e7efd3-590c-4e09-8fdb-d9252264d3ad nodeName:}" failed. No retries permitted until 2026-04-16 23:52:23.901912762 +0000 UTC m=+116.110677410 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/88e7efd3-590c-4e09-8fdb-d9252264d3ad-registry-tls") pod "image-registry-7bf9785878-zjgxw" (UID: "88e7efd3-590c-4e09-8fdb-d9252264d3ad") : secret "image-registry-tls" not found Apr 16 23:52:20.717802 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:20.717771 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-v8hbv"] Apr 16 23:52:20.720848 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:20.720822 2562 util.go:30] "No sandbox for pod can be found. 
Apr 16 23:52:20.717802 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:20.717771 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-v8hbv"]
Apr 16 23:52:20.720848 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:20.720822 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-v8hbv"
Apr 16 23:52:20.722578 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:20.722562 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrpzx_5fddaa09-016b-490d-a9ab-9d50ff167b22/console-operator/1.log"
Apr 16 23:52:20.722723 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:20.722699 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-jgznm\""
Apr 16 23:52:20.722962 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:20.722948 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrpzx_5fddaa09-016b-490d-a9ab-9d50ff167b22/console-operator/0.log"
Apr 16 23:52:20.723026 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:20.722980 2562 generic.go:358] "Generic (PLEG): container finished" podID="5fddaa09-016b-490d-a9ab-9d50ff167b22" containerID="ec550585377c5366c2d5b78bffcf5ffbee32f0c3d536db62d60d47ca966b03b7" exitCode=255
Apr 16 23:52:20.723026 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:20.723008 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-mrpzx" event={"ID":"5fddaa09-016b-490d-a9ab-9d50ff167b22","Type":"ContainerDied","Data":"ec550585377c5366c2d5b78bffcf5ffbee32f0c3d536db62d60d47ca966b03b7"}
Apr 16 23:52:20.723121 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:20.723053 2562 scope.go:117] "RemoveContainer" containerID="42a6a34cdf283a8fb2d8595ab4d92fce54e0850e807879fd3dc6f96773d97c58"
Apr 16 23:52:20.723370 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:20.723351 2562 scope.go:117] "RemoveContainer" containerID="ec550585377c5366c2d5b78bffcf5ffbee32f0c3d536db62d60d47ca966b03b7"
Apr 16 23:52:20.723455 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:20.723380 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 16 23:52:20.723572 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:20.723555 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 16 23:52:20.723572 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:20.723563 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-mrpzx_openshift-console-operator(5fddaa09-016b-490d-a9ab-9d50ff167b22)\"" pod="openshift-console-operator/console-operator-9d4b6777b-mrpzx" podUID="5fddaa09-016b-490d-a9ab-9d50ff167b22"
Apr 16 23:52:20.727395 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:20.727377 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-v8hbv"]
Apr 16 23:52:20.807411 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:20.807380 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r58r7\" (UniqueName: \"kubernetes.io/projected/26f2616f-10db-4f0c-8b82-86c22caa6d59-kube-api-access-r58r7\") pod \"migrator-74bb7799d9-v8hbv\" (UID: \"26f2616f-10db-4f0c-8b82-86c22caa6d59\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-v8hbv"
Apr 16 23:52:20.908946 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:20.908909 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r58r7\" (UniqueName: \"kubernetes.io/projected/26f2616f-10db-4f0c-8b82-86c22caa6d59-kube-api-access-r58r7\") pod \"migrator-74bb7799d9-v8hbv\" (UID: \"26f2616f-10db-4f0c-8b82-86c22caa6d59\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-v8hbv"
Apr 16 23:52:20.915477 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:20.915458 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r58r7\" (UniqueName: \"kubernetes.io/projected/26f2616f-10db-4f0c-8b82-86c22caa6d59-kube-api-access-r58r7\") pod \"migrator-74bb7799d9-v8hbv\" (UID: \"26f2616f-10db-4f0c-8b82-86c22caa6d59\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-v8hbv"
Apr 16 23:52:21.030974 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:21.030898 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-v8hbv"
Apr 16 23:52:21.139813 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:21.139785 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-v8hbv"]
Apr 16 23:52:21.143354 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:52:21.143320 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26f2616f_10db_4f0c_8b82_86c22caa6d59.slice/crio-f6830988ec4214156b466483b1f4b778477288d12b17bc75d1d4b8f83440a363 WatchSource:0}: Error finding container f6830988ec4214156b466483b1f4b778477288d12b17bc75d1d4b8f83440a363: Status 404 returned error can't find the container with id f6830988ec4214156b466483b1f4b778477288d12b17bc75d1d4b8f83440a363
Apr 16 23:52:21.726353 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:21.726327 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrpzx_5fddaa09-016b-490d-a9ab-9d50ff167b22/console-operator/1.log"
Apr 16 23:52:21.726796 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:21.726697 2562 scope.go:117] "RemoveContainer" containerID="ec550585377c5366c2d5b78bffcf5ffbee32f0c3d536db62d60d47ca966b03b7"
Apr 16 23:52:21.726934 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:21.726906 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-mrpzx_openshift-console-operator(5fddaa09-016b-490d-a9ab-9d50ff167b22)\"" pod="openshift-console-operator/console-operator-9d4b6777b-mrpzx" podUID="5fddaa09-016b-490d-a9ab-9d50ff167b22"
Apr 16 23:52:21.727575 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:21.727548 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-v8hbv" event={"ID":"26f2616f-10db-4f0c-8b82-86c22caa6d59","Type":"ContainerStarted","Data":"f6830988ec4214156b466483b1f4b778477288d12b17bc75d1d4b8f83440a363"}
Apr 16 23:52:22.551804 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:22.551741 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mhvhh_e55c5371-047f-4464-8977-501dfa689dd1/dns-node-resolver/0.log"
Apr 16 23:52:22.731663 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:22.731629 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-v8hbv" event={"ID":"26f2616f-10db-4f0c-8b82-86c22caa6d59","Type":"ContainerStarted","Data":"c6c7c298ba2f437c96d7816533142c33496d3267c5e2aaea7e1839cbd0c12830"}
Apr 16 23:52:22.731663 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:22.731665 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-v8hbv" event={"ID":"26f2616f-10db-4f0c-8b82-86c22caa6d59","Type":"ContainerStarted","Data":"50d6bca55508d45387a4f007cc4fe234dc1a238165ae09c91782c0b23cc3fdd2"}
Apr 16 23:52:22.745761 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:22.745716 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-v8hbv" podStartSLOduration=1.6373166860000001 podStartE2EDuration="2.745702485s" podCreationTimestamp="2026-04-16 23:52:20 +0000 UTC" firstStartedPulling="2026-04-16 23:52:21.14537897 +0000 UTC m=+113.354143619" lastFinishedPulling="2026-04-16 23:52:22.25376476 +0000 UTC m=+114.462529418" observedRunningTime="2026-04-16 23:52:22.74487651 +0000 UTC m=+114.953641180" watchObservedRunningTime="2026-04-16 23:52:22.745702485 +0000 UTC m=+114.954467154"
Apr 16 23:52:23.751591 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:23.751571 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-hb7s9_ac3d5de2-ce49-4327-89f8-f4642c3bc2f3/node-ca/0.log"
Apr 16 23:52:23.835461 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:23.835428 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/71a84a47-d4d7-45d4-ab7c-e7e052f9d655-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gqd7n\" (UID: \"71a84a47-d4d7-45d4-ab7c-e7e052f9d655\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqd7n"
Apr 16 23:52:23.835605 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:23.835528 2562 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 23:52:23.835605 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:23.835595 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71a84a47-d4d7-45d4-ab7c-e7e052f9d655-samples-operator-tls podName:71a84a47-d4d7-45d4-ab7c-e7e052f9d655 nodeName:}" failed. No retries permitted until 2026-04-16 23:52:31.835576106 +0000 UTC m=+124.044340754 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/71a84a47-d4d7-45d4-ab7c-e7e052f9d655-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gqd7n" (UID: "71a84a47-d4d7-45d4-ab7c-e7e052f9d655") : secret "samples-operator-tls" not found
Apr 16 23:52:23.935941 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:23.935915 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/88e7efd3-590c-4e09-8fdb-d9252264d3ad-registry-tls\") pod \"image-registry-7bf9785878-zjgxw\" (UID: \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\") " pod="openshift-image-registry/image-registry-7bf9785878-zjgxw"
Apr 16 23:52:23.936036 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:23.936003 2562 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 23:52:23.936036 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:23.936014 2562 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bf9785878-zjgxw: secret "image-registry-tls" not found
Apr 16 23:52:23.936099 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:23.936054 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/88e7efd3-590c-4e09-8fdb-d9252264d3ad-registry-tls podName:88e7efd3-590c-4e09-8fdb-d9252264d3ad nodeName:}" failed. No retries permitted until 2026-04-16 23:52:31.936042362 +0000 UTC m=+124.144807009 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/88e7efd3-590c-4e09-8fdb-d9252264d3ad-registry-tls") pod "image-registry-7bf9785878-zjgxw" (UID: "88e7efd3-590c-4e09-8fdb-d9252264d3ad") : secret "image-registry-tls" not found
Apr 16 23:52:26.561600 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:26.561564 2562 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-mrpzx"
Apr 16 23:52:26.561600 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:26.561605 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-mrpzx"
Apr 16 23:52:26.562000 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:26.561951 2562 scope.go:117] "RemoveContainer" containerID="ec550585377c5366c2d5b78bffcf5ffbee32f0c3d536db62d60d47ca966b03b7"
Apr 16 23:52:26.562138 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:26.562120 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-mrpzx_openshift-console-operator(5fddaa09-016b-490d-a9ab-9d50ff167b22)\"" pod="openshift-console-operator/console-operator-9d4b6777b-mrpzx" podUID="5fddaa09-016b-490d-a9ab-9d50ff167b22"
Apr 16 23:52:31.893390 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:31.893360 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/71a84a47-d4d7-45d4-ab7c-e7e052f9d655-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gqd7n\" (UID: \"71a84a47-d4d7-45d4-ab7c-e7e052f9d655\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqd7n"
Apr 16 23:52:31.895648 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:31.895628 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/71a84a47-d4d7-45d4-ab7c-e7e052f9d655-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gqd7n\" (UID: \"71a84a47-d4d7-45d4-ab7c-e7e052f9d655\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqd7n"
Apr 16 23:52:31.994410 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:31.994385 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/88e7efd3-590c-4e09-8fdb-d9252264d3ad-registry-tls\") pod \"image-registry-7bf9785878-zjgxw\" (UID: \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\") " pod="openshift-image-registry/image-registry-7bf9785878-zjgxw"
Apr 16 23:52:31.996428 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:31.996409 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/88e7efd3-590c-4e09-8fdb-d9252264d3ad-registry-tls\") pod \"image-registry-7bf9785878-zjgxw\" (UID: \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\") " pod="openshift-image-registry/image-registry-7bf9785878-zjgxw"
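Both mounts succeed at 23:52:31, once the operators that publish samples-operator-tls and image-registry-tls have created them; the earlier failures were ordering noise during bring-up rather than a persistent fault. To confirm such a secret from a client, a minimal client-go sketch (namespace and secret name taken from the entries above; the kubeconfig path is an assumption):

    // checksecret.go: report whether the secret the kubelet was waiting on exists.
    package main

    import (
        "context"
        "fmt"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
        if err != nil {
            panic(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }
        _, err = cs.CoreV1().Secrets("openshift-cluster-samples-operator").
            Get(context.TODO(), "samples-operator-tls", metav1.GetOptions{})
        // err is non-nil (NotFound) until the operator creates the secret.
        fmt.Println("samples-operator-tls:", err)
    }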
Apr 16 23:52:32.048627 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:32.048605 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-mwjqw\""
Apr 16 23:52:32.057102 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:32.057083 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqd7n"
Apr 16 23:52:32.166907 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:32.166835 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqd7n"]
Apr 16 23:52:32.168602 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:32.168583 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-nf568\""
Apr 16 23:52:32.176788 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:32.176763 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7bf9785878-zjgxw"
Apr 16 23:52:32.290852 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:32.290824 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7bf9785878-zjgxw"]
Apr 16 23:52:32.293479 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:52:32.293454 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88e7efd3_590c_4e09_8fdb_d9252264d3ad.slice/crio-9ec215506c5acb81f8cd4b2875f53057f42f3f7e0667ffa54dce1e061dddb40e WatchSource:0}: Error finding container 9ec215506c5acb81f8cd4b2875f53057f42f3f7e0667ffa54dce1e061dddb40e: Status 404 returned error can't find the container with id 9ec215506c5acb81f8cd4b2875f53057f42f3f7e0667ffa54dce1e061dddb40e
Apr 16 23:52:32.755337 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:32.755293 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqd7n" event={"ID":"71a84a47-d4d7-45d4-ab7c-e7e052f9d655","Type":"ContainerStarted","Data":"03d0fbd46fb9c82a6e60ada89258fcd0f9bb2d2069a4c68ceddc7a379049f4c9"}
Apr 16 23:52:32.756452 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:32.756426 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7bf9785878-zjgxw" event={"ID":"88e7efd3-590c-4e09-8fdb-d9252264d3ad","Type":"ContainerStarted","Data":"65660836d535580a49ddf3c3b68508d9d5573604c6d9f765666afd86d7687380"}
Apr 16 23:52:32.756452 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:32.756456 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7bf9785878-zjgxw" event={"ID":"88e7efd3-590c-4e09-8fdb-d9252264d3ad","Type":"ContainerStarted","Data":"9ec215506c5acb81f8cd4b2875f53057f42f3f7e0667ffa54dce1e061dddb40e"}
Apr 16 23:52:32.756615 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:32.756550 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7bf9785878-zjgxw"
Apr 16 23:52:32.772501 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:32.772467 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7bf9785878-zjgxw" podStartSLOduration=16.772453689 podStartE2EDuration="16.772453689s" podCreationTimestamp="2026-04-16 23:52:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:52:32.772313434 +0000 UTC m=+124.981078105" watchObservedRunningTime="2026-04-16 23:52:32.772453689 +0000 UTC m=+124.981218360"
Apr 16 23:52:34.762737 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:34.762704 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqd7n" event={"ID":"71a84a47-d4d7-45d4-ab7c-e7e052f9d655","Type":"ContainerStarted","Data":"b81d29f01eb7870559160f034df9107f4ad8860e4253e81396e6669deabc283a"}
Apr 16 23:52:34.762737 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:34.762738 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqd7n" event={"ID":"71a84a47-d4d7-45d4-ab7c-e7e052f9d655","Type":"ContainerStarted","Data":"f580aea66399e68b0e04f0e4808617fa99e0e21ca5a9be49e4f3a5231166e574"}
Apr 16 23:52:34.776963 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:34.776918 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqd7n" podStartSLOduration=16.920477485 podStartE2EDuration="18.776905768s" podCreationTimestamp="2026-04-16 23:52:16 +0000 UTC" firstStartedPulling="2026-04-16 23:52:32.207249924 +0000 UTC m=+124.416014574" lastFinishedPulling="2026-04-16 23:52:34.063678207 +0000 UTC m=+126.272442857" observedRunningTime="2026-04-16 23:52:34.776330335 +0000 UTC m=+126.985095018" watchObservedRunningTime="2026-04-16 23:52:34.776905768 +0000 UTC m=+126.985670439"
Apr 16 23:52:38.036525 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:38.036488 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86301427-2e66-4c3a-ab22-55ef8ddc5580-metrics-certs\") pod \"network-metrics-daemon-f6rhw\" (UID: \"86301427-2e66-4c3a-ab22-55ef8ddc5580\") " pod="openshift-multus/network-metrics-daemon-f6rhw"
Apr 16 23:52:38.038700 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:38.038675 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86301427-2e66-4c3a-ab22-55ef8ddc5580-metrics-certs\") pod \"network-metrics-daemon-f6rhw\" (UID: \"86301427-2e66-4c3a-ab22-55ef8ddc5580\") " pod="openshift-multus/network-metrics-daemon-f6rhw"
Apr 16 23:52:38.061736 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:38.061710 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xjj7v\""
Apr 16 23:52:38.070372 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:38.070353 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6rhw"
Apr 16 23:52:38.180112 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:38.180081 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-f6rhw"]
Apr 16 23:52:38.183855 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:52:38.183821 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86301427_2e66_4c3a_ab22_55ef8ddc5580.slice/crio-0b49a4a1826cb6c347635e1936e495075ab6b3600dd2bbae8c8fa029019e7388 WatchSource:0}: Error finding container 0b49a4a1826cb6c347635e1936e495075ab6b3600dd2bbae8c8fa029019e7388: Status 404 returned error can't find the container with id 0b49a4a1826cb6c347635e1936e495075ab6b3600dd2bbae8c8fa029019e7388
Apr 16 23:52:38.773014 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:38.772975 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-f6rhw" event={"ID":"86301427-2e66-4c3a-ab22-55ef8ddc5580","Type":"ContainerStarted","Data":"0b49a4a1826cb6c347635e1936e495075ab6b3600dd2bbae8c8fa029019e7388"}
Apr 16 23:52:39.777073 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:39.777046 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-f6rhw" event={"ID":"86301427-2e66-4c3a-ab22-55ef8ddc5580","Type":"ContainerStarted","Data":"2010ad6cb6aaaa08d0168f00b052514d8983216dc2697ddda40b78c84149d909"}
Apr 16 23:52:39.777073 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:39.777078 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-f6rhw" event={"ID":"86301427-2e66-4c3a-ab22-55ef8ddc5580","Type":"ContainerStarted","Data":"4a60ce9baa2b69194d6a5262e82a537eddc4d5c3592f75688a8631d9f7093783"}
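The pod_startup_latency_tracker entries report two figures: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to be that value minus the image-pull window (lastFinishedPulling minus firstStartedPulling). The cluster-samples-operator entry above works out exactly, and every value below is quoted from that entry:

    podStartE2EDuration = 23:52:34.776905768 - 23:52:16          = 18.776905768s
    image-pull window   = 23:52:34.063678207 - 23:52:32.207249924 = 1.856428283s
    podStartSLOduration = 18.776905768s - 1.856428283s            = 16.920477485s

The image-registry entry corroborates the same reading: its pull timestamps are the zero value (0001-01-01), so SLO and E2E durations are identical there (both 16.772453689s).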
event={"ID":"86301427-2e66-4c3a-ab22-55ef8ddc5580","Type":"ContainerStarted","Data":"4a60ce9baa2b69194d6a5262e82a537eddc4d5c3592f75688a8631d9f7093783"} Apr 16 23:52:39.790887 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:39.790843 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-f6rhw" podStartSLOduration=130.726228211 podStartE2EDuration="2m11.790830422s" podCreationTimestamp="2026-04-16 23:50:28 +0000 UTC" firstStartedPulling="2026-04-16 23:52:38.186237123 +0000 UTC m=+130.395001770" lastFinishedPulling="2026-04-16 23:52:39.250839334 +0000 UTC m=+131.459603981" observedRunningTime="2026-04-16 23:52:39.790088015 +0000 UTC m=+131.998852685" watchObservedRunningTime="2026-04-16 23:52:39.790830422 +0000 UTC m=+131.999595092" Apr 16 23:52:40.348809 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:40.348677 2562 scope.go:117] "RemoveContainer" containerID="ec550585377c5366c2d5b78bffcf5ffbee32f0c3d536db62d60d47ca966b03b7" Apr 16 23:52:40.781234 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:40.781184 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrpzx_5fddaa09-016b-490d-a9ab-9d50ff167b22/console-operator/2.log" Apr 16 23:52:40.781638 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:40.781580 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrpzx_5fddaa09-016b-490d-a9ab-9d50ff167b22/console-operator/1.log" Apr 16 23:52:40.781638 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:40.781615 2562 generic.go:358] "Generic (PLEG): container finished" podID="5fddaa09-016b-490d-a9ab-9d50ff167b22" containerID="6bb903dfc086a59598ef3350cc7075be2a9c8d22fee3949fe7c95b29a52a5a1f" exitCode=255 Apr 16 23:52:40.781744 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:40.781684 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-mrpzx" event={"ID":"5fddaa09-016b-490d-a9ab-9d50ff167b22","Type":"ContainerDied","Data":"6bb903dfc086a59598ef3350cc7075be2a9c8d22fee3949fe7c95b29a52a5a1f"} Apr 16 23:52:40.781744 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:40.781723 2562 scope.go:117] "RemoveContainer" containerID="ec550585377c5366c2d5b78bffcf5ffbee32f0c3d536db62d60d47ca966b03b7" Apr 16 23:52:40.782181 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:40.782165 2562 scope.go:117] "RemoveContainer" containerID="6bb903dfc086a59598ef3350cc7075be2a9c8d22fee3949fe7c95b29a52a5a1f" Apr 16 23:52:40.782356 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:40.782337 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-mrpzx_openshift-console-operator(5fddaa09-016b-490d-a9ab-9d50ff167b22)\"" pod="openshift-console-operator/console-operator-9d4b6777b-mrpzx" podUID="5fddaa09-016b-490d-a9ab-9d50ff167b22" Apr 16 23:52:41.786371 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:41.786344 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrpzx_5fddaa09-016b-490d-a9ab-9d50ff167b22/console-operator/2.log" Apr 16 23:52:44.969911 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:44.969878 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-cqkvw"] Apr 16 23:52:44.972917 ip-10-0-128-98 
kubenswrapper[2562]: I0416 23:52:44.972892 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-cqkvw" Apr 16 23:52:44.975145 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:44.975127 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 23:52:44.975701 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:44.975684 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-kxrl6\"" Apr 16 23:52:44.975797 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:44.975715 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 23:52:44.983993 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:44.983971 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-cqkvw"] Apr 16 23:52:45.029204 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:45.029164 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7bf9785878-zjgxw"] Apr 16 23:52:45.086204 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:45.086174 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/53632d89-82ac-405b-8bbf-09bc404147e0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cqkvw\" (UID: \"53632d89-82ac-405b-8bbf-09bc404147e0\") " pod="openshift-insights/insights-runtime-extractor-cqkvw" Apr 16 23:52:45.086321 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:45.086285 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/53632d89-82ac-405b-8bbf-09bc404147e0-crio-socket\") pod \"insights-runtime-extractor-cqkvw\" (UID: \"53632d89-82ac-405b-8bbf-09bc404147e0\") " pod="openshift-insights/insights-runtime-extractor-cqkvw" Apr 16 23:52:45.086366 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:45.086328 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b298\" (UniqueName: \"kubernetes.io/projected/53632d89-82ac-405b-8bbf-09bc404147e0-kube-api-access-4b298\") pod \"insights-runtime-extractor-cqkvw\" (UID: \"53632d89-82ac-405b-8bbf-09bc404147e0\") " pod="openshift-insights/insights-runtime-extractor-cqkvw" Apr 16 23:52:45.086366 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:45.086360 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/53632d89-82ac-405b-8bbf-09bc404147e0-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cqkvw\" (UID: \"53632d89-82ac-405b-8bbf-09bc404147e0\") " pod="openshift-insights/insights-runtime-extractor-cqkvw" Apr 16 23:52:45.086439 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:45.086386 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/53632d89-82ac-405b-8bbf-09bc404147e0-data-volume\") pod \"insights-runtime-extractor-cqkvw\" (UID: \"53632d89-82ac-405b-8bbf-09bc404147e0\") " pod="openshift-insights/insights-runtime-extractor-cqkvw" Apr 16 23:52:45.186782 ip-10-0-128-98 kubenswrapper[2562]: I0416 
23:52:45.186754 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4b298\" (UniqueName: \"kubernetes.io/projected/53632d89-82ac-405b-8bbf-09bc404147e0-kube-api-access-4b298\") pod \"insights-runtime-extractor-cqkvw\" (UID: \"53632d89-82ac-405b-8bbf-09bc404147e0\") " pod="openshift-insights/insights-runtime-extractor-cqkvw" Apr 16 23:52:45.186913 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:45.186794 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/53632d89-82ac-405b-8bbf-09bc404147e0-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cqkvw\" (UID: \"53632d89-82ac-405b-8bbf-09bc404147e0\") " pod="openshift-insights/insights-runtime-extractor-cqkvw" Apr 16 23:52:45.186913 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:45.186819 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/53632d89-82ac-405b-8bbf-09bc404147e0-data-volume\") pod \"insights-runtime-extractor-cqkvw\" (UID: \"53632d89-82ac-405b-8bbf-09bc404147e0\") " pod="openshift-insights/insights-runtime-extractor-cqkvw" Apr 16 23:52:45.186913 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:45.186838 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/53632d89-82ac-405b-8bbf-09bc404147e0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cqkvw\" (UID: \"53632d89-82ac-405b-8bbf-09bc404147e0\") " pod="openshift-insights/insights-runtime-extractor-cqkvw" Apr 16 23:52:45.186913 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:45.186881 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/53632d89-82ac-405b-8bbf-09bc404147e0-crio-socket\") pod \"insights-runtime-extractor-cqkvw\" (UID: \"53632d89-82ac-405b-8bbf-09bc404147e0\") " pod="openshift-insights/insights-runtime-extractor-cqkvw" Apr 16 23:52:45.187105 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:45.186957 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/53632d89-82ac-405b-8bbf-09bc404147e0-crio-socket\") pod \"insights-runtime-extractor-cqkvw\" (UID: \"53632d89-82ac-405b-8bbf-09bc404147e0\") " pod="openshift-insights/insights-runtime-extractor-cqkvw" Apr 16 23:52:45.187105 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:45.187098 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/53632d89-82ac-405b-8bbf-09bc404147e0-data-volume\") pod \"insights-runtime-extractor-cqkvw\" (UID: \"53632d89-82ac-405b-8bbf-09bc404147e0\") " pod="openshift-insights/insights-runtime-extractor-cqkvw" Apr 16 23:52:45.187362 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:45.187331 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/53632d89-82ac-405b-8bbf-09bc404147e0-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cqkvw\" (UID: \"53632d89-82ac-405b-8bbf-09bc404147e0\") " pod="openshift-insights/insights-runtime-extractor-cqkvw" Apr 16 23:52:45.189057 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:45.189036 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/53632d89-82ac-405b-8bbf-09bc404147e0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cqkvw\" (UID: \"53632d89-82ac-405b-8bbf-09bc404147e0\") " pod="openshift-insights/insights-runtime-extractor-cqkvw" Apr 16 23:52:45.193853 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:45.193831 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b298\" (UniqueName: \"kubernetes.io/projected/53632d89-82ac-405b-8bbf-09bc404147e0-kube-api-access-4b298\") pod \"insights-runtime-extractor-cqkvw\" (UID: \"53632d89-82ac-405b-8bbf-09bc404147e0\") " pod="openshift-insights/insights-runtime-extractor-cqkvw" Apr 16 23:52:45.282162 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:45.282086 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-cqkvw" Apr 16 23:52:45.394000 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:45.393970 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-cqkvw"] Apr 16 23:52:45.397077 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:52:45.397051 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53632d89_82ac_405b_8bbf_09bc404147e0.slice/crio-36140da41cacdcd29649b376758db5b7823a638fe13a684030dfb0f5d94fb3b8 WatchSource:0}: Error finding container 36140da41cacdcd29649b376758db5b7823a638fe13a684030dfb0f5d94fb3b8: Status 404 returned error can't find the container with id 36140da41cacdcd29649b376758db5b7823a638fe13a684030dfb0f5d94fb3b8 Apr 16 23:52:45.797713 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:45.797682 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cqkvw" event={"ID":"53632d89-82ac-405b-8bbf-09bc404147e0","Type":"ContainerStarted","Data":"ed075cbb1873a4d072473984759b1e197a4faed393909b3241c5cdd0643e84f3"} Apr 16 23:52:45.797854 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:45.797719 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cqkvw" event={"ID":"53632d89-82ac-405b-8bbf-09bc404147e0","Type":"ContainerStarted","Data":"36140da41cacdcd29649b376758db5b7823a638fe13a684030dfb0f5d94fb3b8"} Apr 16 23:52:46.561339 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:46.561317 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-mrpzx" Apr 16 23:52:46.561339 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:46.561346 2562 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-mrpzx" Apr 16 23:52:46.561747 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:46.561654 2562 scope.go:117] "RemoveContainer" containerID="6bb903dfc086a59598ef3350cc7075be2a9c8d22fee3949fe7c95b29a52a5a1f" Apr 16 23:52:46.561832 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:46.561815 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-mrpzx_openshift-console-operator(5fddaa09-016b-490d-a9ab-9d50ff167b22)\"" pod="openshift-console-operator/console-operator-9d4b6777b-mrpzx" podUID="5fddaa09-016b-490d-a9ab-9d50ff167b22" Apr 16 23:52:46.801327 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:46.801291 
Apr 16 23:52:46.801327 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:46.801291 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cqkvw" event={"ID":"53632d89-82ac-405b-8bbf-09bc404147e0","Type":"ContainerStarted","Data":"b31697c2616059aabd4eb4f66d7dc7a9f2d1150aff41ab8a6115763b2ae60e54"}
Apr 16 23:52:47.590635 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:47.590600 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-dh4k6"]
Apr 16 23:52:47.595051 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:47.595031 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-dh4k6"
Apr 16 23:52:47.597016 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:47.596996 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 16 23:52:47.597136 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:47.597117 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 16 23:52:47.597803 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:47.597784 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 23:52:47.597803 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:47.597795 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 23:52:47.597932 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:47.597820 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-xffvp\""
Apr 16 23:52:47.597932 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:47.597800 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 23:52:47.602792 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:47.602772 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-dh4k6"]
Apr 16 23:52:47.707761 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:47.707736 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/04c6d30a-a073-440c-bd31-f046f4e9c445-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-dh4k6\" (UID: \"04c6d30a-a073-440c-bd31-f046f4e9c445\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dh4k6"
Apr 16 23:52:47.707898 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:47.707777 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/04c6d30a-a073-440c-bd31-f046f4e9c445-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-dh4k6\" (UID: \"04c6d30a-a073-440c-bd31-f046f4e9c445\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dh4k6"
Apr 16 23:52:47.707898 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:47.707822 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/04c6d30a-a073-440c-bd31-f046f4e9c445-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-dh4k6\" (UID: \"04c6d30a-a073-440c-bd31-f046f4e9c445\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dh4k6"
Apr 16 23:52:47.707898 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:47.707860 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn8wv\" (UniqueName: \"kubernetes.io/projected/04c6d30a-a073-440c-bd31-f046f4e9c445-kube-api-access-fn8wv\") pod \"prometheus-operator-5676c8c784-dh4k6\" (UID: \"04c6d30a-a073-440c-bd31-f046f4e9c445\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dh4k6"
Apr 16 23:52:47.805055 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:47.805026 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cqkvw" event={"ID":"53632d89-82ac-405b-8bbf-09bc404147e0","Type":"ContainerStarted","Data":"c44e80e9fff3ae59ee57d6cb6bc4eea9da18e8dbc64286059b5de9d77d9fabbf"}
Apr 16 23:52:47.808230 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:47.808211 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/04c6d30a-a073-440c-bd31-f046f4e9c445-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-dh4k6\" (UID: \"04c6d30a-a073-440c-bd31-f046f4e9c445\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dh4k6"
Apr 16 23:52:47.808307 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:47.808245 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fn8wv\" (UniqueName: \"kubernetes.io/projected/04c6d30a-a073-440c-bd31-f046f4e9c445-kube-api-access-fn8wv\") pod \"prometheus-operator-5676c8c784-dh4k6\" (UID: \"04c6d30a-a073-440c-bd31-f046f4e9c445\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dh4k6"
Apr 16 23:52:47.808307 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:47.808288 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/04c6d30a-a073-440c-bd31-f046f4e9c445-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-dh4k6\" (UID: \"04c6d30a-a073-440c-bd31-f046f4e9c445\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dh4k6"
Apr 16 23:52:47.808376 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:47.808308 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/04c6d30a-a073-440c-bd31-f046f4e9c445-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-dh4k6\" (UID: \"04c6d30a-a073-440c-bd31-f046f4e9c445\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dh4k6"
Apr 16 23:52:47.808853 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:47.808834 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/04c6d30a-a073-440c-bd31-f046f4e9c445-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-dh4k6\" (UID: \"04c6d30a-a073-440c-bd31-f046f4e9c445\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dh4k6"
Apr 16 23:52:47.810452 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:47.810432 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/04c6d30a-a073-440c-bd31-f046f4e9c445-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-dh4k6\" (UID: \"04c6d30a-a073-440c-bd31-f046f4e9c445\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dh4k6"
Apr 16 23:52:47.810569 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:47.810552 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/04c6d30a-a073-440c-bd31-f046f4e9c445-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-dh4k6\" (UID: \"04c6d30a-a073-440c-bd31-f046f4e9c445\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dh4k6"
Apr 16 23:52:47.814933 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:47.814907 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn8wv\" (UniqueName: \"kubernetes.io/projected/04c6d30a-a073-440c-bd31-f046f4e9c445-kube-api-access-fn8wv\") pod \"prometheus-operator-5676c8c784-dh4k6\" (UID: \"04c6d30a-a073-440c-bd31-f046f4e9c445\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dh4k6"
Apr 16 23:52:47.821062 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:47.821027 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-cqkvw" podStartSLOduration=1.619004168 podStartE2EDuration="3.821017846s" podCreationTimestamp="2026-04-16 23:52:44 +0000 UTC" firstStartedPulling="2026-04-16 23:52:45.453325641 +0000 UTC m=+137.662090295" lastFinishedPulling="2026-04-16 23:52:47.655339323 +0000 UTC m=+139.864103973" observedRunningTime="2026-04-16 23:52:47.819797181 +0000 UTC m=+140.028561851" watchObservedRunningTime="2026-04-16 23:52:47.821017846 +0000 UTC m=+140.029782516"
Apr 16 23:52:47.905594 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:47.905550 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-dh4k6"
Apr 16 23:52:48.028623 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:48.028593 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-dh4k6"]
Apr 16 23:52:48.031168 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:52:48.031137 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04c6d30a_a073_440c_bd31_f046f4e9c445.slice/crio-e7715aab92979761ef3cfa887b90795070dc8c0d054c110c0fc6e870f2953015 WatchSource:0}: Error finding container e7715aab92979761ef3cfa887b90795070dc8c0d054c110c0fc6e870f2953015: Status 404 returned error can't find the container with id e7715aab92979761ef3cfa887b90795070dc8c0d054c110c0fc6e870f2953015
Apr 16 23:52:48.809405 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:48.809368 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-dh4k6" event={"ID":"04c6d30a-a073-440c-bd31-f046f4e9c445","Type":"ContainerStarted","Data":"e7715aab92979761ef3cfa887b90795070dc8c0d054c110c0fc6e870f2953015"}
Apr 16 23:52:49.813175 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:49.813144 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-dh4k6" event={"ID":"04c6d30a-a073-440c-bd31-f046f4e9c445","Type":"ContainerStarted","Data":"cb485b4350c07d58855dc10a397190e3b46c54bf093357807e43e83430d24bd9"}
Apr 16 23:52:49.813549 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:49.813181 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-dh4k6" event={"ID":"04c6d30a-a073-440c-bd31-f046f4e9c445","Type":"ContainerStarted","Data":"276d8e51797e4c66f3505da602ca5b1ac52d9a963c4c6b6e86222198dd9d13b5"}
Apr 16 23:52:49.828089 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:49.828048 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-dh4k6" podStartSLOduration=1.5813799400000002 podStartE2EDuration="2.828035095s" podCreationTimestamp="2026-04-16 23:52:47 +0000 UTC" firstStartedPulling="2026-04-16 23:52:48.03299415 +0000 UTC m=+140.241758798" lastFinishedPulling="2026-04-16 23:52:49.279649299 +0000 UTC m=+141.488413953" observedRunningTime="2026-04-16 23:52:49.827227685 +0000 UTC m=+142.035992356" watchObservedRunningTime="2026-04-16 23:52:49.828035095 +0000 UTC m=+142.036799756"
Apr 16 23:52:51.936052 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:51.936017 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-xcgwl"]
Apr 16 23:52:51.939805 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:51.939780 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-c4nhb"]
Apr 16 23:52:51.939961 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:51.939942 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-xcgwl"
Apr 16 23:52:51.942021 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:51.941998 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 16 23:52:51.942148 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:51.942110 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-xphsb\""
Apr 16 23:52:51.942148 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:51.942142 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 16 23:52:51.942286 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:51.942158 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 16 23:52:51.942769 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:51.942754 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-bf25v"]
Apr 16 23:52:51.942906 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:51.942891 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c4nhb"
Apr 16 23:52:51.944514 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:51.944496 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 16 23:52:51.944791 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:51.944773 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 16 23:52:51.944876 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:51.944776 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-5sqmk\""
Apr 16 23:52:51.945950 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:51.945933 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-bf25v"
Apr 16 23:52:51.948021 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:51.947996 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 23:52:51.948120 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:51.948022 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 23:52:51.948552 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:51.948515 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-fhzzn\""
Apr 16 23:52:51.948689 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:51.948668 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 23:52:51.949842 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:51.949824 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-c4nhb"]
Apr 16 23:52:51.951098 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:51.951079 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-xcgwl"]
Apr 16 23:52:52.037497 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.037470 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d6890379-66f0-4714-b0f4-da3de91cbdc8-sys\") pod \"node-exporter-bf25v\" (UID: \"d6890379-66f0-4714-b0f4-da3de91cbdc8\") " pod="openshift-monitoring/node-exporter-bf25v"
Apr 16 23:52:52.037622 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.037500 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d6890379-66f0-4714-b0f4-da3de91cbdc8-root\") pod \"node-exporter-bf25v\" (UID: \"d6890379-66f0-4714-b0f4-da3de91cbdc8\") " pod="openshift-monitoring/node-exporter-bf25v"
Apr 16 23:52:52.037622 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.037522 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d6890379-66f0-4714-b0f4-da3de91cbdc8-node-exporter-textfile\") pod \"node-exporter-bf25v\" (UID: \"d6890379-66f0-4714-b0f4-da3de91cbdc8\") " pod="openshift-monitoring/node-exporter-bf25v"
Apr 16 23:52:52.037622 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.037577 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f349e78a-8ba4-44d0-86d8-b96de7477daa-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-xcgwl\" (UID: \"f349e78a-8ba4-44d0-86d8-b96de7477daa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xcgwl"
Apr 16 23:52:52.037622 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.037608 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/75c3d93e-0ccd-45ed-87ab-1abfec9fc614-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-c4nhb\" (UID: \"75c3d93e-0ccd-45ed-87ab-1abfec9fc614\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c4nhb"
Apr 16 23:52:52.037752 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.037648 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f349e78a-8ba4-44d0-86d8-b96de7477daa-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-xcgwl\" (UID: \"f349e78a-8ba4-44d0-86d8-b96de7477daa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xcgwl"
Apr 16 23:52:52.037752 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.037673 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f349e78a-8ba4-44d0-86d8-b96de7477daa-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-xcgwl\" (UID: \"f349e78a-8ba4-44d0-86d8-b96de7477daa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xcgwl"
Apr 16 23:52:52.037752 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.037698 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d6890379-66f0-4714-b0f4-da3de91cbdc8-node-exporter-tls\") pod \"node-exporter-bf25v\" (UID: \"d6890379-66f0-4714-b0f4-da3de91cbdc8\") " pod="openshift-monitoring/node-exporter-bf25v"
Apr 16 23:52:52.037752 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.037715 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d6890379-66f0-4714-b0f4-da3de91cbdc8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bf25v\" (UID: \"d6890379-66f0-4714-b0f4-da3de91cbdc8\") " pod="openshift-monitoring/node-exporter-bf25v"
Apr 16 23:52:52.037752 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.037731 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d6890379-66f0-4714-b0f4-da3de91cbdc8-node-exporter-accelerators-collector-config\") pod \"node-exporter-bf25v\" (UID: \"d6890379-66f0-4714-b0f4-da3de91cbdc8\") " pod="openshift-monitoring/node-exporter-bf25v"
Apr 16 23:52:52.037895 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.037811 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d6890379-66f0-4714-b0f4-da3de91cbdc8-metrics-client-ca\") pod \"node-exporter-bf25v\" (UID: \"d6890379-66f0-4714-b0f4-da3de91cbdc8\") " pod="openshift-monitoring/node-exporter-bf25v"
Apr 16 23:52:52.037895 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.037840 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d6890379-66f0-4714-b0f4-da3de91cbdc8-node-exporter-wtmp\") pod \"node-exporter-bf25v\" (UID: \"d6890379-66f0-4714-b0f4-da3de91cbdc8\") " pod="openshift-monitoring/node-exporter-bf25v"
Apr 16 23:52:52.037895 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.037871 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f349e78a-8ba4-44d0-86d8-b96de7477daa-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-xcgwl\" (UID: \"f349e78a-8ba4-44d0-86d8-b96de7477daa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xcgwl"
Apr 16 23:52:52.037895 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.037888 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f349e78a-8ba4-44d0-86d8-b96de7477daa-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-xcgwl\" (UID: \"f349e78a-8ba4-44d0-86d8-b96de7477daa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xcgwl"
Apr 16 23:52:52.038010 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.037906 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6chvf\" (UniqueName: \"kubernetes.io/projected/f349e78a-8ba4-44d0-86d8-b96de7477daa-kube-api-access-6chvf\") pod \"kube-state-metrics-69db897b98-xcgwl\" (UID: \"f349e78a-8ba4-44d0-86d8-b96de7477daa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xcgwl"
Apr 16 23:52:52.038010 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.037921 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/75c3d93e-0ccd-45ed-87ab-1abfec9fc614-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-c4nhb\" (UID: \"75c3d93e-0ccd-45ed-87ab-1abfec9fc614\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c4nhb"
Apr 16 23:52:52.038010 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.037936 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/75c3d93e-0ccd-45ed-87ab-1abfec9fc614-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-c4nhb\" (UID: \"75c3d93e-0ccd-45ed-87ab-1abfec9fc614\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c4nhb"
Apr 16 23:52:52.038010 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.037950 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktjc2\" (UniqueName: \"kubernetes.io/projected/d6890379-66f0-4714-b0f4-da3de91cbdc8-kube-api-access-ktjc2\") pod \"node-exporter-bf25v\" (UID: \"d6890379-66f0-4714-b0f4-da3de91cbdc8\") " pod="openshift-monitoring/node-exporter-bf25v"
Apr 16 23:52:52.038125 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.038012 2562
Apr 16 23:52:52.138699 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.138667 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d6890379-66f0-4714-b0f4-da3de91cbdc8-metrics-client-ca\") pod \"node-exporter-bf25v\" (UID: \"d6890379-66f0-4714-b0f4-da3de91cbdc8\") " pod="openshift-monitoring/node-exporter-bf25v"
Apr 16 23:52:52.138699 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.138702 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d6890379-66f0-4714-b0f4-da3de91cbdc8-node-exporter-wtmp\") pod \"node-exporter-bf25v\" (UID: \"d6890379-66f0-4714-b0f4-da3de91cbdc8\") " pod="openshift-monitoring/node-exporter-bf25v"
Apr 16 23:52:52.138876 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.138730 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f349e78a-8ba4-44d0-86d8-b96de7477daa-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-xcgwl\" (UID: \"f349e78a-8ba4-44d0-86d8-b96de7477daa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xcgwl"
Apr 16 23:52:52.138876 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.138847 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f349e78a-8ba4-44d0-86d8-b96de7477daa-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-xcgwl\" (UID: \"f349e78a-8ba4-44d0-86d8-b96de7477daa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xcgwl"
Apr 16 23:52:52.138982 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.138870 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d6890379-66f0-4714-b0f4-da3de91cbdc8-node-exporter-wtmp\") pod \"node-exporter-bf25v\" (UID: \"d6890379-66f0-4714-b0f4-da3de91cbdc8\") " pod="openshift-monitoring/node-exporter-bf25v"
Apr 16 23:52:52.138982 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.138882 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6chvf\" (UniqueName: \"kubernetes.io/projected/f349e78a-8ba4-44d0-86d8-b96de7477daa-kube-api-access-6chvf\") pod \"kube-state-metrics-69db897b98-xcgwl\" (UID: \"f349e78a-8ba4-44d0-86d8-b96de7477daa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xcgwl"
Apr 16 23:52:52.138982 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.138931 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/75c3d93e-0ccd-45ed-87ab-1abfec9fc614-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-c4nhb\" (UID: \"75c3d93e-0ccd-45ed-87ab-1abfec9fc614\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c4nhb"
Apr 16 23:52:52.138982 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.138964 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/75c3d93e-0ccd-45ed-87ab-1abfec9fc614-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-c4nhb\" (UID: \"75c3d93e-0ccd-45ed-87ab-1abfec9fc614\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c4nhb"
Apr 16 23:52:52.139177 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.138988 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ktjc2\" (UniqueName: \"kubernetes.io/projected/d6890379-66f0-4714-b0f4-da3de91cbdc8-kube-api-access-ktjc2\") pod \"node-exporter-bf25v\" (UID: \"d6890379-66f0-4714-b0f4-da3de91cbdc8\") " pod="openshift-monitoring/node-exporter-bf25v"
Apr 16 23:52:52.139177 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.139022 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mbngf\" (UniqueName: \"kubernetes.io/projected/75c3d93e-0ccd-45ed-87ab-1abfec9fc614-kube-api-access-mbngf\") pod \"openshift-state-metrics-9d44df66c-c4nhb\" (UID: \"75c3d93e-0ccd-45ed-87ab-1abfec9fc614\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c4nhb"
Apr 16 23:52:52.139177 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.139054 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d6890379-66f0-4714-b0f4-da3de91cbdc8-sys\") pod \"node-exporter-bf25v\" (UID: \"d6890379-66f0-4714-b0f4-da3de91cbdc8\") " pod="openshift-monitoring/node-exporter-bf25v"
Apr 16 23:52:52.139177 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.139078 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d6890379-66f0-4714-b0f4-da3de91cbdc8-root\") pod \"node-exporter-bf25v\" (UID: \"d6890379-66f0-4714-b0f4-da3de91cbdc8\") " pod="openshift-monitoring/node-exporter-bf25v"
Apr 16 23:52:52.139177 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.139105 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d6890379-66f0-4714-b0f4-da3de91cbdc8-node-exporter-textfile\") pod \"node-exporter-bf25v\" (UID: \"d6890379-66f0-4714-b0f4-da3de91cbdc8\") " pod="openshift-monitoring/node-exporter-bf25v"
Apr 16 23:52:52.139177 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.139134 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f349e78a-8ba4-44d0-86d8-b96de7477daa-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-xcgwl\" (UID: \"f349e78a-8ba4-44d0-86d8-b96de7477daa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xcgwl"
Apr 16 23:52:52.139177 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.139157 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/75c3d93e-0ccd-45ed-87ab-1abfec9fc614-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-c4nhb\" (UID: \"75c3d93e-0ccd-45ed-87ab-1abfec9fc614\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c4nhb"
Apr 16 23:52:52.139533 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.139185 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d6890379-66f0-4714-b0f4-da3de91cbdc8-sys\") pod \"node-exporter-bf25v\" (UID: \"d6890379-66f0-4714-b0f4-da3de91cbdc8\") " pod="openshift-monitoring/node-exporter-bf25v"
Apr 16 23:52:52.139533 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.139229 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f349e78a-8ba4-44d0-86d8-b96de7477daa-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-xcgwl\" (UID: \"f349e78a-8ba4-44d0-86d8-b96de7477daa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xcgwl"
Apr 16 23:52:52.139533 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.139264 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f349e78a-8ba4-44d0-86d8-b96de7477daa-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-xcgwl\" (UID: \"f349e78a-8ba4-44d0-86d8-b96de7477daa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xcgwl"
Apr 16 23:52:52.139533 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.139275 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f349e78a-8ba4-44d0-86d8-b96de7477daa-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-xcgwl\" (UID: \"f349e78a-8ba4-44d0-86d8-b96de7477daa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xcgwl"
Apr 16 23:52:52.139533 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.139301 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d6890379-66f0-4714-b0f4-da3de91cbdc8-node-exporter-tls\") pod \"node-exporter-bf25v\" (UID: \"d6890379-66f0-4714-b0f4-da3de91cbdc8\") " pod="openshift-monitoring/node-exporter-bf25v"
Apr 16 23:52:52.139533 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.139328 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d6890379-66f0-4714-b0f4-da3de91cbdc8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bf25v\" (UID: \"d6890379-66f0-4714-b0f4-da3de91cbdc8\") " pod="openshift-monitoring/node-exporter-bf25v"
Apr 16 23:52:52.139533 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.139354 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d6890379-66f0-4714-b0f4-da3de91cbdc8-node-exporter-accelerators-collector-config\") pod \"node-exporter-bf25v\" (UID: \"d6890379-66f0-4714-b0f4-da3de91cbdc8\") " pod="openshift-monitoring/node-exporter-bf25v"
Apr 16 23:52:52.139533 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.139398 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d6890379-66f0-4714-b0f4-da3de91cbdc8-metrics-client-ca\") pod \"node-exporter-bf25v\" (UID: \"d6890379-66f0-4714-b0f4-da3de91cbdc8\") " pod="openshift-monitoring/node-exporter-bf25v"
Apr 16 23:52:52.139533 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.139523 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d6890379-66f0-4714-b0f4-da3de91cbdc8-root\") pod \"node-exporter-bf25v\" (UID: \"d6890379-66f0-4714-b0f4-da3de91cbdc8\") " pod="openshift-monitoring/node-exporter-bf25v"
Apr 16 23:52:52.139957 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.139676 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/75c3d93e-0ccd-45ed-87ab-1abfec9fc614-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-c4nhb\" (UID: \"75c3d93e-0ccd-45ed-87ab-1abfec9fc614\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c4nhb"
Apr 16 23:52:52.139957 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:52.139721 2562 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 16 23:52:52.139957 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:52.139755 2562 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Apr 16 23:52:52.139957 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:52.139793 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75c3d93e-0ccd-45ed-87ab-1abfec9fc614-openshift-state-metrics-tls podName:75c3d93e-0ccd-45ed-87ab-1abfec9fc614 nodeName:}" failed. No retries permitted until 2026-04-16 23:52:52.639774834 +0000 UTC m=+144.848539493 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/75c3d93e-0ccd-45ed-87ab-1abfec9fc614-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-c4nhb" (UID: "75c3d93e-0ccd-45ed-87ab-1abfec9fc614") : secret "openshift-state-metrics-tls" not found
Apr 16 23:52:52.139957 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:52.139819 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f349e78a-8ba4-44d0-86d8-b96de7477daa-kube-state-metrics-tls podName:f349e78a-8ba4-44d0-86d8-b96de7477daa nodeName:}" failed. No retries permitted until 2026-04-16 23:52:52.639801543 +0000 UTC m=+144.848566191 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/f349e78a-8ba4-44d0-86d8-b96de7477daa-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-xcgwl" (UID: "f349e78a-8ba4-44d0-86d8-b96de7477daa") : secret "kube-state-metrics-tls" not found
Apr 16 23:52:52.139957 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.139901 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d6890379-66f0-4714-b0f4-da3de91cbdc8-node-exporter-accelerators-collector-config\") pod \"node-exporter-bf25v\" (UID: \"d6890379-66f0-4714-b0f4-da3de91cbdc8\") " pod="openshift-monitoring/node-exporter-bf25v"
Apr 16 23:52:52.140243 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.140056 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f349e78a-8ba4-44d0-86d8-b96de7477daa-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-xcgwl\" (UID: \"f349e78a-8ba4-44d0-86d8-b96de7477daa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xcgwl"
Apr 16 23:52:52.140243 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.140203 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f349e78a-8ba4-44d0-86d8-b96de7477daa-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-xcgwl\" (UID: \"f349e78a-8ba4-44d0-86d8-b96de7477daa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xcgwl"
Apr 16 23:52:52.140333 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.140253 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d6890379-66f0-4714-b0f4-da3de91cbdc8-node-exporter-textfile\") pod \"node-exporter-bf25v\" (UID: \"d6890379-66f0-4714-b0f4-da3de91cbdc8\") " pod="openshift-monitoring/node-exporter-bf25v"
Apr 16 23:52:52.141824 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.141798 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f349e78a-8ba4-44d0-86d8-b96de7477daa-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-xcgwl\" (UID: \"f349e78a-8ba4-44d0-86d8-b96de7477daa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xcgwl"
Apr 16 23:52:52.141824 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.141806 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/75c3d93e-0ccd-45ed-87ab-1abfec9fc614-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-c4nhb\" (UID: \"75c3d93e-0ccd-45ed-87ab-1abfec9fc614\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c4nhb"
Apr 16 23:52:52.142173 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.142154 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d6890379-66f0-4714-b0f4-da3de91cbdc8-node-exporter-tls\") pod \"node-exporter-bf25v\" (UID: \"d6890379-66f0-4714-b0f4-da3de91cbdc8\") " pod="openshift-monitoring/node-exporter-bf25v"
Apr 16 23:52:52.142238 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.142156 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d6890379-66f0-4714-b0f4-da3de91cbdc8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bf25v\" (UID: \"d6890379-66f0-4714-b0f4-da3de91cbdc8\") " pod="openshift-monitoring/node-exporter-bf25v"
Apr 16 23:52:52.147280 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.147253 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6chvf\" (UniqueName: \"kubernetes.io/projected/f349e78a-8ba4-44d0-86d8-b96de7477daa-kube-api-access-6chvf\") pod \"kube-state-metrics-69db897b98-xcgwl\" (UID: \"f349e78a-8ba4-44d0-86d8-b96de7477daa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xcgwl"
Apr 16 23:52:52.147366 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.147326 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbngf\" (UniqueName: \"kubernetes.io/projected/75c3d93e-0ccd-45ed-87ab-1abfec9fc614-kube-api-access-mbngf\") pod \"openshift-state-metrics-9d44df66c-c4nhb\" (UID: \"75c3d93e-0ccd-45ed-87ab-1abfec9fc614\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c4nhb"
Apr 16 23:52:52.147366 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.147327 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktjc2\" (UniqueName: \"kubernetes.io/projected/d6890379-66f0-4714-b0f4-da3de91cbdc8-kube-api-access-ktjc2\") pod \"node-exporter-bf25v\" (UID: \"d6890379-66f0-4714-b0f4-da3de91cbdc8\") " pod="openshift-monitoring/node-exporter-bf25v"
Apr 16 23:52:52.266564 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.266507 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-bf25v"
Apr 16 23:52:52.274929 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:52:52.274908 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6890379_66f0_4714_b0f4_da3de91cbdc8.slice/crio-1f7c8757bad52dd28bed8c504f084728f523cfc7e367a230a8f09df0877e70e7 WatchSource:0}: Error finding container 1f7c8757bad52dd28bed8c504f084728f523cfc7e367a230a8f09df0877e70e7: Status 404 returned error can't find the container with id 1f7c8757bad52dd28bed8c504f084728f523cfc7e367a230a8f09df0877e70e7
Apr 16 23:52:52.643297 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.643220 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/75c3d93e-0ccd-45ed-87ab-1abfec9fc614-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-c4nhb\" (UID: \"75c3d93e-0ccd-45ed-87ab-1abfec9fc614\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c4nhb"
Apr 16 23:52:52.643297 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.643264 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f349e78a-8ba4-44d0-86d8-b96de7477daa-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-xcgwl\" (UID: \"f349e78a-8ba4-44d0-86d8-b96de7477daa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xcgwl"
Apr 16 23:52:52.643469 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:52.643352 2562 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Apr 16 23:52:52.643469 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:52.643404 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f349e78a-8ba4-44d0-86d8-b96de7477daa-kube-state-metrics-tls podName:f349e78a-8ba4-44d0-86d8-b96de7477daa nodeName:}" failed. No retries permitted until 2026-04-16 23:52:53.643389093 +0000 UTC m=+145.852153741 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/f349e78a-8ba4-44d0-86d8-b96de7477daa-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-xcgwl" (UID: "f349e78a-8ba4-44d0-86d8-b96de7477daa") : secret "kube-state-metrics-tls" not found
Apr 16 23:52:52.645709 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.645678 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/75c3d93e-0ccd-45ed-87ab-1abfec9fc614-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-c4nhb\" (UID: \"75c3d93e-0ccd-45ed-87ab-1abfec9fc614\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c4nhb"
Apr 16 23:52:52.822491 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.822453 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bf25v" event={"ID":"d6890379-66f0-4714-b0f4-da3de91cbdc8","Type":"ContainerStarted","Data":"1f7c8757bad52dd28bed8c504f084728f523cfc7e367a230a8f09df0877e70e7"}
Apr 16 23:52:52.860483 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:52.860456 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c4nhb"
Apr 16 23:52:53.001596 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:53.001569 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-c4nhb"]
Apr 16 23:52:53.004556 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:52:53.004518 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75c3d93e_0ccd_45ed_87ab_1abfec9fc614.slice/crio-f88aa80891bb8060d1f5cef8874f92a67b8aed22dd9f140502590d91f64aa2d5 WatchSource:0}: Error finding container f88aa80891bb8060d1f5cef8874f92a67b8aed22dd9f140502590d91f64aa2d5: Status 404 returned error can't find the container with id f88aa80891bb8060d1f5cef8874f92a67b8aed22dd9f140502590d91f64aa2d5
Apr 16 23:52:53.652093 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:53.652067 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f349e78a-8ba4-44d0-86d8-b96de7477daa-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-xcgwl\" (UID: \"f349e78a-8ba4-44d0-86d8-b96de7477daa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xcgwl"
Apr 16 23:52:53.654037 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:53.654019 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f349e78a-8ba4-44d0-86d8-b96de7477daa-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-xcgwl\" (UID: \"f349e78a-8ba4-44d0-86d8-b96de7477daa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xcgwl"
Apr 16 23:52:53.750923 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:53.750889 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-xcgwl"
Apr 16 23:52:53.827430 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:53.827399 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c4nhb" event={"ID":"75c3d93e-0ccd-45ed-87ab-1abfec9fc614","Type":"ContainerStarted","Data":"b5d151e5d78078b4eb68d461aca1f710ee0741367b48d8ce18d2ec1e91a2b3a4"}
Apr 16 23:52:53.827546 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:53.827441 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c4nhb" event={"ID":"75c3d93e-0ccd-45ed-87ab-1abfec9fc614","Type":"ContainerStarted","Data":"614816c7a2cccc9da6b072e26b284ede3f4e6c0d258f8b530710c316b9fc0028"}
Apr 16 23:52:53.827546 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:53.827451 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c4nhb" event={"ID":"75c3d93e-0ccd-45ed-87ab-1abfec9fc614","Type":"ContainerStarted","Data":"f88aa80891bb8060d1f5cef8874f92a67b8aed22dd9f140502590d91f64aa2d5"}
Apr 16 23:52:53.828808 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:53.828783 2562 generic.go:358] "Generic (PLEG): container finished" podID="d6890379-66f0-4714-b0f4-da3de91cbdc8" containerID="ae4da6f42a94f4ab72b225eb0f1c3672d8c6c6df441626c2aefda041215b5241" exitCode=0
Apr 16 23:52:53.828919 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:53.828865 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bf25v" event={"ID":"d6890379-66f0-4714-b0f4-da3de91cbdc8","Type":"ContainerDied","Data":"ae4da6f42a94f4ab72b225eb0f1c3672d8c6c6df441626c2aefda041215b5241"}
Apr 16 23:52:53.867286 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:53.867261 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-xcgwl"]
Apr 16 23:52:53.869890 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:52:53.869869 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf349e78a_8ba4_44d0_86d8_b96de7477daa.slice/crio-cad66406bdbc8abf352e8e11bdb88a33dbecb98e9c3e72d0c6aea138a4f26456 WatchSource:0}: Error finding container cad66406bdbc8abf352e8e11bdb88a33dbecb98e9c3e72d0c6aea138a4f26456: Status 404 returned error can't find the container with id cad66406bdbc8abf352e8e11bdb88a33dbecb98e9c3e72d0c6aea138a4f26456
Apr 16 23:52:54.002349 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.002326 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-55898d7b96-zltwd"]
Apr 16 23:52:54.030884 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.030856 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-55898d7b96-zltwd"]
Apr 16 23:52:54.030999 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.030985 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-55898d7b96-zltwd"
Apr 16 23:52:54.033269 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.033246 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 16 23:52:54.033379 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.033309 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-sk74g\""
Apr 16 23:52:54.033379 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.033327 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 16 23:52:54.033494 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.033381 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 16 23:52:54.033494 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.033392 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-f17lfnn8qdvc2\""
Apr 16 23:52:54.033602 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.033584 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 16 23:52:54.033660 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.033640 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 16 23:52:54.156899 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.156867 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/d6a1a371-ed23-4e03-8408-d1ea20b07e24-secret-thanos-querier-tls\") pod \"thanos-querier-55898d7b96-zltwd\" (UID: \"d6a1a371-ed23-4e03-8408-d1ea20b07e24\") " pod="openshift-monitoring/thanos-querier-55898d7b96-zltwd"
Apr 16 23:52:54.157054 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.156914 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d6a1a371-ed23-4e03-8408-d1ea20b07e24-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-55898d7b96-zltwd\" (UID: \"d6a1a371-ed23-4e03-8408-d1ea20b07e24\") " pod="openshift-monitoring/thanos-querier-55898d7b96-zltwd"
Apr 16 23:52:54.157054 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.156945 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/d6a1a371-ed23-4e03-8408-d1ea20b07e24-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-55898d7b96-zltwd\" (UID: \"d6a1a371-ed23-4e03-8408-d1ea20b07e24\") " pod="openshift-monitoring/thanos-querier-55898d7b96-zltwd"
Apr 16 23:52:54.157054 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.156977 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d6a1a371-ed23-4e03-8408-d1ea20b07e24-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-55898d7b96-zltwd\" (UID: \"d6a1a371-ed23-4e03-8408-d1ea20b07e24\") " pod="openshift-monitoring/thanos-querier-55898d7b96-zltwd"
Apr 16 23:52:54.157054 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.157044 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4csgf\" (UniqueName: \"kubernetes.io/projected/d6a1a371-ed23-4e03-8408-d1ea20b07e24-kube-api-access-4csgf\") pod \"thanos-querier-55898d7b96-zltwd\" (UID: \"d6a1a371-ed23-4e03-8408-d1ea20b07e24\") " pod="openshift-monitoring/thanos-querier-55898d7b96-zltwd"
Apr 16 23:52:54.157292 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.157109 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d6a1a371-ed23-4e03-8408-d1ea20b07e24-metrics-client-ca\") pod \"thanos-querier-55898d7b96-zltwd\" (UID: \"d6a1a371-ed23-4e03-8408-d1ea20b07e24\") " pod="openshift-monitoring/thanos-querier-55898d7b96-zltwd"
Apr 16 23:52:54.157292 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.157136 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/d6a1a371-ed23-4e03-8408-d1ea20b07e24-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-55898d7b96-zltwd\" (UID: \"d6a1a371-ed23-4e03-8408-d1ea20b07e24\") " pod="openshift-monitoring/thanos-querier-55898d7b96-zltwd"
Apr 16 23:52:54.157292 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.157159 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d6a1a371-ed23-4e03-8408-d1ea20b07e24-secret-grpc-tls\") pod \"thanos-querier-55898d7b96-zltwd\" (UID: \"d6a1a371-ed23-4e03-8408-d1ea20b07e24\") " pod="openshift-monitoring/thanos-querier-55898d7b96-zltwd"
Apr 16 23:52:54.258442 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.258414 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d6a1a371-ed23-4e03-8408-d1ea20b07e24-metrics-client-ca\") pod \"thanos-querier-55898d7b96-zltwd\" (UID: \"d6a1a371-ed23-4e03-8408-d1ea20b07e24\") " pod="openshift-monitoring/thanos-querier-55898d7b96-zltwd"
Apr 16 23:52:54.258609 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.258449 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/d6a1a371-ed23-4e03-8408-d1ea20b07e24-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-55898d7b96-zltwd\" (UID: \"d6a1a371-ed23-4e03-8408-d1ea20b07e24\") " pod="openshift-monitoring/thanos-querier-55898d7b96-zltwd"
Apr 16 23:52:54.258609 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.258469 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d6a1a371-ed23-4e03-8408-d1ea20b07e24-secret-grpc-tls\") pod \"thanos-querier-55898d7b96-zltwd\" (UID: \"d6a1a371-ed23-4e03-8408-d1ea20b07e24\") " pod="openshift-monitoring/thanos-querier-55898d7b96-zltwd"
Apr 16 23:52:54.258609 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.258508 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/d6a1a371-ed23-4e03-8408-d1ea20b07e24-secret-thanos-querier-tls\") pod \"thanos-querier-55898d7b96-zltwd\" (UID: \"d6a1a371-ed23-4e03-8408-d1ea20b07e24\") " pod="openshift-monitoring/thanos-querier-55898d7b96-zltwd"
Apr 16 23:52:54.258609 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.258527 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d6a1a371-ed23-4e03-8408-d1ea20b07e24-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-55898d7b96-zltwd\" (UID: \"d6a1a371-ed23-4e03-8408-d1ea20b07e24\") " pod="openshift-monitoring/thanos-querier-55898d7b96-zltwd"
Apr 16 23:52:54.258609 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.258543 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/d6a1a371-ed23-4e03-8408-d1ea20b07e24-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-55898d7b96-zltwd\" (UID: \"d6a1a371-ed23-4e03-8408-d1ea20b07e24\") " pod="openshift-monitoring/thanos-querier-55898d7b96-zltwd"
Apr 16 23:52:54.258609 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.258561 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d6a1a371-ed23-4e03-8408-d1ea20b07e24-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-55898d7b96-zltwd\" (UID: \"d6a1a371-ed23-4e03-8408-d1ea20b07e24\") " pod="openshift-monitoring/thanos-querier-55898d7b96-zltwd"
Apr 16 23:52:54.258609 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.258590 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4csgf\" (UniqueName: \"kubernetes.io/projected/d6a1a371-ed23-4e03-8408-d1ea20b07e24-kube-api-access-4csgf\") pod \"thanos-querier-55898d7b96-zltwd\" (UID: \"d6a1a371-ed23-4e03-8408-d1ea20b07e24\") " pod="openshift-monitoring/thanos-querier-55898d7b96-zltwd"
Apr 16 23:52:54.259438 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.259185 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d6a1a371-ed23-4e03-8408-d1ea20b07e24-metrics-client-ca\") pod \"thanos-querier-55898d7b96-zltwd\" (UID: \"d6a1a371-ed23-4e03-8408-d1ea20b07e24\") " pod="openshift-monitoring/thanos-querier-55898d7b96-zltwd"
Apr 16 23:52:54.261288 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.261262 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d6a1a371-ed23-4e03-8408-d1ea20b07e24-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-55898d7b96-zltwd\" (UID: \"d6a1a371-ed23-4e03-8408-d1ea20b07e24\") " pod="openshift-monitoring/thanos-querier-55898d7b96-zltwd"
Apr 16 23:52:54.261451 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.261426 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d6a1a371-ed23-4e03-8408-d1ea20b07e24-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-55898d7b96-zltwd\" (UID: \"d6a1a371-ed23-4e03-8408-d1ea20b07e24\") " pod="openshift-monitoring/thanos-querier-55898d7b96-zltwd"
Apr 16 23:52:54.261527 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.261497 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/d6a1a371-ed23-4e03-8408-d1ea20b07e24-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-55898d7b96-zltwd\" (UID: \"d6a1a371-ed23-4e03-8408-d1ea20b07e24\") " pod="openshift-monitoring/thanos-querier-55898d7b96-zltwd"
Apr 16 23:52:54.261641 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.261621 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/d6a1a371-ed23-4e03-8408-d1ea20b07e24-secret-thanos-querier-tls\") pod \"thanos-querier-55898d7b96-zltwd\" (UID: \"d6a1a371-ed23-4e03-8408-d1ea20b07e24\") " pod="openshift-monitoring/thanos-querier-55898d7b96-zltwd"
Apr 16 23:52:54.261695 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.261598 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/d6a1a371-ed23-4e03-8408-d1ea20b07e24-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-55898d7b96-zltwd\" (UID: \"d6a1a371-ed23-4e03-8408-d1ea20b07e24\") " pod="openshift-monitoring/thanos-querier-55898d7b96-zltwd"
Apr 16 23:52:54.261749 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.261730 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d6a1a371-ed23-4e03-8408-d1ea20b07e24-secret-grpc-tls\") pod \"thanos-querier-55898d7b96-zltwd\" (UID: \"d6a1a371-ed23-4e03-8408-d1ea20b07e24\") " pod="openshift-monitoring/thanos-querier-55898d7b96-zltwd"
Apr 16 23:52:54.265040 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.265023 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4csgf\" (UniqueName: \"kubernetes.io/projected/d6a1a371-ed23-4e03-8408-d1ea20b07e24-kube-api-access-4csgf\") pod \"thanos-querier-55898d7b96-zltwd\" (UID: \"d6a1a371-ed23-4e03-8408-d1ea20b07e24\") " pod="openshift-monitoring/thanos-querier-55898d7b96-zltwd"
Apr 16 23:52:54.348161 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.348078 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-55898d7b96-zltwd"
Apr 16 23:52:54.623841 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.623818 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-55898d7b96-zltwd"]
Apr 16 23:52:54.787434 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:52:54.787400 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6a1a371_ed23_4e03_8408_d1ea20b07e24.slice/crio-528ad214b0b7b7fa2dcbf87fc1f999516ae345f677880b2354c06d21abad85ff WatchSource:0}: Error finding container 528ad214b0b7b7fa2dcbf87fc1f999516ae345f677880b2354c06d21abad85ff: Status 404 returned error can't find the container with id 528ad214b0b7b7fa2dcbf87fc1f999516ae345f677880b2354c06d21abad85ff
Apr 16 23:52:54.832522 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.832488 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-55898d7b96-zltwd" event={"ID":"d6a1a371-ed23-4e03-8408-d1ea20b07e24","Type":"ContainerStarted","Data":"528ad214b0b7b7fa2dcbf87fc1f999516ae345f677880b2354c06d21abad85ff"}
Apr 16 23:52:54.834496 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.834468 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bf25v" event={"ID":"d6890379-66f0-4714-b0f4-da3de91cbdc8","Type":"ContainerStarted","Data":"8fb798bbe8f634e45bae4e778043f1818b9cd3ea155c215c858c7cb513e16892"}
Apr 16 23:52:54.834576 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.834503 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bf25v" event={"ID":"d6890379-66f0-4714-b0f4-da3de91cbdc8","Type":"ContainerStarted","Data":"5630f8c099ca97450e6b6d63776d1f64663700c9e476d56ca4d3d85aac9886e7"}
Apr 16 23:52:54.835600 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.835572 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xcgwl" event={"ID":"f349e78a-8ba4-44d0-86d8-b96de7477daa","Type":"ContainerStarted","Data":"cad66406bdbc8abf352e8e11bdb88a33dbecb98e9c3e72d0c6aea138a4f26456"}
Apr 16 23:52:54.852003 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:54.851954 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-bf25v" podStartSLOduration=2.705093022 podStartE2EDuration="3.851938486s" podCreationTimestamp="2026-04-16 23:52:51 +0000 UTC" firstStartedPulling="2026-04-16 23:52:52.276656712 +0000 UTC m=+144.485421359" lastFinishedPulling="2026-04-16 23:52:53.423502174 +0000 UTC m=+145.632266823" observedRunningTime="2026-04-16 23:52:54.850275372 +0000 UTC m=+147.059040044" watchObservedRunningTime="2026-04-16 23:52:54.851938486 +0000 UTC m=+147.060703149"
Apr 16 23:52:55.034915 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:55.034884 2562 patch_prober.go:28] interesting pod/image-registry-7bf9785878-zjgxw container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 23:52:55.035346 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:55.034931 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-7bf9785878-zjgxw" podUID="88e7efd3-590c-4e09-8fdb-d9252264d3ad" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 23:52:55.841254 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:55.840899 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c4nhb" event={"ID":"75c3d93e-0ccd-45ed-87ab-1abfec9fc614","Type":"ContainerStarted","Data":"9d81c6aaebb6fcc11a603a27cc755d10473c937a94df1a849331da4bb77741d7"}
Apr 16 23:52:55.843155 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:55.843073 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xcgwl" event={"ID":"f349e78a-8ba4-44d0-86d8-b96de7477daa","Type":"ContainerStarted","Data":"b3ce962715c47c39af7c8b1b75800b0701942e6ec61b3dd1ff8e8aa651d92a27"}
Apr 16 23:52:55.843155 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:55.843111 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xcgwl" event={"ID":"f349e78a-8ba4-44d0-86d8-b96de7477daa","Type":"ContainerStarted","Data":"86d09bc4210aa3b71df27b7622a30ccb038be645ddbddc066a2626e3ff2813ed"}
Apr 16 23:52:55.843155 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:55.843127 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xcgwl" event={"ID":"f349e78a-8ba4-44d0-86d8-b96de7477daa","Type":"ContainerStarted","Data":"6b66bb5449c3af9b40c82fe102c9dd2e3e6d0e5775e87dca687697e4c6783fa9"}
Apr 16 23:52:55.858848 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:55.858777 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c4nhb" podStartSLOduration=3.487166969 podStartE2EDuration="4.858730953s" podCreationTimestamp="2026-04-16 23:52:51 +0000 UTC" firstStartedPulling="2026-04-16 23:52:53.420365513 +0000 UTC m=+145.629130161" lastFinishedPulling="2026-04-16 23:52:54.791929493 +0000 UTC m=+147.000694145" observedRunningTime="2026-04-16 23:52:55.856614539 +0000 UTC m=+148.065379221" watchObservedRunningTime="2026-04-16 23:52:55.858730953 +0000 UTC m=+148.067495624"
Apr 16 23:52:55.872699 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:55.872653 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-xcgwl" podStartSLOduration=3.278557413 podStartE2EDuration="4.872638223s" podCreationTimestamp="2026-04-16 23:52:51 +0000 UTC" firstStartedPulling="2026-04-16 23:52:53.871571264 +0000 UTC m=+146.080335925" lastFinishedPulling="2026-04-16 23:52:55.465652072 +0000 UTC m=+147.674416735" observedRunningTime="2026-04-16 23:52:55.871002256 +0000 UTC m=+148.079766928" watchObservedRunningTime="2026-04-16 23:52:55.872638223 +0000 UTC m=+148.081402895"
Apr 16 23:52:57.106888 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.106861 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-59ff69d868-kzfwh"]
Apr 16 23:52:57.110672 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.110654 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-59ff69d868-kzfwh"
Apr 16 23:52:57.112650 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.112622 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 16 23:52:57.112765 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.112683 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 16 23:52:57.112765 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.112735 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 16 23:52:57.112890 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.112688 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-tq74t\""
Apr 16 23:52:57.112890 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.112874 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 16 23:52:57.112984 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.112877 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 16 23:52:57.125574 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.125497 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 16 23:52:57.125914 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.125897 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-59ff69d868-kzfwh"]
Apr 16 23:52:57.284975 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.284950 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d4d50529-d814-4e89-9eaf-fae37ca01719-secret-telemeter-client\") pod \"telemeter-client-59ff69d868-kzfwh\" (UID: \"d4d50529-d814-4e89-9eaf-fae37ca01719\") " pod="openshift-monitoring/telemeter-client-59ff69d868-kzfwh"
Apr 16 23:52:57.285085 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.284983 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d4d50529-d814-4e89-9eaf-fae37ca01719-federate-client-tls\") pod \"telemeter-client-59ff69d868-kzfwh\" (UID: \"d4d50529-d814-4e89-9eaf-fae37ca01719\") " pod="openshift-monitoring/telemeter-client-59ff69d868-kzfwh"
Apr 16 23:52:57.285085 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.285006 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d4d50529-d814-4e89-9eaf-fae37ca01719-telemeter-client-tls\") pod \"telemeter-client-59ff69d868-kzfwh\" (UID: \"d4d50529-d814-4e89-9eaf-fae37ca01719\") " pod="openshift-monitoring/telemeter-client-59ff69d868-kzfwh"
Apr 16 23:52:57.285085 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.285043 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d4d50529-d814-4e89-9eaf-fae37ca01719-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-59ff69d868-kzfwh\" (UID: \"d4d50529-d814-4e89-9eaf-fae37ca01719\") " pod="openshift-monitoring/telemeter-client-59ff69d868-kzfwh"
\"kubernetes.io/secret/d4d50529-d814-4e89-9eaf-fae37ca01719-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-59ff69d868-kzfwh\" (UID: \"d4d50529-d814-4e89-9eaf-fae37ca01719\") " pod="openshift-monitoring/telemeter-client-59ff69d868-kzfwh" Apr 16 23:52:57.285284 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.285123 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d4d50529-d814-4e89-9eaf-fae37ca01719-metrics-client-ca\") pod \"telemeter-client-59ff69d868-kzfwh\" (UID: \"d4d50529-d814-4e89-9eaf-fae37ca01719\") " pod="openshift-monitoring/telemeter-client-59ff69d868-kzfwh" Apr 16 23:52:57.285284 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.285145 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsn8t\" (UniqueName: \"kubernetes.io/projected/d4d50529-d814-4e89-9eaf-fae37ca01719-kube-api-access-fsn8t\") pod \"telemeter-client-59ff69d868-kzfwh\" (UID: \"d4d50529-d814-4e89-9eaf-fae37ca01719\") " pod="openshift-monitoring/telemeter-client-59ff69d868-kzfwh" Apr 16 23:52:57.285284 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.285216 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4d50529-d814-4e89-9eaf-fae37ca01719-telemeter-trusted-ca-bundle\") pod \"telemeter-client-59ff69d868-kzfwh\" (UID: \"d4d50529-d814-4e89-9eaf-fae37ca01719\") " pod="openshift-monitoring/telemeter-client-59ff69d868-kzfwh" Apr 16 23:52:57.285442 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.285299 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4d50529-d814-4e89-9eaf-fae37ca01719-serving-certs-ca-bundle\") pod \"telemeter-client-59ff69d868-kzfwh\" (UID: \"d4d50529-d814-4e89-9eaf-fae37ca01719\") " pod="openshift-monitoring/telemeter-client-59ff69d868-kzfwh" Apr 16 23:52:57.386514 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.386456 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4d50529-d814-4e89-9eaf-fae37ca01719-serving-certs-ca-bundle\") pod \"telemeter-client-59ff69d868-kzfwh\" (UID: \"d4d50529-d814-4e89-9eaf-fae37ca01719\") " pod="openshift-monitoring/telemeter-client-59ff69d868-kzfwh" Apr 16 23:52:57.386514 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.386498 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d4d50529-d814-4e89-9eaf-fae37ca01719-secret-telemeter-client\") pod \"telemeter-client-59ff69d868-kzfwh\" (UID: \"d4d50529-d814-4e89-9eaf-fae37ca01719\") " pod="openshift-monitoring/telemeter-client-59ff69d868-kzfwh" Apr 16 23:52:57.386514 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.386515 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d4d50529-d814-4e89-9eaf-fae37ca01719-federate-client-tls\") pod \"telemeter-client-59ff69d868-kzfwh\" (UID: \"d4d50529-d814-4e89-9eaf-fae37ca01719\") " pod="openshift-monitoring/telemeter-client-59ff69d868-kzfwh" Apr 16 23:52:57.386802 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.386779 2562 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d4d50529-d814-4e89-9eaf-fae37ca01719-telemeter-client-tls\") pod \"telemeter-client-59ff69d868-kzfwh\" (UID: \"d4d50529-d814-4e89-9eaf-fae37ca01719\") " pod="openshift-monitoring/telemeter-client-59ff69d868-kzfwh" Apr 16 23:52:57.386896 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.386814 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d4d50529-d814-4e89-9eaf-fae37ca01719-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-59ff69d868-kzfwh\" (UID: \"d4d50529-d814-4e89-9eaf-fae37ca01719\") " pod="openshift-monitoring/telemeter-client-59ff69d868-kzfwh" Apr 16 23:52:57.386896 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.386871 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d4d50529-d814-4e89-9eaf-fae37ca01719-metrics-client-ca\") pod \"telemeter-client-59ff69d868-kzfwh\" (UID: \"d4d50529-d814-4e89-9eaf-fae37ca01719\") " pod="openshift-monitoring/telemeter-client-59ff69d868-kzfwh" Apr 16 23:52:57.387003 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.386905 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fsn8t\" (UniqueName: \"kubernetes.io/projected/d4d50529-d814-4e89-9eaf-fae37ca01719-kube-api-access-fsn8t\") pod \"telemeter-client-59ff69d868-kzfwh\" (UID: \"d4d50529-d814-4e89-9eaf-fae37ca01719\") " pod="openshift-monitoring/telemeter-client-59ff69d868-kzfwh" Apr 16 23:52:57.387003 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.386937 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4d50529-d814-4e89-9eaf-fae37ca01719-telemeter-trusted-ca-bundle\") pod \"telemeter-client-59ff69d868-kzfwh\" (UID: \"d4d50529-d814-4e89-9eaf-fae37ca01719\") " pod="openshift-monitoring/telemeter-client-59ff69d868-kzfwh" Apr 16 23:52:57.387638 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.387615 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d4d50529-d814-4e89-9eaf-fae37ca01719-metrics-client-ca\") pod \"telemeter-client-59ff69d868-kzfwh\" (UID: \"d4d50529-d814-4e89-9eaf-fae37ca01719\") " pod="openshift-monitoring/telemeter-client-59ff69d868-kzfwh" Apr 16 23:52:57.387884 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.387853 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4d50529-d814-4e89-9eaf-fae37ca01719-telemeter-trusted-ca-bundle\") pod \"telemeter-client-59ff69d868-kzfwh\" (UID: \"d4d50529-d814-4e89-9eaf-fae37ca01719\") " pod="openshift-monitoring/telemeter-client-59ff69d868-kzfwh" Apr 16 23:52:57.388029 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.388009 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4d50529-d814-4e89-9eaf-fae37ca01719-serving-certs-ca-bundle\") pod \"telemeter-client-59ff69d868-kzfwh\" (UID: \"d4d50529-d814-4e89-9eaf-fae37ca01719\") " pod="openshift-monitoring/telemeter-client-59ff69d868-kzfwh" Apr 16 23:52:57.389234 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.389211 2562 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d4d50529-d814-4e89-9eaf-fae37ca01719-telemeter-client-tls\") pod \"telemeter-client-59ff69d868-kzfwh\" (UID: \"d4d50529-d814-4e89-9eaf-fae37ca01719\") " pod="openshift-monitoring/telemeter-client-59ff69d868-kzfwh" Apr 16 23:52:57.389347 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.389321 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d4d50529-d814-4e89-9eaf-fae37ca01719-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-59ff69d868-kzfwh\" (UID: \"d4d50529-d814-4e89-9eaf-fae37ca01719\") " pod="openshift-monitoring/telemeter-client-59ff69d868-kzfwh" Apr 16 23:52:57.389410 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.389393 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d4d50529-d814-4e89-9eaf-fae37ca01719-federate-client-tls\") pod \"telemeter-client-59ff69d868-kzfwh\" (UID: \"d4d50529-d814-4e89-9eaf-fae37ca01719\") " pod="openshift-monitoring/telemeter-client-59ff69d868-kzfwh" Apr 16 23:52:57.389410 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.389403 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d4d50529-d814-4e89-9eaf-fae37ca01719-secret-telemeter-client\") pod \"telemeter-client-59ff69d868-kzfwh\" (UID: \"d4d50529-d814-4e89-9eaf-fae37ca01719\") " pod="openshift-monitoring/telemeter-client-59ff69d868-kzfwh" Apr 16 23:52:57.394007 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.393987 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsn8t\" (UniqueName: \"kubernetes.io/projected/d4d50529-d814-4e89-9eaf-fae37ca01719-kube-api-access-fsn8t\") pod \"telemeter-client-59ff69d868-kzfwh\" (UID: \"d4d50529-d814-4e89-9eaf-fae37ca01719\") " pod="openshift-monitoring/telemeter-client-59ff69d868-kzfwh" Apr 16 23:52:57.426015 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.425995 2562 util.go:30] "No sandbox for pod can be found. 
Apr 16 23:52:57.541363 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.541306 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-59ff69d868-kzfwh"]
Apr 16 23:52:57.543386 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:52:57.543358 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4d50529_d814_4e89_9eaf_fae37ca01719.slice/crio-57e26e8ffa138e0caa55ff75ab7cc267404438aeba4640b78609652e33c2298f WatchSource:0}: Error finding container 57e26e8ffa138e0caa55ff75ab7cc267404438aeba4640b78609652e33c2298f: Status 404 returned error can't find the container with id 57e26e8ffa138e0caa55ff75ab7cc267404438aeba4640b78609652e33c2298f
Apr 16 23:52:57.850568 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.850536 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-55898d7b96-zltwd" event={"ID":"d6a1a371-ed23-4e03-8408-d1ea20b07e24","Type":"ContainerStarted","Data":"7922aca7513eb7448209e0ee26f0a290f67fa6d0c03028c3d0ad5e2b3dbcf4ca"}
Apr 16 23:52:57.850568 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.850570 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-55898d7b96-zltwd" event={"ID":"d6a1a371-ed23-4e03-8408-d1ea20b07e24","Type":"ContainerStarted","Data":"58f5706a103b941cd5a35a529410f6f6797b464304c42f255a95c8806ac51e2c"}
Apr 16 23:52:57.850786 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.850580 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-55898d7b96-zltwd" event={"ID":"d6a1a371-ed23-4e03-8408-d1ea20b07e24","Type":"ContainerStarted","Data":"c22f819fd9612930e6f92b7ae60d1311c0698eb65a666beaa6be4b93bc01fe4f"}
Apr 16 23:52:57.851444 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:57.851421 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-59ff69d868-kzfwh" event={"ID":"d4d50529-d814-4e89-9eaf-fae37ca01719","Type":"ContainerStarted","Data":"57e26e8ffa138e0caa55ff75ab7cc267404438aeba4640b78609652e33c2298f"}
Apr 16 23:52:58.091227 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.091137 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 23:52:58.096545 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.095961 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:52:58.098890 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.098707 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 23:52:58.099549 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.099150 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-5z44f\""
Apr 16 23:52:58.099549 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.099289 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 16 23:52:58.099549 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.099392 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 16 23:52:58.099549 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.099411 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 16 23:52:58.099549 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.099150 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 16 23:52:58.099997 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.099975 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 16 23:52:58.101344 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.101284 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 16 23:52:58.101344 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.101308 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 16 23:52:58.101510 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.101436 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 16 23:52:58.101510 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.101487 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 16 23:52:58.101621 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.101530 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 16 23:52:58.101621 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.101603 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-3l2c35iqfuh2k\""
Apr 16 23:52:58.101727 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.101631 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 16 23:52:58.101881 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.101863 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 16 23:52:58.111212 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.111149 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 23:52:58.192344 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.192254 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:52:58.192344 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.192312 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:52:58.192344 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.192343 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:52:58.192632 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.192377 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:52:58.192632 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.192435 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1530be49-0a24-4da9-ac18-d8f5540c0309-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:52:58.192632 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.192510 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1530be49-0a24-4da9-ac18-d8f5540c0309-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:52:58.192632 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.192551 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwvs8\" (UniqueName: \"kubernetes.io/projected/1530be49-0a24-4da9-ac18-d8f5540c0309-kube-api-access-rwvs8\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:52:58.192632 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.192581 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:52:58.192632 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.192628 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1530be49-0a24-4da9-ac18-d8f5540c0309-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:52:58.193274 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.193249 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-config\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:52:58.193406 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.193310 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1530be49-0a24-4da9-ac18-d8f5540c0309-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:52:58.193406 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.193350 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1530be49-0a24-4da9-ac18-d8f5540c0309-config-out\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:52:58.193406 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.193372 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1530be49-0a24-4da9-ac18-d8f5540c0309-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:52:58.193552 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.193406 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:52:58.193552 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.193446 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1530be49-0a24-4da9-ac18-d8f5540c0309-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:52:58.193552 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.193512 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:52:58.193552 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.193541 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-web-config\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:52:58.193704 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.193571 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1530be49-0a24-4da9-ac18-d8f5540c0309-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:52:58.297426 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.295046 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1530be49-0a24-4da9-ac18-d8f5540c0309-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:52:58.297426 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.295100 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:52:58.297426 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.295162 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:52:58.297426 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.295203 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:52:58.297426 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.295230 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:52:58.297426 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.295282 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1530be49-0a24-4da9-ac18-d8f5540c0309-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:52:58.297426 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.295309 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1530be49-0a24-4da9-ac18-d8f5540c0309-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0"
\"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:52:58.297426 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.295332 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rwvs8\" (UniqueName: \"kubernetes.io/projected/1530be49-0a24-4da9-ac18-d8f5540c0309-kube-api-access-rwvs8\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:52:58.297426 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.295358 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:52:58.297426 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.295398 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1530be49-0a24-4da9-ac18-d8f5540c0309-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:52:58.297426 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.295441 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-config\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:52:58.297426 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.295485 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1530be49-0a24-4da9-ac18-d8f5540c0309-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:52:58.297426 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.295509 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1530be49-0a24-4da9-ac18-d8f5540c0309-config-out\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:52:58.297426 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.295532 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1530be49-0a24-4da9-ac18-d8f5540c0309-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:52:58.297426 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.295563 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:52:58.297426 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.295587 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/1530be49-0a24-4da9-ac18-d8f5540c0309-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:52:58.297426 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.295629 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:52:58.298414 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.295654 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-web-config\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:52:58.298414 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.295798 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1530be49-0a24-4da9-ac18-d8f5540c0309-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:52:58.298414 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.297151 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1530be49-0a24-4da9-ac18-d8f5540c0309-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:52:58.301383 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.301310 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1530be49-0a24-4da9-ac18-d8f5540c0309-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:52:58.302294 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.302212 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:52:58.302640 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.302610 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1530be49-0a24-4da9-ac18-d8f5540c0309-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:52:58.302862 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.302840 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1530be49-0a24-4da9-ac18-d8f5540c0309-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:52:58.303430 ip-10-0-128-98 
kubenswrapper[2562]: I0416 23:52:58.303408 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1530be49-0a24-4da9-ac18-d8f5540c0309-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:52:58.305645 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.305611 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1530be49-0a24-4da9-ac18-d8f5540c0309-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:52:58.306811 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.306162 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:52:58.306811 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.306290 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:52:58.306811 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.306687 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:52:58.307004 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.306837 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1530be49-0a24-4da9-ac18-d8f5540c0309-config-out\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:52:58.307337 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.307299 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:52:58.307758 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.307721 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-web-config\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:52:58.308447 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.308410 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-config\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 
23:52:58.310649 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.310612 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:52:58.311441 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.311300 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:52:58.324451 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.319311 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwvs8\" (UniqueName: \"kubernetes.io/projected/1530be49-0a24-4da9-ac18-d8f5540c0309-kube-api-access-rwvs8\") pod \"prometheus-k8s-0\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:52:58.414416 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.414388 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:52:58.575384 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.575355 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 23:52:58.855880 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.855836 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1530be49-0a24-4da9-ac18-d8f5540c0309","Type":"ContainerStarted","Data":"326a478e22e66197a6304a4fe252239f9844fe6d1b2e031a4b83c567317b8415"} Apr 16 23:52:58.858698 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.858670 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-55898d7b96-zltwd" event={"ID":"d6a1a371-ed23-4e03-8408-d1ea20b07e24","Type":"ContainerStarted","Data":"095a73e7fa29ee3a134b6f5b7678ad698eb6c24dc15f710c1f4dbd0b8e899f0c"} Apr 16 23:52:58.858698 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.858701 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-55898d7b96-zltwd" event={"ID":"d6a1a371-ed23-4e03-8408-d1ea20b07e24","Type":"ContainerStarted","Data":"5e4bd9e998b73bcdbdc292e6875c12aa7e8eae76510440effc27180a68e01b41"} Apr 16 23:52:58.858901 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.858710 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-55898d7b96-zltwd" event={"ID":"d6a1a371-ed23-4e03-8408-d1ea20b07e24","Type":"ContainerStarted","Data":"b3dddebf110cc2ca7211f074c8f03fbd0694ff153924ece383b02e90ed0345b8"} Apr 16 23:52:58.858957 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.858913 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-55898d7b96-zltwd" Apr 16 23:52:58.877865 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:58.877818 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-55898d7b96-zltwd" podStartSLOduration=2.316407676 podStartE2EDuration="5.877801459s" podCreationTimestamp="2026-04-16 23:52:53 +0000 UTC" firstStartedPulling="2026-04-16 
Apr 16 23:52:59.349595 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:59.349551 2562 scope.go:117] "RemoveContainer" containerID="6bb903dfc086a59598ef3350cc7075be2a9c8d22fee3949fe7c95b29a52a5a1f"
Apr 16 23:52:59.350053 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:52:59.349823 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-mrpzx_openshift-console-operator(5fddaa09-016b-490d-a9ab-9d50ff167b22)\"" pod="openshift-console-operator/console-operator-9d4b6777b-mrpzx" podUID="5fddaa09-016b-490d-a9ab-9d50ff167b22"
Apr 16 23:52:59.862448 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:59.862421 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-59ff69d868-kzfwh" event={"ID":"d4d50529-d814-4e89-9eaf-fae37ca01719","Type":"ContainerStarted","Data":"8090c17d383d66ff74ce0dcc20406097622c20b4cf2d44e4f5530cebd714ff62"}
Apr 16 23:52:59.862555 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:59.862459 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-59ff69d868-kzfwh" event={"ID":"d4d50529-d814-4e89-9eaf-fae37ca01719","Type":"ContainerStarted","Data":"023661f691a725306ab377e68b51fe3d8f0190423701f663e275980e5d4b3258"}
Apr 16 23:52:59.863584 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:59.863562 2562 generic.go:358] "Generic (PLEG): container finished" podID="1530be49-0a24-4da9-ac18-d8f5540c0309" containerID="25d2f4e7b7dc4d850d20932b1f6280064e2ea09f9e1433cc54da39aee3c6695c" exitCode=0
Apr 16 23:52:59.863671 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:52:59.863651 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1530be49-0a24-4da9-ac18-d8f5540c0309","Type":"ContainerDied","Data":"25d2f4e7b7dc4d850d20932b1f6280064e2ea09f9e1433cc54da39aee3c6695c"}
Apr 16 23:53:00.868799 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:00.868760 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-59ff69d868-kzfwh" event={"ID":"d4d50529-d814-4e89-9eaf-fae37ca01719","Type":"ContainerStarted","Data":"93af84c07a374089f5d2f0f32ec490a05603e43d86f85027264a790132a0716b"}
Apr 16 23:53:00.888221 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:00.888160 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-59ff69d868-kzfwh" podStartSLOduration=1.779239724 podStartE2EDuration="3.888146423s" podCreationTimestamp="2026-04-16 23:52:57 +0000 UTC" firstStartedPulling="2026-04-16 23:52:57.545250497 +0000 UTC m=+149.754015144" lastFinishedPulling="2026-04-16 23:52:59.654157182 +0000 UTC m=+151.862921843" observedRunningTime="2026-04-16 23:53:00.886532002 +0000 UTC m=+153.095296673" watchObservedRunningTime="2026-04-16 23:53:00.888146423 +0000 UTC m=+153.096911094"
Apr 16 23:53:02.881674 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:02.879881 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1530be49-0a24-4da9-ac18-d8f5540c0309","Type":"ContainerStarted","Data":"e0812048c7daf9a9592d147de4453bfc84b0cd1bf15f3b207a06c240f2448d81"}
Apr 16 23:53:02.881674 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:02.879926 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1530be49-0a24-4da9-ac18-d8f5540c0309","Type":"ContainerStarted","Data":"e96d4ee0cd9b82f57604f1626c7ea82fbfaf042321525c5d70233806581b7e16"}
Apr 16 23:53:02.881674 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:02.879944 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1530be49-0a24-4da9-ac18-d8f5540c0309","Type":"ContainerStarted","Data":"efefbeba9d8c1be5ccaa003ae27020b2385270a0a459d782ec5ddbcdec4c315a"}
Apr 16 23:53:03.682562 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:53:03.682522 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-dxzbw" podUID="66c9f5e2-4d93-4cf6-96de-a4ade7175b77"
Apr 16 23:53:03.700724 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:53:03.700701 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-495hb" podUID="6d46b010-8236-429d-af09-8fb8c4618d50"
Apr 16 23:53:03.885585 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:03.885547 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1530be49-0a24-4da9-ac18-d8f5540c0309","Type":"ContainerStarted","Data":"a7f6d681f403ffd60d2a730dae66e850386d26570dde62f0fd75b1fa5299bf4c"}
Apr 16 23:53:03.885585 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:03.885583 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dxzbw"
Apr 16 23:53:03.885963 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:03.885588 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1530be49-0a24-4da9-ac18-d8f5540c0309","Type":"ContainerStarted","Data":"0e1c7f1870c2a60939b929277865024b87ebce5e59885cd3b6748a052c3f165b"}
Apr 16 23:53:03.885963 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:03.885677 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1530be49-0a24-4da9-ac18-d8f5540c0309","Type":"ContainerStarted","Data":"dfaadcbb6438f06ce36a5ae7e5b00d2da1835a4506d044c6aeb8e3dfada3977d"}
Apr 16 23:53:03.911413 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:03.911357 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.841275894 podStartE2EDuration="5.91133982s" podCreationTimestamp="2026-04-16 23:52:58 +0000 UTC" firstStartedPulling="2026-04-16 23:52:58.591652637 +0000 UTC m=+150.800417285" lastFinishedPulling="2026-04-16 23:53:02.661716562 +0000 UTC m=+154.870481211" observedRunningTime="2026-04-16 23:53:03.908233917 +0000 UTC m=+156.116998586" watchObservedRunningTime="2026-04-16 23:53:03.91133982 +0000 UTC m=+156.120104491"
Apr 16 23:53:04.870181 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:04.870155 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-55898d7b96-zltwd"
Apr 16 23:53:05.034584 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:05.034552 2562 patch_prober.go:28] interesting pod/image-registry-7bf9785878-zjgxw container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 23:53:05.035062 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:05.034878 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-7bf9785878-zjgxw" podUID="88e7efd3-590c-4e09-8fdb-d9252264d3ad" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 23:53:08.415424 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:08.415388 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:53:08.787449 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:08.787417 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d46b010-8236-429d-af09-8fb8c4618d50-cert\") pod \"ingress-canary-495hb\" (UID: \"6d46b010-8236-429d-af09-8fb8c4618d50\") " pod="openshift-ingress-canary/ingress-canary-495hb"
Apr 16 23:53:08.787621 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:08.787457 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/66c9f5e2-4d93-4cf6-96de-a4ade7175b77-metrics-tls\") pod \"dns-default-dxzbw\" (UID: \"66c9f5e2-4d93-4cf6-96de-a4ade7175b77\") " pod="openshift-dns/dns-default-dxzbw"
Apr 16 23:53:08.789607 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:08.789586 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/66c9f5e2-4d93-4cf6-96de-a4ade7175b77-metrics-tls\") pod \"dns-default-dxzbw\" (UID: \"66c9f5e2-4d93-4cf6-96de-a4ade7175b77\") " pod="openshift-dns/dns-default-dxzbw"
Apr 16 23:53:08.789815 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:08.789794 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d46b010-8236-429d-af09-8fb8c4618d50-cert\") pod \"ingress-canary-495hb\" (UID: \"6d46b010-8236-429d-af09-8fb8c4618d50\") " pod="openshift-ingress-canary/ingress-canary-495hb"
Apr 16 23:53:08.988515 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:08.988487 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-jh2wm\""
Apr 16 23:53:08.997146 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:08.997127 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dxzbw"
Apr 16 23:53:09.111096 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:09.111062 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dxzbw"]
Apr 16 23:53:09.113712 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:53:09.113681 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66c9f5e2_4d93_4cf6_96de_a4ade7175b77.slice/crio-ef831229e5524f374a4aaa8b4e6013c656622cecbb4fe449ccfe10eae96da253 WatchSource:0}: Error finding container ef831229e5524f374a4aaa8b4e6013c656622cecbb4fe449ccfe10eae96da253: Status 404 returned error can't find the container with id ef831229e5524f374a4aaa8b4e6013c656622cecbb4fe449ccfe10eae96da253
Apr 16 23:53:09.906060 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:09.906016 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dxzbw" event={"ID":"66c9f5e2-4d93-4cf6-96de-a4ade7175b77","Type":"ContainerStarted","Data":"ef831229e5524f374a4aaa8b4e6013c656622cecbb4fe449ccfe10eae96da253"}
Apr 16 23:53:10.047484 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.047434 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7bf9785878-zjgxw" podUID="88e7efd3-590c-4e09-8fdb-d9252264d3ad" containerName="registry" containerID="cri-o://65660836d535580a49ddf3c3b68508d9d5573604c6d9f765666afd86d7687380" gracePeriod=30
Apr 16 23:53:10.454878 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.454855 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7bf9785878-zjgxw"
Apr 16 23:53:10.499222 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.499179 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/88e7efd3-590c-4e09-8fdb-d9252264d3ad-registry-tls\") pod \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\" (UID: \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\") "
Apr 16 23:53:10.499395 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.499378 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/88e7efd3-590c-4e09-8fdb-d9252264d3ad-ca-trust-extracted\") pod \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\" (UID: \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\") "
Apr 16 23:53:10.499501 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.499488 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/88e7efd3-590c-4e09-8fdb-d9252264d3ad-bound-sa-token\") pod \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\" (UID: \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\") "
Apr 16 23:53:10.499610 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.499596 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/88e7efd3-590c-4e09-8fdb-d9252264d3ad-image-registry-private-configuration\") pod \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\" (UID: \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\") "
Apr 16 23:53:10.499764 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.499752 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/88e7efd3-590c-4e09-8fdb-d9252264d3ad-trusted-ca\") pod \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\" (UID: \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\") "
Apr 16 23:53:10.499931 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.499918 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/88e7efd3-590c-4e09-8fdb-d9252264d3ad-installation-pull-secrets\") pod \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\" (UID: \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\") "
Apr 16 23:53:10.500022 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.500011 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9zvj\" (UniqueName: \"kubernetes.io/projected/88e7efd3-590c-4e09-8fdb-d9252264d3ad-kube-api-access-b9zvj\") pod \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\" (UID: \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\") "
Apr 16 23:53:10.500115 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.500104 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/88e7efd3-590c-4e09-8fdb-d9252264d3ad-registry-certificates\") pod \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\" (UID: \"88e7efd3-590c-4e09-8fdb-d9252264d3ad\") "
Apr 16 23:53:10.501023 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.500708 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88e7efd3-590c-4e09-8fdb-d9252264d3ad-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "88e7efd3-590c-4e09-8fdb-d9252264d3ad" (UID: "88e7efd3-590c-4e09-8fdb-d9252264d3ad"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 23:53:10.501023 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.500764 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88e7efd3-590c-4e09-8fdb-d9252264d3ad-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "88e7efd3-590c-4e09-8fdb-d9252264d3ad" (UID: "88e7efd3-590c-4e09-8fdb-d9252264d3ad"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 23:53:10.501660 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.501631 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88e7efd3-590c-4e09-8fdb-d9252264d3ad-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "88e7efd3-590c-4e09-8fdb-d9252264d3ad" (UID: "88e7efd3-590c-4e09-8fdb-d9252264d3ad"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 23:53:10.507953 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.507908 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88e7efd3-590c-4e09-8fdb-d9252264d3ad-kube-api-access-b9zvj" (OuterVolumeSpecName: "kube-api-access-b9zvj") pod "88e7efd3-590c-4e09-8fdb-d9252264d3ad" (UID: "88e7efd3-590c-4e09-8fdb-d9252264d3ad"). InnerVolumeSpecName "kube-api-access-b9zvj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 23:53:10.507953 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.507934 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88e7efd3-590c-4e09-8fdb-d9252264d3ad-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "88e7efd3-590c-4e09-8fdb-d9252264d3ad" (UID: "88e7efd3-590c-4e09-8fdb-d9252264d3ad"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 23:53:10.508231 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.508181 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88e7efd3-590c-4e09-8fdb-d9252264d3ad-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "88e7efd3-590c-4e09-8fdb-d9252264d3ad" (UID: "88e7efd3-590c-4e09-8fdb-d9252264d3ad"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 23:53:10.508735 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.508541 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88e7efd3-590c-4e09-8fdb-d9252264d3ad-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "88e7efd3-590c-4e09-8fdb-d9252264d3ad" (UID: "88e7efd3-590c-4e09-8fdb-d9252264d3ad"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 23:53:10.511637 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.511615 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88e7efd3-590c-4e09-8fdb-d9252264d3ad-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "88e7efd3-590c-4e09-8fdb-d9252264d3ad" (UID: "88e7efd3-590c-4e09-8fdb-d9252264d3ad"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 23:53:10.601341 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.601319 2562 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/88e7efd3-590c-4e09-8fdb-d9252264d3ad-trusted-ca\") on node \"ip-10-0-128-98.ec2.internal\" DevicePath \"\""
Apr 16 23:53:10.601341 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.601343 2562 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/88e7efd3-590c-4e09-8fdb-d9252264d3ad-installation-pull-secrets\") on node \"ip-10-0-128-98.ec2.internal\" DevicePath \"\""
Apr 16 23:53:10.601466 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.601353 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b9zvj\" (UniqueName: \"kubernetes.io/projected/88e7efd3-590c-4e09-8fdb-d9252264d3ad-kube-api-access-b9zvj\") on node \"ip-10-0-128-98.ec2.internal\" DevicePath \"\""
Apr 16 23:53:10.601466 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.601362 2562 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/88e7efd3-590c-4e09-8fdb-d9252264d3ad-registry-certificates\") on node \"ip-10-0-128-98.ec2.internal\" DevicePath \"\""
Apr 16 23:53:10.601466 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.601372 2562 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/88e7efd3-590c-4e09-8fdb-d9252264d3ad-registry-tls\") on node \"ip-10-0-128-98.ec2.internal\" DevicePath \"\""
Apr 16 23:53:10.601466 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.601380 2562 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/88e7efd3-590c-4e09-8fdb-d9252264d3ad-ca-trust-extracted\") on node \"ip-10-0-128-98.ec2.internal\" DevicePath \"\""
Apr 16 23:53:10.601466 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.601388 2562 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/88e7efd3-590c-4e09-8fdb-d9252264d3ad-bound-sa-token\") on node \"ip-10-0-128-98.ec2.internal\" DevicePath \"\""
Apr 16 23:53:10.601466 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.601398 2562 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/88e7efd3-590c-4e09-8fdb-d9252264d3ad-image-registry-private-configuration\") on node \"ip-10-0-128-98.ec2.internal\" DevicePath \"\""
Apr 16 23:53:10.909694 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.909612 2562 generic.go:358] "Generic (PLEG): container finished" podID="88e7efd3-590c-4e09-8fdb-d9252264d3ad" containerID="65660836d535580a49ddf3c3b68508d9d5573604c6d9f765666afd86d7687380" exitCode=0
Apr 16 23:53:10.910131 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.909697 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7bf9785878-zjgxw"
Need to start a new one" pod="openshift-image-registry/image-registry-7bf9785878-zjgxw" Apr 16 23:53:10.910131 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.909697 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7bf9785878-zjgxw" event={"ID":"88e7efd3-590c-4e09-8fdb-d9252264d3ad","Type":"ContainerDied","Data":"65660836d535580a49ddf3c3b68508d9d5573604c6d9f765666afd86d7687380"} Apr 16 23:53:10.910131 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.909739 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7bf9785878-zjgxw" event={"ID":"88e7efd3-590c-4e09-8fdb-d9252264d3ad","Type":"ContainerDied","Data":"9ec215506c5acb81f8cd4b2875f53057f42f3f7e0667ffa54dce1e061dddb40e"} Apr 16 23:53:10.910131 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.909759 2562 scope.go:117] "RemoveContainer" containerID="65660836d535580a49ddf3c3b68508d9d5573604c6d9f765666afd86d7687380" Apr 16 23:53:10.911433 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.911411 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dxzbw" event={"ID":"66c9f5e2-4d93-4cf6-96de-a4ade7175b77","Type":"ContainerStarted","Data":"45ade71d1001a31e2a3032d8f91e58abffe2c2512cce6d165ff0a1c89788afc8"} Apr 16 23:53:10.911433 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.911436 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dxzbw" event={"ID":"66c9f5e2-4d93-4cf6-96de-a4ade7175b77","Type":"ContainerStarted","Data":"f010f0e700f369fc8664d18cf8babf306d0aa54f4b3de913720b0ca488e3f567"} Apr 16 23:53:10.911621 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.911551 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-dxzbw" Apr 16 23:53:10.918053 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.918039 2562 scope.go:117] "RemoveContainer" containerID="65660836d535580a49ddf3c3b68508d9d5573604c6d9f765666afd86d7687380" Apr 16 23:53:10.918328 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:53:10.918311 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65660836d535580a49ddf3c3b68508d9d5573604c6d9f765666afd86d7687380\": container with ID starting with 65660836d535580a49ddf3c3b68508d9d5573604c6d9f765666afd86d7687380 not found: ID does not exist" containerID="65660836d535580a49ddf3c3b68508d9d5573604c6d9f765666afd86d7687380" Apr 16 23:53:10.918387 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.918336 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65660836d535580a49ddf3c3b68508d9d5573604c6d9f765666afd86d7687380"} err="failed to get container status \"65660836d535580a49ddf3c3b68508d9d5573604c6d9f765666afd86d7687380\": rpc error: code = NotFound desc = could not find container \"65660836d535580a49ddf3c3b68508d9d5573604c6d9f765666afd86d7687380\": container with ID starting with 65660836d535580a49ddf3c3b68508d9d5573604c6d9f765666afd86d7687380 not found: ID does not exist" Apr 16 23:53:10.927388 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.927352 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-dxzbw" podStartSLOduration=129.705594497 podStartE2EDuration="2m10.92734033s" podCreationTimestamp="2026-04-16 23:51:00 +0000 UTC" firstStartedPulling="2026-04-16 23:53:09.115717528 +0000 UTC m=+161.324482176" lastFinishedPulling="2026-04-16 
23:53:10.337463359 +0000 UTC m=+162.546228009" observedRunningTime="2026-04-16 23:53:10.9260909 +0000 UTC m=+163.134855569" watchObservedRunningTime="2026-04-16 23:53:10.92734033 +0000 UTC m=+163.136105001" Apr 16 23:53:10.939513 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.939448 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7bf9785878-zjgxw"] Apr 16 23:53:10.941104 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:10.941083 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7bf9785878-zjgxw"] Apr 16 23:53:12.352956 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:12.352927 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88e7efd3-590c-4e09-8fdb-d9252264d3ad" path="/var/lib/kubelet/pods/88e7efd3-590c-4e09-8fdb-d9252264d3ad/volumes" Apr 16 23:53:14.348909 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:14.348880 2562 scope.go:117] "RemoveContainer" containerID="6bb903dfc086a59598ef3350cc7075be2a9c8d22fee3949fe7c95b29a52a5a1f" Apr 16 23:53:14.926795 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:14.926766 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrpzx_5fddaa09-016b-490d-a9ab-9d50ff167b22/console-operator/2.log" Apr 16 23:53:14.926987 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:14.926869 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-mrpzx" event={"ID":"5fddaa09-016b-490d-a9ab-9d50ff167b22","Type":"ContainerStarted","Data":"240aa4296844e0ad1873e1b70636f3c8c3230b68559d558f53f5bc7e7e2a7fa0"} Apr 16 23:53:14.929035 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:14.929004 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-mrpzx" Apr 16 23:53:14.934825 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:14.934803 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-mrpzx" Apr 16 23:53:14.943470 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:14.943427 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-mrpzx" podStartSLOduration=56.091973367 podStartE2EDuration="58.943414485s" podCreationTimestamp="2026-04-16 23:52:16 +0000 UTC" firstStartedPulling="2026-04-16 23:52:16.687390186 +0000 UTC m=+108.896154833" lastFinishedPulling="2026-04-16 23:52:19.538831298 +0000 UTC m=+111.747595951" observedRunningTime="2026-04-16 23:53:14.941736351 +0000 UTC m=+167.150501022" watchObservedRunningTime="2026-04-16 23:53:14.943414485 +0000 UTC m=+167.152179201" Apr 16 23:53:15.348507 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:15.348476 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-495hb" Apr 16 23:53:15.350657 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:15.350636 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-6grx4\"" Apr 16 23:53:15.359213 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:15.359185 2562 util.go:30] "No sandbox for pod can be found. 
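
The pod_startup_latency_tracker entries above report two figures per pod: podStartE2EDuration (creation to observed running) and a smaller podStartSLOduration. The SLO figure works out to the E2E duration minus the image-pull window measured on the monotonic clock (the m=+... offsets), which suggests pull time is excluded from the SLO measurement; for console-operator-9d4b6777b-mrpzx above: 58.943414485 − (111.747595951 − 108.896154833) = 56.091973367. The same arithmetic in Go, using values taken verbatim from that entry:

package main

import "fmt"

func main() {
	const (
		e2e                 = 58.943414485  // podStartE2EDuration, seconds
		firstStartedPulling = 108.896154833 // m=+ monotonic offset, seconds
		lastFinishedPulling = 111.747595951 // m=+ monotonic offset, seconds
	)
	// SLO duration = end-to-end duration minus the image-pull window.
	slo := e2e - (lastFinishedPulling - firstStartedPulling)
	fmt.Printf("podStartSLOduration = %.9f\n", slo) // 56.091973367, matching the log
}

The dns-default-dxzbw entry above checks out the same way: 130.927340330 − (162.546228009 − 161.324482176) = 129.705594497.
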
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-495hb" Apr 16 23:53:15.472920 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:15.472897 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-495hb"] Apr 16 23:53:15.475473 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:53:15.475439 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d46b010_8236_429d_af09_8fb8c4618d50.slice/crio-991d2001f852c8426c3cf38ecca2e18c932eb12a55abf99b5aab3251283703fe WatchSource:0}: Error finding container 991d2001f852c8426c3cf38ecca2e18c932eb12a55abf99b5aab3251283703fe: Status 404 returned error can't find the container with id 991d2001f852c8426c3cf38ecca2e18c932eb12a55abf99b5aab3251283703fe Apr 16 23:53:15.932058 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:15.932020 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-495hb" event={"ID":"6d46b010-8236-429d-af09-8fb8c4618d50","Type":"ContainerStarted","Data":"991d2001f852c8426c3cf38ecca2e18c932eb12a55abf99b5aab3251283703fe"} Apr 16 23:53:17.940841 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:17.940805 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-495hb" event={"ID":"6d46b010-8236-429d-af09-8fb8c4618d50","Type":"ContainerStarted","Data":"efb8faeeebbc90406f92dfa3e9c21b19af511c7442a3d1d1eb76216223c8bde2"} Apr 16 23:53:17.954259 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:17.954217 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-495hb" podStartSLOduration=136.379110415 podStartE2EDuration="2m17.954183493s" podCreationTimestamp="2026-04-16 23:51:00 +0000 UTC" firstStartedPulling="2026-04-16 23:53:15.477344285 +0000 UTC m=+167.686108938" lastFinishedPulling="2026-04-16 23:53:17.052417366 +0000 UTC m=+169.261182016" observedRunningTime="2026-04-16 23:53:17.953617349 +0000 UTC m=+170.162382020" watchObservedRunningTime="2026-04-16 23:53:17.954183493 +0000 UTC m=+170.162948162" Apr 16 23:53:20.916969 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:20.916936 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-dxzbw" Apr 16 23:53:40.009481 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:40.009451 2562 generic.go:358] "Generic (PLEG): container finished" podID="c2e5d2fe-09a0-4110-9c80-2ae937a2a115" containerID="348a43c729ccd19a91cff613350f4324c3b647629677cb253d20ecca50dba10a" exitCode=0 Apr 16 23:53:40.009840 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:40.009527 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-c7mwx" event={"ID":"c2e5d2fe-09a0-4110-9c80-2ae937a2a115","Type":"ContainerDied","Data":"348a43c729ccd19a91cff613350f4324c3b647629677cb253d20ecca50dba10a"} Apr 16 23:53:40.009889 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:40.009854 2562 scope.go:117] "RemoveContainer" containerID="348a43c729ccd19a91cff613350f4324c3b647629677cb253d20ecca50dba10a" Apr 16 23:53:41.013668 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:41.013636 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-c7mwx" 
event={"ID":"c2e5d2fe-09a0-4110-9c80-2ae937a2a115","Type":"ContainerStarted","Data":"a45a0fe42664a7bedb96e46c141897c29815d84da07db36d8ae11a30589802f0"} Apr 16 23:53:46.029303 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:46.029268 2562 generic.go:358] "Generic (PLEG): container finished" podID="e44cc99f-6cf4-4c00-ac07-e08e682c6db6" containerID="0e9d29d995a8760eedd390ea19d4785cbbe2d54e7e6839e9f43174cc11ce2dc4" exitCode=0 Apr 16 23:53:46.029667 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:46.029325 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-prmqz" event={"ID":"e44cc99f-6cf4-4c00-ac07-e08e682c6db6","Type":"ContainerDied","Data":"0e9d29d995a8760eedd390ea19d4785cbbe2d54e7e6839e9f43174cc11ce2dc4"} Apr 16 23:53:46.029667 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:46.029650 2562 scope.go:117] "RemoveContainer" containerID="0e9d29d995a8760eedd390ea19d4785cbbe2d54e7e6839e9f43174cc11ce2dc4" Apr 16 23:53:47.033505 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:47.033474 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-prmqz" event={"ID":"e44cc99f-6cf4-4c00-ac07-e08e682c6db6","Type":"ContainerStarted","Data":"b25da3489c20c366b242967b88572acddafc8f6455cf9756244fa9a64fac4230"} Apr 16 23:53:48.178641 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:48.178613 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-xcgwl_f349e78a-8ba4-44d0-86d8-b96de7477daa/kube-state-metrics/0.log" Apr 16 23:53:48.378427 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:48.378398 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-xcgwl_f349e78a-8ba4-44d0-86d8-b96de7477daa/kube-rbac-proxy-main/0.log" Apr 16 23:53:48.578539 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:48.578512 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-xcgwl_f349e78a-8ba4-44d0-86d8-b96de7477daa/kube-rbac-proxy-self/0.log" Apr 16 23:53:49.178838 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:49.178809 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bf25v_d6890379-66f0-4714-b0f4-da3de91cbdc8/init-textfile/0.log" Apr 16 23:53:49.379328 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:49.379300 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bf25v_d6890379-66f0-4714-b0f4-da3de91cbdc8/node-exporter/0.log" Apr 16 23:53:49.580227 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:49.580180 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bf25v_d6890379-66f0-4714-b0f4-da3de91cbdc8/kube-rbac-proxy/0.log" Apr 16 23:53:50.979067 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:50.979040 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-c4nhb_75c3d93e-0ccd-45ed-87ab-1abfec9fc614/kube-rbac-proxy-main/0.log" Apr 16 23:53:51.177990 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:51.177960 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-c4nhb_75c3d93e-0ccd-45ed-87ab-1abfec9fc614/kube-rbac-proxy-self/0.log" Apr 16 23:53:51.379054 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:51.378985 2562 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-c4nhb_75c3d93e-0ccd-45ed-87ab-1abfec9fc614/openshift-state-metrics/0.log" Apr 16 23:53:51.577908 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:51.577880 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1530be49-0a24-4da9-ac18-d8f5540c0309/init-config-reloader/0.log" Apr 16 23:53:51.780328 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:51.780301 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1530be49-0a24-4da9-ac18-d8f5540c0309/prometheus/0.log" Apr 16 23:53:51.978653 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:51.978627 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1530be49-0a24-4da9-ac18-d8f5540c0309/config-reloader/0.log" Apr 16 23:53:52.178860 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:52.178791 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1530be49-0a24-4da9-ac18-d8f5540c0309/thanos-sidecar/0.log" Apr 16 23:53:52.378106 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:52.378080 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1530be49-0a24-4da9-ac18-d8f5540c0309/kube-rbac-proxy-web/0.log" Apr 16 23:53:52.578544 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:52.578521 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1530be49-0a24-4da9-ac18-d8f5540c0309/kube-rbac-proxy/0.log" Apr 16 23:53:52.778109 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:52.778061 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1530be49-0a24-4da9-ac18-d8f5540c0309/kube-rbac-proxy-thanos/0.log" Apr 16 23:53:52.980124 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:52.980094 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-dh4k6_04c6d30a-a073-440c-bd31-f046f4e9c445/prometheus-operator/0.log" Apr 16 23:53:53.178341 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:53.178307 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-dh4k6_04c6d30a-a073-440c-bd31-f046f4e9c445/kube-rbac-proxy/0.log" Apr 16 23:53:53.578953 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:53.578928 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-59ff69d868-kzfwh_d4d50529-d814-4e89-9eaf-fae37ca01719/telemeter-client/0.log" Apr 16 23:53:53.778857 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:53.778834 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-59ff69d868-kzfwh_d4d50529-d814-4e89-9eaf-fae37ca01719/reload/0.log" Apr 16 23:53:53.978845 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:53.978814 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-59ff69d868-kzfwh_d4d50529-d814-4e89-9eaf-fae37ca01719/kube-rbac-proxy/0.log" Apr 16 23:53:54.178575 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:54.178544 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-55898d7b96-zltwd_d6a1a371-ed23-4e03-8408-d1ea20b07e24/thanos-query/0.log" Apr 16 23:53:54.378769 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:54.378690 2562 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-55898d7b96-zltwd_d6a1a371-ed23-4e03-8408-d1ea20b07e24/kube-rbac-proxy-web/0.log" Apr 16 23:53:54.579178 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:54.579146 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-55898d7b96-zltwd_d6a1a371-ed23-4e03-8408-d1ea20b07e24/kube-rbac-proxy/0.log" Apr 16 23:53:54.778009 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:54.777981 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-55898d7b96-zltwd_d6a1a371-ed23-4e03-8408-d1ea20b07e24/prom-label-proxy/0.log" Apr 16 23:53:54.978727 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:54.978699 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-55898d7b96-zltwd_d6a1a371-ed23-4e03-8408-d1ea20b07e24/kube-rbac-proxy-rules/0.log" Apr 16 23:53:55.178211 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:55.178106 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-55898d7b96-zltwd_d6a1a371-ed23-4e03-8408-d1ea20b07e24/kube-rbac-proxy-metrics/0.log" Apr 16 23:53:55.578872 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:55.578842 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrpzx_5fddaa09-016b-490d-a9ab-9d50ff167b22/console-operator/2.log" Apr 16 23:53:55.780287 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:55.780260 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrpzx_5fddaa09-016b-490d-a9ab-9d50ff167b22/console-operator/3.log" Apr 16 23:53:58.414810 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:58.414779 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:53:58.434426 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:58.434400 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:53:59.084592 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:53:59.084562 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:16.431103 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:16.431070 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 23:54:16.432330 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:16.432272 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1530be49-0a24-4da9-ac18-d8f5540c0309" containerName="prometheus" containerID="cri-o://efefbeba9d8c1be5ccaa003ae27020b2385270a0a459d782ec5ddbcdec4c315a" gracePeriod=600 Apr 16 23:54:16.432795 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:16.432765 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1530be49-0a24-4da9-ac18-d8f5540c0309" containerName="kube-rbac-proxy-thanos" containerID="cri-o://a7f6d681f403ffd60d2a730dae66e850386d26570dde62f0fd75b1fa5299bf4c" gracePeriod=600 Apr 16 23:54:16.432992 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:16.432974 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1530be49-0a24-4da9-ac18-d8f5540c0309" containerName="kube-rbac-proxy" 
containerID="cri-o://0e1c7f1870c2a60939b929277865024b87ebce5e59885cd3b6748a052c3f165b" gracePeriod=600 Apr 16 23:54:16.433149 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:16.433130 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1530be49-0a24-4da9-ac18-d8f5540c0309" containerName="kube-rbac-proxy-web" containerID="cri-o://dfaadcbb6438f06ce36a5ae7e5b00d2da1835a4506d044c6aeb8e3dfada3977d" gracePeriod=600 Apr 16 23:54:16.434397 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:16.433420 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1530be49-0a24-4da9-ac18-d8f5540c0309" containerName="config-reloader" containerID="cri-o://e96d4ee0cd9b82f57604f1626c7ea82fbfaf042321525c5d70233806581b7e16" gracePeriod=600 Apr 16 23:54:16.434397 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:16.433467 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1530be49-0a24-4da9-ac18-d8f5540c0309" containerName="thanos-sidecar" containerID="cri-o://e0812048c7daf9a9592d147de4453bfc84b0cd1bf15f3b207a06c240f2448d81" gracePeriod=600 Apr 16 23:54:17.120468 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.120434 2562 generic.go:358] "Generic (PLEG): container finished" podID="1530be49-0a24-4da9-ac18-d8f5540c0309" containerID="a7f6d681f403ffd60d2a730dae66e850386d26570dde62f0fd75b1fa5299bf4c" exitCode=0 Apr 16 23:54:17.120468 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.120461 2562 generic.go:358] "Generic (PLEG): container finished" podID="1530be49-0a24-4da9-ac18-d8f5540c0309" containerID="0e1c7f1870c2a60939b929277865024b87ebce5e59885cd3b6748a052c3f165b" exitCode=0 Apr 16 23:54:17.120468 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.120468 2562 generic.go:358] "Generic (PLEG): container finished" podID="1530be49-0a24-4da9-ac18-d8f5540c0309" containerID="e0812048c7daf9a9592d147de4453bfc84b0cd1bf15f3b207a06c240f2448d81" exitCode=0 Apr 16 23:54:17.120468 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.120473 2562 generic.go:358] "Generic (PLEG): container finished" podID="1530be49-0a24-4da9-ac18-d8f5540c0309" containerID="e96d4ee0cd9b82f57604f1626c7ea82fbfaf042321525c5d70233806581b7e16" exitCode=0 Apr 16 23:54:17.120468 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.120478 2562 generic.go:358] "Generic (PLEG): container finished" podID="1530be49-0a24-4da9-ac18-d8f5540c0309" containerID="efefbeba9d8c1be5ccaa003ae27020b2385270a0a459d782ec5ddbcdec4c315a" exitCode=0 Apr 16 23:54:17.120739 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.120501 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1530be49-0a24-4da9-ac18-d8f5540c0309","Type":"ContainerDied","Data":"a7f6d681f403ffd60d2a730dae66e850386d26570dde62f0fd75b1fa5299bf4c"} Apr 16 23:54:17.120739 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.120532 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1530be49-0a24-4da9-ac18-d8f5540c0309","Type":"ContainerDied","Data":"0e1c7f1870c2a60939b929277865024b87ebce5e59885cd3b6748a052c3f165b"} Apr 16 23:54:17.120739 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.120543 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"1530be49-0a24-4da9-ac18-d8f5540c0309","Type":"ContainerDied","Data":"e0812048c7daf9a9592d147de4453bfc84b0cd1bf15f3b207a06c240f2448d81"} Apr 16 23:54:17.120739 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.120551 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1530be49-0a24-4da9-ac18-d8f5540c0309","Type":"ContainerDied","Data":"e96d4ee0cd9b82f57604f1626c7ea82fbfaf042321525c5d70233806581b7e16"} Apr 16 23:54:17.120739 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.120560 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1530be49-0a24-4da9-ac18-d8f5540c0309","Type":"ContainerDied","Data":"efefbeba9d8c1be5ccaa003ae27020b2385270a0a459d782ec5ddbcdec4c315a"} Apr 16 23:54:17.663424 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.663401 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:17.737345 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.737313 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1530be49-0a24-4da9-ac18-d8f5540c0309-prometheus-k8s-rulefiles-0\") pod \"1530be49-0a24-4da9-ac18-d8f5540c0309\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " Apr 16 23:54:17.737505 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.737363 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1530be49-0a24-4da9-ac18-d8f5540c0309-configmap-metrics-client-ca\") pod \"1530be49-0a24-4da9-ac18-d8f5540c0309\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " Apr 16 23:54:17.737505 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.737388 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1530be49-0a24-4da9-ac18-d8f5540c0309-config-out\") pod \"1530be49-0a24-4da9-ac18-d8f5540c0309\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " Apr 16 23:54:17.737505 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.737418 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1530be49-0a24-4da9-ac18-d8f5540c0309-configmap-kubelet-serving-ca-bundle\") pod \"1530be49-0a24-4da9-ac18-d8f5540c0309\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " Apr 16 23:54:17.737505 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.737451 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"1530be49-0a24-4da9-ac18-d8f5540c0309\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " Apr 16 23:54:17.737505 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.737485 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-secret-prometheus-k8s-tls\") pod \"1530be49-0a24-4da9-ac18-d8f5540c0309\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " Apr 16 23:54:17.737750 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.737707 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-secret-kube-rbac-proxy\") pod \"1530be49-0a24-4da9-ac18-d8f5540c0309\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " Apr 16 23:54:17.737810 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.737752 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-secret-metrics-client-certs\") pod \"1530be49-0a24-4da9-ac18-d8f5540c0309\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " Apr 16 23:54:17.737810 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.737790 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"1530be49-0a24-4da9-ac18-d8f5540c0309\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " Apr 16 23:54:17.737927 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.737836 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-secret-grpc-tls\") pod \"1530be49-0a24-4da9-ac18-d8f5540c0309\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " Apr 16 23:54:17.737927 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.737864 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1530be49-0a24-4da9-ac18-d8f5540c0309-tls-assets\") pod \"1530be49-0a24-4da9-ac18-d8f5540c0309\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " Apr 16 23:54:17.737927 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.737896 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1530be49-0a24-4da9-ac18-d8f5540c0309-configmap-serving-certs-ca-bundle\") pod \"1530be49-0a24-4da9-ac18-d8f5540c0309\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " Apr 16 23:54:17.737927 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.737920 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1530be49-0a24-4da9-ac18-d8f5540c0309-prometheus-k8s-db\") pod \"1530be49-0a24-4da9-ac18-d8f5540c0309\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " Apr 16 23:54:17.738120 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.737959 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1530be49-0a24-4da9-ac18-d8f5540c0309-prometheus-trusted-ca-bundle\") pod \"1530be49-0a24-4da9-ac18-d8f5540c0309\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " Apr 16 23:54:17.738120 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.737987 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-thanos-prometheus-http-client-file\") pod \"1530be49-0a24-4da9-ac18-d8f5540c0309\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " Apr 16 23:54:17.738120 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.738014 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-rwvs8\" (UniqueName: \"kubernetes.io/projected/1530be49-0a24-4da9-ac18-d8f5540c0309-kube-api-access-rwvs8\") pod \"1530be49-0a24-4da9-ac18-d8f5540c0309\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " Apr 16 23:54:17.738120 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.738044 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-web-config\") pod \"1530be49-0a24-4da9-ac18-d8f5540c0309\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " Apr 16 23:54:17.738120 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.738080 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-config\") pod \"1530be49-0a24-4da9-ac18-d8f5540c0309\" (UID: \"1530be49-0a24-4da9-ac18-d8f5540c0309\") " Apr 16 23:54:17.738845 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.737857 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1530be49-0a24-4da9-ac18-d8f5540c0309-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "1530be49-0a24-4da9-ac18-d8f5540c0309" (UID: "1530be49-0a24-4da9-ac18-d8f5540c0309"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:54:17.738911 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.737859 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1530be49-0a24-4da9-ac18-d8f5540c0309-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "1530be49-0a24-4da9-ac18-d8f5540c0309" (UID: "1530be49-0a24-4da9-ac18-d8f5540c0309"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:54:17.739280 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.739251 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1530be49-0a24-4da9-ac18-d8f5540c0309-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "1530be49-0a24-4da9-ac18-d8f5540c0309" (UID: "1530be49-0a24-4da9-ac18-d8f5540c0309"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:54:17.741351 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.739997 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1530be49-0a24-4da9-ac18-d8f5540c0309-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "1530be49-0a24-4da9-ac18-d8f5540c0309" (UID: "1530be49-0a24-4da9-ac18-d8f5540c0309"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:54:17.741351 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.740366 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "1530be49-0a24-4da9-ac18-d8f5540c0309" (UID: "1530be49-0a24-4da9-ac18-d8f5540c0309"). InnerVolumeSpecName "secret-metrics-client-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:54:17.741351 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.740435 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "1530be49-0a24-4da9-ac18-d8f5540c0309" (UID: "1530be49-0a24-4da9-ac18-d8f5540c0309"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:54:17.741351 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.740488 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1530be49-0a24-4da9-ac18-d8f5540c0309-config-out" (OuterVolumeSpecName: "config-out") pod "1530be49-0a24-4da9-ac18-d8f5540c0309" (UID: "1530be49-0a24-4da9-ac18-d8f5540c0309"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:54:17.741351 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.740542 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1530be49-0a24-4da9-ac18-d8f5540c0309-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "1530be49-0a24-4da9-ac18-d8f5540c0309" (UID: "1530be49-0a24-4da9-ac18-d8f5540c0309"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:54:17.741351 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.740795 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "1530be49-0a24-4da9-ac18-d8f5540c0309" (UID: "1530be49-0a24-4da9-ac18-d8f5540c0309"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:54:17.741351 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.740823 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1530be49-0a24-4da9-ac18-d8f5540c0309-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "1530be49-0a24-4da9-ac18-d8f5540c0309" (UID: "1530be49-0a24-4da9-ac18-d8f5540c0309"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:54:17.741351 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.741214 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "1530be49-0a24-4da9-ac18-d8f5540c0309" (UID: "1530be49-0a24-4da9-ac18-d8f5540c0309"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:54:17.742274 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.741695 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "1530be49-0a24-4da9-ac18-d8f5540c0309" (UID: "1530be49-0a24-4da9-ac18-d8f5540c0309"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:54:17.742274 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.742048 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "1530be49-0a24-4da9-ac18-d8f5540c0309" (UID: "1530be49-0a24-4da9-ac18-d8f5540c0309"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:54:17.742406 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.742384 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1530be49-0a24-4da9-ac18-d8f5540c0309-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "1530be49-0a24-4da9-ac18-d8f5540c0309" (UID: "1530be49-0a24-4da9-ac18-d8f5540c0309"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:54:17.742502 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.742485 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "1530be49-0a24-4da9-ac18-d8f5540c0309" (UID: "1530be49-0a24-4da9-ac18-d8f5540c0309"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:54:17.742563 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.742547 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-config" (OuterVolumeSpecName: "config") pod "1530be49-0a24-4da9-ac18-d8f5540c0309" (UID: "1530be49-0a24-4da9-ac18-d8f5540c0309"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:54:17.743017 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.742997 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1530be49-0a24-4da9-ac18-d8f5540c0309-kube-api-access-rwvs8" (OuterVolumeSpecName: "kube-api-access-rwvs8") pod "1530be49-0a24-4da9-ac18-d8f5540c0309" (UID: "1530be49-0a24-4da9-ac18-d8f5540c0309"). InnerVolumeSpecName "kube-api-access-rwvs8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:54:17.754269 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.754245 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-web-config" (OuterVolumeSpecName: "web-config") pod "1530be49-0a24-4da9-ac18-d8f5540c0309" (UID: "1530be49-0a24-4da9-ac18-d8f5540c0309"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:54:17.838629 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.838608 2562 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1530be49-0a24-4da9-ac18-d8f5540c0309-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-128-98.ec2.internal\" DevicePath \"\"" Apr 16 23:54:17.838629 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.838627 2562 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1530be49-0a24-4da9-ac18-d8f5540c0309-configmap-metrics-client-ca\") on node \"ip-10-0-128-98.ec2.internal\" DevicePath \"\"" Apr 16 23:54:17.838737 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.838639 2562 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1530be49-0a24-4da9-ac18-d8f5540c0309-config-out\") on node \"ip-10-0-128-98.ec2.internal\" DevicePath \"\"" Apr 16 23:54:17.838737 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.838648 2562 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1530be49-0a24-4da9-ac18-d8f5540c0309-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-128-98.ec2.internal\" DevicePath \"\"" Apr 16 23:54:17.838737 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.838660 2562 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-128-98.ec2.internal\" DevicePath \"\"" Apr 16 23:54:17.838737 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.838669 2562 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-secret-prometheus-k8s-tls\") on node \"ip-10-0-128-98.ec2.internal\" DevicePath \"\"" Apr 16 23:54:17.838737 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.838679 2562 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-secret-kube-rbac-proxy\") on node \"ip-10-0-128-98.ec2.internal\" DevicePath \"\"" Apr 16 23:54:17.838737 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.838686 2562 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-secret-metrics-client-certs\") on node \"ip-10-0-128-98.ec2.internal\" DevicePath \"\"" Apr 16 23:54:17.838737 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.838697 2562 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-128-98.ec2.internal\" DevicePath \"\"" Apr 16 23:54:17.838737 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.838717 2562 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-secret-grpc-tls\") on node \"ip-10-0-128-98.ec2.internal\" DevicePath \"\"" Apr 16 23:54:17.838737 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.838725 2562 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" 
(UniqueName: \"kubernetes.io/projected/1530be49-0a24-4da9-ac18-d8f5540c0309-tls-assets\") on node \"ip-10-0-128-98.ec2.internal\" DevicePath \"\"" Apr 16 23:54:17.838737 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.838734 2562 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1530be49-0a24-4da9-ac18-d8f5540c0309-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-128-98.ec2.internal\" DevicePath \"\"" Apr 16 23:54:17.839017 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.838743 2562 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1530be49-0a24-4da9-ac18-d8f5540c0309-prometheus-k8s-db\") on node \"ip-10-0-128-98.ec2.internal\" DevicePath \"\"" Apr 16 23:54:17.839017 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.838752 2562 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1530be49-0a24-4da9-ac18-d8f5540c0309-prometheus-trusted-ca-bundle\") on node \"ip-10-0-128-98.ec2.internal\" DevicePath \"\"" Apr 16 23:54:17.839017 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.838760 2562 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-thanos-prometheus-http-client-file\") on node \"ip-10-0-128-98.ec2.internal\" DevicePath \"\"" Apr 16 23:54:17.839017 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.838769 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rwvs8\" (UniqueName: \"kubernetes.io/projected/1530be49-0a24-4da9-ac18-d8f5540c0309-kube-api-access-rwvs8\") on node \"ip-10-0-128-98.ec2.internal\" DevicePath \"\"" Apr 16 23:54:17.839017 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.838779 2562 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-web-config\") on node \"ip-10-0-128-98.ec2.internal\" DevicePath \"\"" Apr 16 23:54:17.839017 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:17.838787 2562 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1530be49-0a24-4da9-ac18-d8f5540c0309-config\") on node \"ip-10-0-128-98.ec2.internal\" DevicePath \"\"" Apr 16 23:54:18.126444 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.126372 2562 generic.go:358] "Generic (PLEG): container finished" podID="1530be49-0a24-4da9-ac18-d8f5540c0309" containerID="dfaadcbb6438f06ce36a5ae7e5b00d2da1835a4506d044c6aeb8e3dfada3977d" exitCode=0 Apr 16 23:54:18.126581 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.126442 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1530be49-0a24-4da9-ac18-d8f5540c0309","Type":"ContainerDied","Data":"dfaadcbb6438f06ce36a5ae7e5b00d2da1835a4506d044c6aeb8e3dfada3977d"} Apr 16 23:54:18.126581 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.126474 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.126581 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.126492 2562 scope.go:117] "RemoveContainer" containerID="a7f6d681f403ffd60d2a730dae66e850386d26570dde62f0fd75b1fa5299bf4c" Apr 16 23:54:18.126712 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.126478 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1530be49-0a24-4da9-ac18-d8f5540c0309","Type":"ContainerDied","Data":"326a478e22e66197a6304a4fe252239f9844fe6d1b2e031a4b83c567317b8415"} Apr 16 23:54:18.136552 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.136479 2562 scope.go:117] "RemoveContainer" containerID="0e1c7f1870c2a60939b929277865024b87ebce5e59885cd3b6748a052c3f165b" Apr 16 23:54:18.143332 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.143317 2562 scope.go:117] "RemoveContainer" containerID="dfaadcbb6438f06ce36a5ae7e5b00d2da1835a4506d044c6aeb8e3dfada3977d" Apr 16 23:54:18.150092 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.150072 2562 scope.go:117] "RemoveContainer" containerID="e0812048c7daf9a9592d147de4453bfc84b0cd1bf15f3b207a06c240f2448d81" Apr 16 23:54:18.150976 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.150917 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 23:54:18.154640 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.154619 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 23:54:18.157091 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.157071 2562 scope.go:117] "RemoveContainer" containerID="e96d4ee0cd9b82f57604f1626c7ea82fbfaf042321525c5d70233806581b7e16" Apr 16 23:54:18.163529 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.163513 2562 scope.go:117] "RemoveContainer" containerID="efefbeba9d8c1be5ccaa003ae27020b2385270a0a459d782ec5ddbcdec4c315a" Apr 16 23:54:18.170156 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.170142 2562 scope.go:117] "RemoveContainer" containerID="25d2f4e7b7dc4d850d20932b1f6280064e2ea09f9e1433cc54da39aee3c6695c" Apr 16 23:54:18.176233 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.176217 2562 scope.go:117] "RemoveContainer" containerID="a7f6d681f403ffd60d2a730dae66e850386d26570dde62f0fd75b1fa5299bf4c" Apr 16 23:54:18.176484 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:54:18.176465 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7f6d681f403ffd60d2a730dae66e850386d26570dde62f0fd75b1fa5299bf4c\": container with ID starting with a7f6d681f403ffd60d2a730dae66e850386d26570dde62f0fd75b1fa5299bf4c not found: ID does not exist" containerID="a7f6d681f403ffd60d2a730dae66e850386d26570dde62f0fd75b1fa5299bf4c" Apr 16 23:54:18.176544 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.176492 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7f6d681f403ffd60d2a730dae66e850386d26570dde62f0fd75b1fa5299bf4c"} err="failed to get container status \"a7f6d681f403ffd60d2a730dae66e850386d26570dde62f0fd75b1fa5299bf4c\": rpc error: code = NotFound desc = could not find container \"a7f6d681f403ffd60d2a730dae66e850386d26570dde62f0fd75b1fa5299bf4c\": container with ID starting with a7f6d681f403ffd60d2a730dae66e850386d26570dde62f0fd75b1fa5299bf4c not found: ID does not exist" Apr 16 23:54:18.176544 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.176510 2562 scope.go:117] "RemoveContainer" 
containerID="0e1c7f1870c2a60939b929277865024b87ebce5e59885cd3b6748a052c3f165b" Apr 16 23:54:18.176729 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:54:18.176712 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e1c7f1870c2a60939b929277865024b87ebce5e59885cd3b6748a052c3f165b\": container with ID starting with 0e1c7f1870c2a60939b929277865024b87ebce5e59885cd3b6748a052c3f165b not found: ID does not exist" containerID="0e1c7f1870c2a60939b929277865024b87ebce5e59885cd3b6748a052c3f165b" Apr 16 23:54:18.176771 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.176742 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e1c7f1870c2a60939b929277865024b87ebce5e59885cd3b6748a052c3f165b"} err="failed to get container status \"0e1c7f1870c2a60939b929277865024b87ebce5e59885cd3b6748a052c3f165b\": rpc error: code = NotFound desc = could not find container \"0e1c7f1870c2a60939b929277865024b87ebce5e59885cd3b6748a052c3f165b\": container with ID starting with 0e1c7f1870c2a60939b929277865024b87ebce5e59885cd3b6748a052c3f165b not found: ID does not exist" Apr 16 23:54:18.176771 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.176756 2562 scope.go:117] "RemoveContainer" containerID="dfaadcbb6438f06ce36a5ae7e5b00d2da1835a4506d044c6aeb8e3dfada3977d" Apr 16 23:54:18.176985 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:54:18.176970 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfaadcbb6438f06ce36a5ae7e5b00d2da1835a4506d044c6aeb8e3dfada3977d\": container with ID starting with dfaadcbb6438f06ce36a5ae7e5b00d2da1835a4506d044c6aeb8e3dfada3977d not found: ID does not exist" containerID="dfaadcbb6438f06ce36a5ae7e5b00d2da1835a4506d044c6aeb8e3dfada3977d" Apr 16 23:54:18.177032 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.176989 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfaadcbb6438f06ce36a5ae7e5b00d2da1835a4506d044c6aeb8e3dfada3977d"} err="failed to get container status \"dfaadcbb6438f06ce36a5ae7e5b00d2da1835a4506d044c6aeb8e3dfada3977d\": rpc error: code = NotFound desc = could not find container \"dfaadcbb6438f06ce36a5ae7e5b00d2da1835a4506d044c6aeb8e3dfada3977d\": container with ID starting with dfaadcbb6438f06ce36a5ae7e5b00d2da1835a4506d044c6aeb8e3dfada3977d not found: ID does not exist" Apr 16 23:54:18.177032 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.177003 2562 scope.go:117] "RemoveContainer" containerID="e0812048c7daf9a9592d147de4453bfc84b0cd1bf15f3b207a06c240f2448d81" Apr 16 23:54:18.177236 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:54:18.177214 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0812048c7daf9a9592d147de4453bfc84b0cd1bf15f3b207a06c240f2448d81\": container with ID starting with e0812048c7daf9a9592d147de4453bfc84b0cd1bf15f3b207a06c240f2448d81 not found: ID does not exist" containerID="e0812048c7daf9a9592d147de4453bfc84b0cd1bf15f3b207a06c240f2448d81" Apr 16 23:54:18.177339 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.177239 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0812048c7daf9a9592d147de4453bfc84b0cd1bf15f3b207a06c240f2448d81"} err="failed to get container status \"e0812048c7daf9a9592d147de4453bfc84b0cd1bf15f3b207a06c240f2448d81\": rpc error: code = NotFound desc = could not find 
container \"e0812048c7daf9a9592d147de4453bfc84b0cd1bf15f3b207a06c240f2448d81\": container with ID starting with e0812048c7daf9a9592d147de4453bfc84b0cd1bf15f3b207a06c240f2448d81 not found: ID does not exist" Apr 16 23:54:18.177339 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.177259 2562 scope.go:117] "RemoveContainer" containerID="e96d4ee0cd9b82f57604f1626c7ea82fbfaf042321525c5d70233806581b7e16" Apr 16 23:54:18.177523 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:54:18.177506 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e96d4ee0cd9b82f57604f1626c7ea82fbfaf042321525c5d70233806581b7e16\": container with ID starting with e96d4ee0cd9b82f57604f1626c7ea82fbfaf042321525c5d70233806581b7e16 not found: ID does not exist" containerID="e96d4ee0cd9b82f57604f1626c7ea82fbfaf042321525c5d70233806581b7e16" Apr 16 23:54:18.177580 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.177528 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e96d4ee0cd9b82f57604f1626c7ea82fbfaf042321525c5d70233806581b7e16"} err="failed to get container status \"e96d4ee0cd9b82f57604f1626c7ea82fbfaf042321525c5d70233806581b7e16\": rpc error: code = NotFound desc = could not find container \"e96d4ee0cd9b82f57604f1626c7ea82fbfaf042321525c5d70233806581b7e16\": container with ID starting with e96d4ee0cd9b82f57604f1626c7ea82fbfaf042321525c5d70233806581b7e16 not found: ID does not exist" Apr 16 23:54:18.177580 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.177548 2562 scope.go:117] "RemoveContainer" containerID="efefbeba9d8c1be5ccaa003ae27020b2385270a0a459d782ec5ddbcdec4c315a" Apr 16 23:54:18.178093 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:54:18.178064 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efefbeba9d8c1be5ccaa003ae27020b2385270a0a459d782ec5ddbcdec4c315a\": container with ID starting with efefbeba9d8c1be5ccaa003ae27020b2385270a0a459d782ec5ddbcdec4c315a not found: ID does not exist" containerID="efefbeba9d8c1be5ccaa003ae27020b2385270a0a459d782ec5ddbcdec4c315a" Apr 16 23:54:18.178184 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.178098 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efefbeba9d8c1be5ccaa003ae27020b2385270a0a459d782ec5ddbcdec4c315a"} err="failed to get container status \"efefbeba9d8c1be5ccaa003ae27020b2385270a0a459d782ec5ddbcdec4c315a\": rpc error: code = NotFound desc = could not find container \"efefbeba9d8c1be5ccaa003ae27020b2385270a0a459d782ec5ddbcdec4c315a\": container with ID starting with efefbeba9d8c1be5ccaa003ae27020b2385270a0a459d782ec5ddbcdec4c315a not found: ID does not exist" Apr 16 23:54:18.178184 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.178112 2562 scope.go:117] "RemoveContainer" containerID="25d2f4e7b7dc4d850d20932b1f6280064e2ea09f9e1433cc54da39aee3c6695c" Apr 16 23:54:18.178446 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:54:18.178408 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25d2f4e7b7dc4d850d20932b1f6280064e2ea09f9e1433cc54da39aee3c6695c\": container with ID starting with 25d2f4e7b7dc4d850d20932b1f6280064e2ea09f9e1433cc54da39aee3c6695c not found: ID does not exist" containerID="25d2f4e7b7dc4d850d20932b1f6280064e2ea09f9e1433cc54da39aee3c6695c" Apr 16 23:54:18.178528 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.178447 2562 
Apr 16 23:54:18.180015 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.179997 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 23:54:18.180492 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.180476 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1530be49-0a24-4da9-ac18-d8f5540c0309" containerName="kube-rbac-proxy-web"
Apr 16 23:54:18.180534 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.180495 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="1530be49-0a24-4da9-ac18-d8f5540c0309" containerName="kube-rbac-proxy-web"
Apr 16 23:54:18.180534 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.180510 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1530be49-0a24-4da9-ac18-d8f5540c0309" containerName="thanos-sidecar"
Apr 16 23:54:18.180534 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.180519 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="1530be49-0a24-4da9-ac18-d8f5540c0309" containerName="thanos-sidecar"
Apr 16 23:54:18.180627 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.180533 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1530be49-0a24-4da9-ac18-d8f5540c0309" containerName="kube-rbac-proxy"
Apr 16 23:54:18.180627 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.180541 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="1530be49-0a24-4da9-ac18-d8f5540c0309" containerName="kube-rbac-proxy"
Apr 16 23:54:18.180627 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.180551 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1530be49-0a24-4da9-ac18-d8f5540c0309" containerName="kube-rbac-proxy-thanos"
Apr 16 23:54:18.180627 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.180559 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="1530be49-0a24-4da9-ac18-d8f5540c0309" containerName="kube-rbac-proxy-thanos"
Apr 16 23:54:18.180627 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.180580 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1530be49-0a24-4da9-ac18-d8f5540c0309" containerName="config-reloader"
Apr 16 23:54:18.180627 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.180589 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="1530be49-0a24-4da9-ac18-d8f5540c0309" containerName="config-reloader"
Apr 16 23:54:18.180627 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.180604 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1530be49-0a24-4da9-ac18-d8f5540c0309" containerName="init-config-reloader"
Apr 16 23:54:18.180627 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.180612 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="1530be49-0a24-4da9-ac18-d8f5540c0309" containerName="init-config-reloader"
Apr 16 23:54:18.180627 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.180623 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1530be49-0a24-4da9-ac18-d8f5540c0309" containerName="prometheus"
removing container" podUID="1530be49-0a24-4da9-ac18-d8f5540c0309" containerName="prometheus" Apr 16 23:54:18.180945 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.180632 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="1530be49-0a24-4da9-ac18-d8f5540c0309" containerName="prometheus" Apr 16 23:54:18.180945 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.180648 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88e7efd3-590c-4e09-8fdb-d9252264d3ad" containerName="registry" Apr 16 23:54:18.180945 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.180656 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="88e7efd3-590c-4e09-8fdb-d9252264d3ad" containerName="registry" Apr 16 23:54:18.180945 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.180721 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="1530be49-0a24-4da9-ac18-d8f5540c0309" containerName="config-reloader" Apr 16 23:54:18.180945 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.180736 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="1530be49-0a24-4da9-ac18-d8f5540c0309" containerName="kube-rbac-proxy-web" Apr 16 23:54:18.180945 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.180745 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="88e7efd3-590c-4e09-8fdb-d9252264d3ad" containerName="registry" Apr 16 23:54:18.180945 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.180754 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="1530be49-0a24-4da9-ac18-d8f5540c0309" containerName="kube-rbac-proxy" Apr 16 23:54:18.180945 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.180763 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="1530be49-0a24-4da9-ac18-d8f5540c0309" containerName="kube-rbac-proxy-thanos" Apr 16 23:54:18.180945 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.180774 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="1530be49-0a24-4da9-ac18-d8f5540c0309" containerName="prometheus" Apr 16 23:54:18.180945 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.180794 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="1530be49-0a24-4da9-ac18-d8f5540c0309" containerName="thanos-sidecar" Apr 16 23:54:18.186771 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.186752 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.188830 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.188811 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 23:54:18.188913 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.188887 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-3l2c35iqfuh2k\"" Apr 16 23:54:18.189290 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.189268 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 23:54:18.189438 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.189423 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 23:54:18.189509 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.189469 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-5z44f\"" Apr 16 23:54:18.189509 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.189482 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 23:54:18.189509 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.189490 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 23:54:18.189509 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.189500 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 23:54:18.189697 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.189558 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 23:54:18.189741 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.189724 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 23:54:18.189794 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.189780 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 23:54:18.189843 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.189828 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 23:54:18.189893 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.189841 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 23:54:18.192896 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.192727 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 23:54:18.194895 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.194749 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 23:54:18.195433 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.195412 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 23:54:18.240842 ip-10-0-128-98 
Apr 16 23:54:18.240957 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.240847 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/12db5959-3135-4d08-a456-a00777f84b9e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:54:18.240957 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.240870 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/12db5959-3135-4d08-a456-a00777f84b9e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:54:18.240957 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.240938 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12db5959-3135-4d08-a456-a00777f84b9e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:54:18.241068 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.240972 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/12db5959-3135-4d08-a456-a00777f84b9e-config-out\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:54:18.241068 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.240994 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/12db5959-3135-4d08-a456-a00777f84b9e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:54:18.241068 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.241010 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/12db5959-3135-4d08-a456-a00777f84b9e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:54:18.241068 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.241030 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/12db5959-3135-4d08-a456-a00777f84b9e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:54:18.241231 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.241093 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12db5959-3135-4d08-a456-a00777f84b9e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0"
started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12db5959-3135-4d08-a456-a00777f84b9e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.241231 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.241121 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/12db5959-3135-4d08-a456-a00777f84b9e-web-config\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.241231 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.241137 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/12db5959-3135-4d08-a456-a00777f84b9e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.241231 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.241155 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/12db5959-3135-4d08-a456-a00777f84b9e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.241231 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.241172 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/12db5959-3135-4d08-a456-a00777f84b9e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.241231 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.241227 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/12db5959-3135-4d08-a456-a00777f84b9e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.241431 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.241252 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/12db5959-3135-4d08-a456-a00777f84b9e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.241431 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.241302 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t4j9\" (UniqueName: \"kubernetes.io/projected/12db5959-3135-4d08-a456-a00777f84b9e-kube-api-access-7t4j9\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.241431 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.241339 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/12db5959-3135-4d08-a456-a00777f84b9e-config\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.241431 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.241357 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/12db5959-3135-4d08-a456-a00777f84b9e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.341690 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.341661 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/12db5959-3135-4d08-a456-a00777f84b9e-config\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.341847 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.341704 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/12db5959-3135-4d08-a456-a00777f84b9e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.341847 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.341725 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12db5959-3135-4d08-a456-a00777f84b9e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.341847 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.341753 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/12db5959-3135-4d08-a456-a00777f84b9e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.341847 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.341772 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/12db5959-3135-4d08-a456-a00777f84b9e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.341847 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.341789 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12db5959-3135-4d08-a456-a00777f84b9e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.341847 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.341810 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/12db5959-3135-4d08-a456-a00777f84b9e-config-out\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.341847 ip-10-0-128-98 kubenswrapper[2562]: I0416 
Apr 16 23:54:18.342177 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.341867 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/12db5959-3135-4d08-a456-a00777f84b9e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:54:18.342177 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.341898 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/12db5959-3135-4d08-a456-a00777f84b9e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:54:18.342177 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.341964 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12db5959-3135-4d08-a456-a00777f84b9e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:54:18.342177 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.341999 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/12db5959-3135-4d08-a456-a00777f84b9e-web-config\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:54:18.342177 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.342022 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/12db5959-3135-4d08-a456-a00777f84b9e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:54:18.342177 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.342052 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/12db5959-3135-4d08-a456-a00777f84b9e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:54:18.342177 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.342080 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/12db5959-3135-4d08-a456-a00777f84b9e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:54:18.342554 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.342180 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/12db5959-3135-4d08-a456-a00777f84b9e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0"
\"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.342554 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.342397 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/12db5959-3135-4d08-a456-a00777f84b9e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.342689 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.342665 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12db5959-3135-4d08-a456-a00777f84b9e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.342747 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.342711 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/12db5959-3135-4d08-a456-a00777f84b9e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.342806 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.342790 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12db5959-3135-4d08-a456-a00777f84b9e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.343167 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.343135 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12db5959-3135-4d08-a456-a00777f84b9e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.343167 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.343159 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/12db5959-3135-4d08-a456-a00777f84b9e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.343337 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.343226 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7t4j9\" (UniqueName: \"kubernetes.io/projected/12db5959-3135-4d08-a456-a00777f84b9e-kube-api-access-7t4j9\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.344926 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.344901 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/12db5959-3135-4d08-a456-a00777f84b9e-config\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.345117 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.345095 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/12db5959-3135-4d08-a456-a00777f84b9e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.346212 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.346074 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/12db5959-3135-4d08-a456-a00777f84b9e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.346701 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.346549 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/12db5959-3135-4d08-a456-a00777f84b9e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.346701 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.346639 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/12db5959-3135-4d08-a456-a00777f84b9e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.346848 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.346731 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/12db5959-3135-4d08-a456-a00777f84b9e-config-out\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.346848 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.346820 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/12db5959-3135-4d08-a456-a00777f84b9e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.346956 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.346843 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/12db5959-3135-4d08-a456-a00777f84b9e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.347015 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.346954 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/12db5959-3135-4d08-a456-a00777f84b9e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.347361 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.347331 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/12db5959-3135-4d08-a456-a00777f84b9e-web-config\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.348004 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.347983 2562 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/12db5959-3135-4d08-a456-a00777f84b9e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.348970 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.348948 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/12db5959-3135-4d08-a456-a00777f84b9e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.350475 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.350456 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t4j9\" (UniqueName: \"kubernetes.io/projected/12db5959-3135-4d08-a456-a00777f84b9e-kube-api-access-7t4j9\") pod \"prometheus-k8s-0\" (UID: \"12db5959-3135-4d08-a456-a00777f84b9e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.357582 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.357549 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1530be49-0a24-4da9-ac18-d8f5540c0309" path="/var/lib/kubelet/pods/1530be49-0a24-4da9-ac18-d8f5540c0309/volumes" Apr 16 23:54:18.497386 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.497357 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:18.621647 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:18.621477 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 23:54:18.624215 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:54:18.624174 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12db5959_3135_4d08_a456_a00777f84b9e.slice/crio-bfd812a1a30905212d8110c6b386d9b38f4e28757edfb7566252aa881909a813 WatchSource:0}: Error finding container bfd812a1a30905212d8110c6b386d9b38f4e28757edfb7566252aa881909a813: Status 404 returned error can't find the container with id bfd812a1a30905212d8110c6b386d9b38f4e28757edfb7566252aa881909a813 Apr 16 23:54:19.131114 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:19.131076 2562 generic.go:358] "Generic (PLEG): container finished" podID="12db5959-3135-4d08-a456-a00777f84b9e" containerID="efb466f92309adc9d6009e20624210955ef9f5aeff949b221da9e9ccee3de44e" exitCode=0 Apr 16 23:54:19.131479 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:19.131167 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"12db5959-3135-4d08-a456-a00777f84b9e","Type":"ContainerDied","Data":"efb466f92309adc9d6009e20624210955ef9f5aeff949b221da9e9ccee3de44e"} Apr 16 23:54:19.131479 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:19.131220 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"12db5959-3135-4d08-a456-a00777f84b9e","Type":"ContainerStarted","Data":"bfd812a1a30905212d8110c6b386d9b38f4e28757edfb7566252aa881909a813"} Apr 16 23:54:20.137186 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:20.137155 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"12db5959-3135-4d08-a456-a00777f84b9e","Type":"ContainerStarted","Data":"79ef6aecd30712454317f3432525a1d63b7461ee561832c28d54db683177ed24"} Apr 16 23:54:20.137186 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:20.137206 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"12db5959-3135-4d08-a456-a00777f84b9e","Type":"ContainerStarted","Data":"3f0ff1c499f07d79091401d8e66b42dc6c1d0361edf2ff6ae3fbd46f52942041"} Apr 16 23:54:20.137575 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:20.137220 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"12db5959-3135-4d08-a456-a00777f84b9e","Type":"ContainerStarted","Data":"0353cf393502ead61d52806c5e1b8d24456d7b52fd3ffa3f6c52558002ee551e"} Apr 16 23:54:20.137575 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:20.137229 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"12db5959-3135-4d08-a456-a00777f84b9e","Type":"ContainerStarted","Data":"0b44da8f9193eb937c6fca6d3835140b2c56853d77d95d01e851be7538f2ca57"} Apr 16 23:54:20.137575 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:20.137237 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"12db5959-3135-4d08-a456-a00777f84b9e","Type":"ContainerStarted","Data":"cdb623d20e89064e0c478d9075b53b7a73e9aa2c72c0a0ab46e2733a7f6fec3b"} Apr 16 23:54:20.137575 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:20.137246 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"12db5959-3135-4d08-a456-a00777f84b9e","Type":"ContainerStarted","Data":"c7177b58abf36f3cd6a9b8de30cd539b5671203f69ac16ec9b605672f3eb92f8"} Apr 16 23:54:20.161889 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:20.161842 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.16182558 podStartE2EDuration="2.16182558s" podCreationTimestamp="2026-04-16 23:54:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:54:20.159586424 +0000 UTC m=+232.368351129" watchObservedRunningTime="2026-04-16 23:54:20.16182558 +0000 UTC m=+232.370590251" Apr 16 23:54:23.497506 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:23.497476 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:54:55.809595 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:55.809561 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-rnrxc"] Apr 16 23:54:55.813105 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:55.813085 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rnrxc" Apr 16 23:54:55.815271 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:55.815252 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 23:54:55.818541 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:55.818514 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-rnrxc"] Apr 16 23:54:55.924652 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:55.924625 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/78ecb6de-eebd-4a1c-a6d7-206ea9999103-original-pull-secret\") pod \"global-pull-secret-syncer-rnrxc\" (UID: \"78ecb6de-eebd-4a1c-a6d7-206ea9999103\") " pod="kube-system/global-pull-secret-syncer-rnrxc" Apr 16 23:54:55.924770 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:55.924693 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/78ecb6de-eebd-4a1c-a6d7-206ea9999103-kubelet-config\") pod \"global-pull-secret-syncer-rnrxc\" (UID: \"78ecb6de-eebd-4a1c-a6d7-206ea9999103\") " pod="kube-system/global-pull-secret-syncer-rnrxc" Apr 16 23:54:55.924770 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:55.924720 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/78ecb6de-eebd-4a1c-a6d7-206ea9999103-dbus\") pod \"global-pull-secret-syncer-rnrxc\" (UID: \"78ecb6de-eebd-4a1c-a6d7-206ea9999103\") " pod="kube-system/global-pull-secret-syncer-rnrxc" Apr 16 23:54:56.025943 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:56.025906 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/78ecb6de-eebd-4a1c-a6d7-206ea9999103-dbus\") pod \"global-pull-secret-syncer-rnrxc\" (UID: \"78ecb6de-eebd-4a1c-a6d7-206ea9999103\") " pod="kube-system/global-pull-secret-syncer-rnrxc" Apr 16 23:54:56.026076 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:56.025962 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/78ecb6de-eebd-4a1c-a6d7-206ea9999103-original-pull-secret\") pod \"global-pull-secret-syncer-rnrxc\" (UID: \"78ecb6de-eebd-4a1c-a6d7-206ea9999103\") " pod="kube-system/global-pull-secret-syncer-rnrxc" Apr 16 23:54:56.026076 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:56.026026 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/78ecb6de-eebd-4a1c-a6d7-206ea9999103-kubelet-config\") pod \"global-pull-secret-syncer-rnrxc\" (UID: \"78ecb6de-eebd-4a1c-a6d7-206ea9999103\") " pod="kube-system/global-pull-secret-syncer-rnrxc" Apr 16 23:54:56.026145 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:56.026104 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/78ecb6de-eebd-4a1c-a6d7-206ea9999103-dbus\") pod \"global-pull-secret-syncer-rnrxc\" (UID: \"78ecb6de-eebd-4a1c-a6d7-206ea9999103\") " pod="kube-system/global-pull-secret-syncer-rnrxc" Apr 16 23:54:56.026145 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:56.026105 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" 
(UniqueName: \"kubernetes.io/host-path/78ecb6de-eebd-4a1c-a6d7-206ea9999103-kubelet-config\") pod \"global-pull-secret-syncer-rnrxc\" (UID: \"78ecb6de-eebd-4a1c-a6d7-206ea9999103\") " pod="kube-system/global-pull-secret-syncer-rnrxc" Apr 16 23:54:56.028238 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:56.028216 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/78ecb6de-eebd-4a1c-a6d7-206ea9999103-original-pull-secret\") pod \"global-pull-secret-syncer-rnrxc\" (UID: \"78ecb6de-eebd-4a1c-a6d7-206ea9999103\") " pod="kube-system/global-pull-secret-syncer-rnrxc" Apr 16 23:54:56.122987 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:56.122936 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rnrxc" Apr 16 23:54:56.234604 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:56.234524 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-rnrxc"] Apr 16 23:54:56.236814 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:54:56.236787 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78ecb6de_eebd_4a1c_a6d7_206ea9999103.slice/crio-5dc0644e79c28c7e7d7821e8c4313846e533016986380f2752e1822068bfaf2e WatchSource:0}: Error finding container 5dc0644e79c28c7e7d7821e8c4313846e533016986380f2752e1822068bfaf2e: Status 404 returned error can't find the container with id 5dc0644e79c28c7e7d7821e8c4313846e533016986380f2752e1822068bfaf2e Apr 16 23:54:56.240501 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:54:56.240474 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-rnrxc" event={"ID":"78ecb6de-eebd-4a1c-a6d7-206ea9999103","Type":"ContainerStarted","Data":"5dc0644e79c28c7e7d7821e8c4313846e533016986380f2752e1822068bfaf2e"} Apr 16 23:55:00.254483 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:55:00.254447 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-rnrxc" event={"ID":"78ecb6de-eebd-4a1c-a6d7-206ea9999103","Type":"ContainerStarted","Data":"2ccba36fe7963dbd394f8d9ec043fa328e8e6b635fee8d198562a567a783b7f5"} Apr 16 23:55:00.270096 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:55:00.270043 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-rnrxc" podStartSLOduration=1.742288529 podStartE2EDuration="5.270025163s" podCreationTimestamp="2026-04-16 23:54:55 +0000 UTC" firstStartedPulling="2026-04-16 23:54:56.238373783 +0000 UTC m=+268.447138431" lastFinishedPulling="2026-04-16 23:54:59.766110414 +0000 UTC m=+271.974875065" observedRunningTime="2026-04-16 23:55:00.269035501 +0000 UTC m=+272.477800171" watchObservedRunningTime="2026-04-16 23:55:00.270025163 +0000 UTC m=+272.478789835" Apr 16 23:55:18.497824 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:55:18.497784 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:55:18.512718 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:55:18.512691 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:55:19.322118 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:55:19.322091 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:55:28.262856 
Apr 16 23:55:28.263460 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:55:28.263169 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrpzx_5fddaa09-016b-490d-a9ab-9d50ff167b22/console-operator/2.log"
Apr 16 23:55:50.365217 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:55:50.365166 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-4p47t"]
Apr 16 23:55:50.368536 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:55:50.368517 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-4p47t"
Apr 16 23:55:50.370559 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:55:50.370538 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 16 23:55:50.370665 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:55:50.370536 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-5zxc8\""
Apr 16 23:55:50.371157 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:55:50.371142 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 16 23:55:50.374714 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:55:50.374692 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-4p47t"]
Apr 16 23:55:50.433046 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:55:50.433020 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f3068172-bd48-41a9-82aa-a6114fc0fd73-tmp\") pod \"openshift-lws-operator-bfc7f696d-4p47t\" (UID: \"f3068172-bd48-41a9-82aa-a6114fc0fd73\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-4p47t"
Apr 16 23:55:50.433168 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:55:50.433146 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sch9j\" (UniqueName: \"kubernetes.io/projected/f3068172-bd48-41a9-82aa-a6114fc0fd73-kube-api-access-sch9j\") pod \"openshift-lws-operator-bfc7f696d-4p47t\" (UID: \"f3068172-bd48-41a9-82aa-a6114fc0fd73\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-4p47t"
Apr 16 23:55:50.533553 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:55:50.533518 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sch9j\" (UniqueName: \"kubernetes.io/projected/f3068172-bd48-41a9-82aa-a6114fc0fd73-kube-api-access-sch9j\") pod \"openshift-lws-operator-bfc7f696d-4p47t\" (UID: \"f3068172-bd48-41a9-82aa-a6114fc0fd73\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-4p47t"
Apr 16 23:55:50.533678 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:55:50.533560 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f3068172-bd48-41a9-82aa-a6114fc0fd73-tmp\") pod \"openshift-lws-operator-bfc7f696d-4p47t\" (UID: \"f3068172-bd48-41a9-82aa-a6114fc0fd73\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-4p47t"
pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-4p47t" Apr 16 23:55:50.533870 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:55:50.533855 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f3068172-bd48-41a9-82aa-a6114fc0fd73-tmp\") pod \"openshift-lws-operator-bfc7f696d-4p47t\" (UID: \"f3068172-bd48-41a9-82aa-a6114fc0fd73\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-4p47t" Apr 16 23:55:50.543173 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:55:50.543156 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sch9j\" (UniqueName: \"kubernetes.io/projected/f3068172-bd48-41a9-82aa-a6114fc0fd73-kube-api-access-sch9j\") pod \"openshift-lws-operator-bfc7f696d-4p47t\" (UID: \"f3068172-bd48-41a9-82aa-a6114fc0fd73\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-4p47t" Apr 16 23:55:50.685039 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:55:50.685010 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-4p47t" Apr 16 23:55:50.797078 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:55:50.797056 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-4p47t"] Apr 16 23:55:50.799798 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:55:50.799768 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3068172_bd48_41a9_82aa_a6114fc0fd73.slice/crio-e21a6536c08d7129a4a8af2261eb70d273321a6eec421da597b10c60772bf5ab WatchSource:0}: Error finding container e21a6536c08d7129a4a8af2261eb70d273321a6eec421da597b10c60772bf5ab: Status 404 returned error can't find the container with id e21a6536c08d7129a4a8af2261eb70d273321a6eec421da597b10c60772bf5ab Apr 16 23:55:50.801156 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:55:50.801138 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 23:55:51.400512 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:55:51.400464 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-4p47t" event={"ID":"f3068172-bd48-41a9-82aa-a6114fc0fd73","Type":"ContainerStarted","Data":"e21a6536c08d7129a4a8af2261eb70d273321a6eec421da597b10c60772bf5ab"} Apr 16 23:55:53.407794 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:55:53.407754 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-4p47t" event={"ID":"f3068172-bd48-41a9-82aa-a6114fc0fd73","Type":"ContainerStarted","Data":"5581bc8411a9318e690c8c4617a969db45713349338e5b8cc12ba68f86f69d7b"} Apr 16 23:55:53.421933 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:55:53.421886 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-4p47t" podStartSLOduration=1.019157526 podStartE2EDuration="3.421871934s" podCreationTimestamp="2026-04-16 23:55:50 +0000 UTC" firstStartedPulling="2026-04-16 23:55:50.801282184 +0000 UTC m=+323.010046833" lastFinishedPulling="2026-04-16 23:55:53.203996592 +0000 UTC m=+325.412761241" observedRunningTime="2026-04-16 23:55:53.4214213 +0000 UTC m=+325.630185969" watchObservedRunningTime="2026-04-16 23:55:53.421871934 +0000 UTC m=+325.630636605" Apr 16 23:56:12.515733 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:12.515702 2562 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-bf54d8685-5pzhh"] Apr 16 23:56:12.519560 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:12.519543 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-5pzhh" Apr 16 23:56:12.521903 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:12.521873 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 16 23:56:12.521903 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:12.521876 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 16 23:56:12.522092 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:12.521920 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 16 23:56:12.522092 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:12.522010 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 16 23:56:12.522288 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:12.522207 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-nnn2n\"" Apr 16 23:56:12.531510 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:12.531491 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-bf54d8685-5pzhh"] Apr 16 23:56:12.608903 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:12.608877 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dabfacec-d089-4662-a3f0-dfad7cab3b53-webhook-cert\") pod \"opendatahub-operator-controller-manager-bf54d8685-5pzhh\" (UID: \"dabfacec-d089-4662-a3f0-dfad7cab3b53\") " pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-5pzhh" Apr 16 23:56:12.608999 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:12.608931 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dabfacec-d089-4662-a3f0-dfad7cab3b53-apiservice-cert\") pod \"opendatahub-operator-controller-manager-bf54d8685-5pzhh\" (UID: \"dabfacec-d089-4662-a3f0-dfad7cab3b53\") " pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-5pzhh" Apr 16 23:56:12.608999 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:12.608989 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhm5r\" (UniqueName: \"kubernetes.io/projected/dabfacec-d089-4662-a3f0-dfad7cab3b53-kube-api-access-rhm5r\") pod \"opendatahub-operator-controller-manager-bf54d8685-5pzhh\" (UID: \"dabfacec-d089-4662-a3f0-dfad7cab3b53\") " pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-5pzhh" Apr 16 23:56:12.710283 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:12.710249 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhm5r\" (UniqueName: \"kubernetes.io/projected/dabfacec-d089-4662-a3f0-dfad7cab3b53-kube-api-access-rhm5r\") pod \"opendatahub-operator-controller-manager-bf54d8685-5pzhh\" (UID: \"dabfacec-d089-4662-a3f0-dfad7cab3b53\") " 
pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-5pzhh" Apr 16 23:56:12.710385 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:12.710294 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dabfacec-d089-4662-a3f0-dfad7cab3b53-webhook-cert\") pod \"opendatahub-operator-controller-manager-bf54d8685-5pzhh\" (UID: \"dabfacec-d089-4662-a3f0-dfad7cab3b53\") " pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-5pzhh" Apr 16 23:56:12.710385 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:12.710337 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dabfacec-d089-4662-a3f0-dfad7cab3b53-apiservice-cert\") pod \"opendatahub-operator-controller-manager-bf54d8685-5pzhh\" (UID: \"dabfacec-d089-4662-a3f0-dfad7cab3b53\") " pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-5pzhh" Apr 16 23:56:12.712715 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:12.712691 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dabfacec-d089-4662-a3f0-dfad7cab3b53-apiservice-cert\") pod \"opendatahub-operator-controller-manager-bf54d8685-5pzhh\" (UID: \"dabfacec-d089-4662-a3f0-dfad7cab3b53\") " pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-5pzhh" Apr 16 23:56:12.712800 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:12.712716 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dabfacec-d089-4662-a3f0-dfad7cab3b53-webhook-cert\") pod \"opendatahub-operator-controller-manager-bf54d8685-5pzhh\" (UID: \"dabfacec-d089-4662-a3f0-dfad7cab3b53\") " pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-5pzhh" Apr 16 23:56:12.718272 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:12.718252 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhm5r\" (UniqueName: \"kubernetes.io/projected/dabfacec-d089-4662-a3f0-dfad7cab3b53-kube-api-access-rhm5r\") pod \"opendatahub-operator-controller-manager-bf54d8685-5pzhh\" (UID: \"dabfacec-d089-4662-a3f0-dfad7cab3b53\") " pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-5pzhh" Apr 16 23:56:12.829658 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:12.829606 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-5pzhh" Apr 16 23:56:12.951991 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:12.951952 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-bf54d8685-5pzhh"] Apr 16 23:56:12.955206 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:56:12.955160 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddabfacec_d089_4662_a3f0_dfad7cab3b53.slice/crio-30b68898a4f234f375902b945d83fbf3c6a40bed5619f9be0b87d0c53fbb7a67 WatchSource:0}: Error finding container 30b68898a4f234f375902b945d83fbf3c6a40bed5619f9be0b87d0c53fbb7a67: Status 404 returned error can't find the container with id 30b68898a4f234f375902b945d83fbf3c6a40bed5619f9be0b87d0c53fbb7a67 Apr 16 23:56:13.470581 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:13.470546 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-5pzhh" event={"ID":"dabfacec-d089-4662-a3f0-dfad7cab3b53","Type":"ContainerStarted","Data":"30b68898a4f234f375902b945d83fbf3c6a40bed5619f9be0b87d0c53fbb7a67"} Apr 16 23:56:15.479550 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:15.479515 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-5pzhh" event={"ID":"dabfacec-d089-4662-a3f0-dfad7cab3b53","Type":"ContainerStarted","Data":"bbd36394cb6435d85e34e1cb58d359ae96b8113abf6707ea73ea4e0beefc8ec1"} Apr 16 23:56:15.479924 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:15.479618 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-5pzhh" Apr 16 23:56:15.501341 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:15.501296 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-5pzhh" podStartSLOduration=1.102970864 podStartE2EDuration="3.50128407s" podCreationTimestamp="2026-04-16 23:56:12 +0000 UTC" firstStartedPulling="2026-04-16 23:56:12.957018358 +0000 UTC m=+345.165783007" lastFinishedPulling="2026-04-16 23:56:15.355331558 +0000 UTC m=+347.564096213" observedRunningTime="2026-04-16 23:56:15.499932251 +0000 UTC m=+347.708696921" watchObservedRunningTime="2026-04-16 23:56:15.50128407 +0000 UTC m=+347.710048740" Apr 16 23:56:26.484464 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:26.484388 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-bf54d8685-5pzhh" Apr 16 23:56:29.125677 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:29.125637 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-6b6988ccb7-qs2zh"] Apr 16 23:56:29.133205 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:29.133164 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6b6988ccb7-qs2zh" Apr 16 23:56:29.136339 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:29.136318 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 16 23:56:29.136339 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:29.136333 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-wl95d\"" Apr 16 23:56:29.136497 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:29.136334 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 16 23:56:29.136497 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:29.136382 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 16 23:56:29.145335 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:29.145313 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6b6988ccb7-qs2zh"] Apr 16 23:56:29.150282 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:29.150262 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/d01bbe12-b55d-461e-8495-1be44d06bbd3-manager-config\") pod \"lws-controller-manager-6b6988ccb7-qs2zh\" (UID: \"d01bbe12-b55d-461e-8495-1be44d06bbd3\") " pod="openshift-lws-operator/lws-controller-manager-6b6988ccb7-qs2zh" Apr 16 23:56:29.150374 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:29.150290 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d01bbe12-b55d-461e-8495-1be44d06bbd3-cert\") pod \"lws-controller-manager-6b6988ccb7-qs2zh\" (UID: \"d01bbe12-b55d-461e-8495-1be44d06bbd3\") " pod="openshift-lws-operator/lws-controller-manager-6b6988ccb7-qs2zh" Apr 16 23:56:29.150374 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:29.150321 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6t94\" (UniqueName: \"kubernetes.io/projected/d01bbe12-b55d-461e-8495-1be44d06bbd3-kube-api-access-f6t94\") pod \"lws-controller-manager-6b6988ccb7-qs2zh\" (UID: \"d01bbe12-b55d-461e-8495-1be44d06bbd3\") " pod="openshift-lws-operator/lws-controller-manager-6b6988ccb7-qs2zh" Apr 16 23:56:29.150446 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:29.150405 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/d01bbe12-b55d-461e-8495-1be44d06bbd3-metrics-cert\") pod \"lws-controller-manager-6b6988ccb7-qs2zh\" (UID: \"d01bbe12-b55d-461e-8495-1be44d06bbd3\") " pod="openshift-lws-operator/lws-controller-manager-6b6988ccb7-qs2zh" Apr 16 23:56:29.250847 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:29.250817 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/d01bbe12-b55d-461e-8495-1be44d06bbd3-manager-config\") pod \"lws-controller-manager-6b6988ccb7-qs2zh\" (UID: \"d01bbe12-b55d-461e-8495-1be44d06bbd3\") " pod="openshift-lws-operator/lws-controller-manager-6b6988ccb7-qs2zh" Apr 16 23:56:29.250973 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:29.250852 2562 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d01bbe12-b55d-461e-8495-1be44d06bbd3-cert\") pod \"lws-controller-manager-6b6988ccb7-qs2zh\" (UID: \"d01bbe12-b55d-461e-8495-1be44d06bbd3\") " pod="openshift-lws-operator/lws-controller-manager-6b6988ccb7-qs2zh" Apr 16 23:56:29.250973 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:29.250882 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f6t94\" (UniqueName: \"kubernetes.io/projected/d01bbe12-b55d-461e-8495-1be44d06bbd3-kube-api-access-f6t94\") pod \"lws-controller-manager-6b6988ccb7-qs2zh\" (UID: \"d01bbe12-b55d-461e-8495-1be44d06bbd3\") " pod="openshift-lws-operator/lws-controller-manager-6b6988ccb7-qs2zh" Apr 16 23:56:29.250973 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:29.250911 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/d01bbe12-b55d-461e-8495-1be44d06bbd3-metrics-cert\") pod \"lws-controller-manager-6b6988ccb7-qs2zh\" (UID: \"d01bbe12-b55d-461e-8495-1be44d06bbd3\") " pod="openshift-lws-operator/lws-controller-manager-6b6988ccb7-qs2zh" Apr 16 23:56:29.251526 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:29.251500 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/d01bbe12-b55d-461e-8495-1be44d06bbd3-manager-config\") pod \"lws-controller-manager-6b6988ccb7-qs2zh\" (UID: \"d01bbe12-b55d-461e-8495-1be44d06bbd3\") " pod="openshift-lws-operator/lws-controller-manager-6b6988ccb7-qs2zh" Apr 16 23:56:29.253275 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:29.253252 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/d01bbe12-b55d-461e-8495-1be44d06bbd3-metrics-cert\") pod \"lws-controller-manager-6b6988ccb7-qs2zh\" (UID: \"d01bbe12-b55d-461e-8495-1be44d06bbd3\") " pod="openshift-lws-operator/lws-controller-manager-6b6988ccb7-qs2zh" Apr 16 23:56:29.253392 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:29.253374 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d01bbe12-b55d-461e-8495-1be44d06bbd3-cert\") pod \"lws-controller-manager-6b6988ccb7-qs2zh\" (UID: \"d01bbe12-b55d-461e-8495-1be44d06bbd3\") " pod="openshift-lws-operator/lws-controller-manager-6b6988ccb7-qs2zh" Apr 16 23:56:29.259619 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:29.259601 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6t94\" (UniqueName: \"kubernetes.io/projected/d01bbe12-b55d-461e-8495-1be44d06bbd3-kube-api-access-f6t94\") pod \"lws-controller-manager-6b6988ccb7-qs2zh\" (UID: \"d01bbe12-b55d-461e-8495-1be44d06bbd3\") " pod="openshift-lws-operator/lws-controller-manager-6b6988ccb7-qs2zh" Apr 16 23:56:29.442999 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:29.442973 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6b6988ccb7-qs2zh" Apr 16 23:56:29.556751 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:29.556706 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6b6988ccb7-qs2zh"] Apr 16 23:56:29.559507 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:56:29.559485 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd01bbe12_b55d_461e_8495_1be44d06bbd3.slice/crio-836d0cfdfc3a02748770ea49524a04e08dcb607ef175608ef2928909b134bf16 WatchSource:0}: Error finding container 836d0cfdfc3a02748770ea49524a04e08dcb607ef175608ef2928909b134bf16: Status 404 returned error can't find the container with id 836d0cfdfc3a02748770ea49524a04e08dcb607ef175608ef2928909b134bf16 Apr 16 23:56:30.527482 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:30.527446 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6b6988ccb7-qs2zh" event={"ID":"d01bbe12-b55d-461e-8495-1be44d06bbd3","Type":"ContainerStarted","Data":"836d0cfdfc3a02748770ea49524a04e08dcb607ef175608ef2928909b134bf16"} Apr 16 23:56:31.532165 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:31.532132 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6b6988ccb7-qs2zh" event={"ID":"d01bbe12-b55d-461e-8495-1be44d06bbd3","Type":"ContainerStarted","Data":"c9ed9b442e03d664cfc45e75521405bec470c98f8d87c859b24d6b1e250b8af8"} Apr 16 23:56:31.532532 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:31.532178 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-6b6988ccb7-qs2zh" Apr 16 23:56:31.547062 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:31.547016 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-6b6988ccb7-qs2zh" podStartSLOduration=0.641504254 podStartE2EDuration="2.547001894s" podCreationTimestamp="2026-04-16 23:56:29 +0000 UTC" firstStartedPulling="2026-04-16 23:56:29.561382865 +0000 UTC m=+361.770147516" lastFinishedPulling="2026-04-16 23:56:31.466880505 +0000 UTC m=+363.675645156" observedRunningTime="2026-04-16 23:56:31.545841455 +0000 UTC m=+363.754606125" watchObservedRunningTime="2026-04-16 23:56:31.547001894 +0000 UTC m=+363.755766563" Apr 16 23:56:41.002479 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:41.002439 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-5cd78b4564-6xccd"] Apr 16 23:56:41.005574 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:41.005556 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-5cd78b4564-6xccd" Apr 16 23:56:41.008166 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:41.008138 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 16 23:56:41.008166 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:41.008154 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 23:56:41.008866 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:41.008848 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 16 23:56:41.008974 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:41.008876 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-chms8\"" Apr 16 23:56:41.013389 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:41.013371 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 23:56:41.031681 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:41.031659 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-5cd78b4564-6xccd"] Apr 16 23:56:41.045542 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:41.045522 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2b5911e9-0274-4fa8-b0b5-12d942461233-tmp\") pod \"kube-auth-proxy-5cd78b4564-6xccd\" (UID: \"2b5911e9-0274-4fa8-b0b5-12d942461233\") " pod="openshift-ingress/kube-auth-proxy-5cd78b4564-6xccd" Apr 16 23:56:41.045617 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:41.045583 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2b5911e9-0274-4fa8-b0b5-12d942461233-tls-certs\") pod \"kube-auth-proxy-5cd78b4564-6xccd\" (UID: \"2b5911e9-0274-4fa8-b0b5-12d942461233\") " pod="openshift-ingress/kube-auth-proxy-5cd78b4564-6xccd" Apr 16 23:56:41.045673 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:41.045634 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkbh6\" (UniqueName: \"kubernetes.io/projected/2b5911e9-0274-4fa8-b0b5-12d942461233-kube-api-access-kkbh6\") pod \"kube-auth-proxy-5cd78b4564-6xccd\" (UID: \"2b5911e9-0274-4fa8-b0b5-12d942461233\") " pod="openshift-ingress/kube-auth-proxy-5cd78b4564-6xccd" Apr 16 23:56:41.146995 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:41.146968 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2b5911e9-0274-4fa8-b0b5-12d942461233-tls-certs\") pod \"kube-auth-proxy-5cd78b4564-6xccd\" (UID: \"2b5911e9-0274-4fa8-b0b5-12d942461233\") " pod="openshift-ingress/kube-auth-proxy-5cd78b4564-6xccd" Apr 16 23:56:41.147132 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:41.147020 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kkbh6\" (UniqueName: \"kubernetes.io/projected/2b5911e9-0274-4fa8-b0b5-12d942461233-kube-api-access-kkbh6\") pod \"kube-auth-proxy-5cd78b4564-6xccd\" (UID: \"2b5911e9-0274-4fa8-b0b5-12d942461233\") " pod="openshift-ingress/kube-auth-proxy-5cd78b4564-6xccd" Apr 16 23:56:41.147132 ip-10-0-128-98 kubenswrapper[2562]: I0416 
23:56:41.147041 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2b5911e9-0274-4fa8-b0b5-12d942461233-tmp\") pod \"kube-auth-proxy-5cd78b4564-6xccd\" (UID: \"2b5911e9-0274-4fa8-b0b5-12d942461233\") " pod="openshift-ingress/kube-auth-proxy-5cd78b4564-6xccd" Apr 16 23:56:41.149146 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:41.149117 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2b5911e9-0274-4fa8-b0b5-12d942461233-tmp\") pod \"kube-auth-proxy-5cd78b4564-6xccd\" (UID: \"2b5911e9-0274-4fa8-b0b5-12d942461233\") " pod="openshift-ingress/kube-auth-proxy-5cd78b4564-6xccd" Apr 16 23:56:41.149398 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:41.149378 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2b5911e9-0274-4fa8-b0b5-12d942461233-tls-certs\") pod \"kube-auth-proxy-5cd78b4564-6xccd\" (UID: \"2b5911e9-0274-4fa8-b0b5-12d942461233\") " pod="openshift-ingress/kube-auth-proxy-5cd78b4564-6xccd" Apr 16 23:56:41.153618 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:41.153597 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkbh6\" (UniqueName: \"kubernetes.io/projected/2b5911e9-0274-4fa8-b0b5-12d942461233-kube-api-access-kkbh6\") pod \"kube-auth-proxy-5cd78b4564-6xccd\" (UID: \"2b5911e9-0274-4fa8-b0b5-12d942461233\") " pod="openshift-ingress/kube-auth-proxy-5cd78b4564-6xccd" Apr 16 23:56:41.315668 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:41.315593 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-5cd78b4564-6xccd" Apr 16 23:56:41.432075 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:41.432049 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-5cd78b4564-6xccd"] Apr 16 23:56:41.435251 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:56:41.435185 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b5911e9_0274_4fa8_b0b5_12d942461233.slice/crio-cdfa68c945f1cdd1f97f2ac69ea7069aadc32bc4f2215cde2b00c06ddf5f8887 WatchSource:0}: Error finding container cdfa68c945f1cdd1f97f2ac69ea7069aadc32bc4f2215cde2b00c06ddf5f8887: Status 404 returned error can't find the container with id cdfa68c945f1cdd1f97f2ac69ea7069aadc32bc4f2215cde2b00c06ddf5f8887 Apr 16 23:56:41.567258 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:41.567179 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-5cd78b4564-6xccd" event={"ID":"2b5911e9-0274-4fa8-b0b5-12d942461233","Type":"ContainerStarted","Data":"cdfa68c945f1cdd1f97f2ac69ea7069aadc32bc4f2215cde2b00c06ddf5f8887"} Apr 16 23:56:42.537963 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:42.537931 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-6b6988ccb7-qs2zh" Apr 16 23:56:44.578504 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:44.578415 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-5cd78b4564-6xccd" event={"ID":"2b5911e9-0274-4fa8-b0b5-12d942461233","Type":"ContainerStarted","Data":"cd90fdd9cd673fe4d60d78ea8de4fa940b33a6169ef7c31579554bf3ac59ea35"} Apr 16 23:56:44.594686 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:56:44.594634 2562 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-5cd78b4564-6xccd" podStartSLOduration=1.702711732 podStartE2EDuration="4.594621397s" podCreationTimestamp="2026-04-16 23:56:40 +0000 UTC" firstStartedPulling="2026-04-16 23:56:41.43704825 +0000 UTC m=+373.645812897" lastFinishedPulling="2026-04-16 23:56:44.328957915 +0000 UTC m=+376.537722562" observedRunningTime="2026-04-16 23:56:44.592044178 +0000 UTC m=+376.800808848" watchObservedRunningTime="2026-04-16 23:56:44.594621397 +0000 UTC m=+376.803386067" Apr 16 23:58:33.204045 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:58:33.204004 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zhlq6"] Apr 16 23:58:33.207453 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:58:33.207436 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zhlq6" Apr 16 23:58:33.211888 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:58:33.211865 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-6r82n\"" Apr 16 23:58:33.211888 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:58:33.211879 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 23:58:33.212049 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:58:33.211871 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 23:58:33.216526 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:58:33.216121 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zhlq6"] Apr 16 23:58:33.328708 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:58:33.328679 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh7tm\" (UniqueName: \"kubernetes.io/projected/4048cf65-31fc-4b21-869b-07bdf03f9acb-kube-api-access-wh7tm\") pod \"limitador-operator-controller-manager-85c4996f8c-zhlq6\" (UID: \"4048cf65-31fc-4b21-869b-07bdf03f9acb\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zhlq6" Apr 16 23:58:33.429523 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:58:33.429477 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wh7tm\" (UniqueName: \"kubernetes.io/projected/4048cf65-31fc-4b21-869b-07bdf03f9acb-kube-api-access-wh7tm\") pod \"limitador-operator-controller-manager-85c4996f8c-zhlq6\" (UID: \"4048cf65-31fc-4b21-869b-07bdf03f9acb\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zhlq6" Apr 16 23:58:33.440137 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:58:33.440115 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh7tm\" (UniqueName: \"kubernetes.io/projected/4048cf65-31fc-4b21-869b-07bdf03f9acb-kube-api-access-wh7tm\") pod \"limitador-operator-controller-manager-85c4996f8c-zhlq6\" (UID: \"4048cf65-31fc-4b21-869b-07bdf03f9acb\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zhlq6" Apr 16 23:58:33.520600 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:58:33.520539 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zhlq6" Apr 16 23:58:33.636857 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:58:33.636786 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zhlq6"] Apr 16 23:58:33.639236 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:58:33.639210 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4048cf65_31fc_4b21_869b_07bdf03f9acb.slice/crio-32974767ac94f2840e91bcc769c9ffb70d4bbbe85c202f3a3f409923dc695f46 WatchSource:0}: Error finding container 32974767ac94f2840e91bcc769c9ffb70d4bbbe85c202f3a3f409923dc695f46: Status 404 returned error can't find the container with id 32974767ac94f2840e91bcc769c9ffb70d4bbbe85c202f3a3f409923dc695f46 Apr 16 23:58:33.930779 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:58:33.930752 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zhlq6" event={"ID":"4048cf65-31fc-4b21-869b-07bdf03f9acb","Type":"ContainerStarted","Data":"32974767ac94f2840e91bcc769c9ffb70d4bbbe85c202f3a3f409923dc695f46"} Apr 16 23:58:35.938903 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:58:35.938866 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zhlq6" event={"ID":"4048cf65-31fc-4b21-869b-07bdf03f9acb","Type":"ContainerStarted","Data":"4ade27dfbf064f69bf2041fa77326e30b87a8ed719bb221d050ad1a4ef9b1750"} Apr 16 23:58:35.939320 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:58:35.938931 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zhlq6" Apr 16 23:58:35.953685 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:58:35.953642 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zhlq6" podStartSLOduration=1.446454478 podStartE2EDuration="2.953629291s" podCreationTimestamp="2026-04-16 23:58:33 +0000 UTC" firstStartedPulling="2026-04-16 23:58:33.641087233 +0000 UTC m=+485.849851881" lastFinishedPulling="2026-04-16 23:58:35.148262037 +0000 UTC m=+487.357026694" observedRunningTime="2026-04-16 23:58:35.951965661 +0000 UTC m=+488.160730330" watchObservedRunningTime="2026-04-16 23:58:35.953629291 +0000 UTC m=+488.162393961" Apr 16 23:58:46.944465 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:58:46.944429 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zhlq6" Apr 16 23:59:26.074821 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:26.074783 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-b5k6m"] Apr 16 23:59:26.078546 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:26.078523 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-b5k6m" Apr 16 23:59:26.080606 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:26.080582 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-q5x47\"" Apr 16 23:59:26.082107 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:26.082083 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-b5k6m"] Apr 16 23:59:26.131777 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:26.131755 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sls67\" (UniqueName: \"kubernetes.io/projected/01d3d21e-4a4d-4c30-9f39-43683ba3eb48-kube-api-access-sls67\") pod \"authorino-f99f4b5cd-b5k6m\" (UID: \"01d3d21e-4a4d-4c30-9f39-43683ba3eb48\") " pod="kuadrant-system/authorino-f99f4b5cd-b5k6m" Apr 16 23:59:26.200516 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:26.200492 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-ww67f"] Apr 16 23:59:26.204002 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:26.203988 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-ww67f" Apr 16 23:59:26.209202 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:26.209173 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-ww67f"] Apr 16 23:59:26.232361 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:26.232335 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sls67\" (UniqueName: \"kubernetes.io/projected/01d3d21e-4a4d-4c30-9f39-43683ba3eb48-kube-api-access-sls67\") pod \"authorino-f99f4b5cd-b5k6m\" (UID: \"01d3d21e-4a4d-4c30-9f39-43683ba3eb48\") " pod="kuadrant-system/authorino-f99f4b5cd-b5k6m" Apr 16 23:59:26.239350 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:26.239329 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sls67\" (UniqueName: \"kubernetes.io/projected/01d3d21e-4a4d-4c30-9f39-43683ba3eb48-kube-api-access-sls67\") pod \"authorino-f99f4b5cd-b5k6m\" (UID: \"01d3d21e-4a4d-4c30-9f39-43683ba3eb48\") " pod="kuadrant-system/authorino-f99f4b5cd-b5k6m" Apr 16 23:59:26.333542 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:26.333480 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd9zj\" (UniqueName: \"kubernetes.io/projected/b48480e3-8a01-4791-9071-7570800d2982-kube-api-access-hd9zj\") pod \"authorino-7498df8756-ww67f\" (UID: \"b48480e3-8a01-4791-9071-7570800d2982\") " pod="kuadrant-system/authorino-7498df8756-ww67f" Apr 16 23:59:26.388790 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:26.388769 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-b5k6m" Apr 16 23:59:26.435020 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:26.434981 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hd9zj\" (UniqueName: \"kubernetes.io/projected/b48480e3-8a01-4791-9071-7570800d2982-kube-api-access-hd9zj\") pod \"authorino-7498df8756-ww67f\" (UID: \"b48480e3-8a01-4791-9071-7570800d2982\") " pod="kuadrant-system/authorino-7498df8756-ww67f" Apr 16 23:59:26.442174 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:26.442145 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd9zj\" (UniqueName: \"kubernetes.io/projected/b48480e3-8a01-4791-9071-7570800d2982-kube-api-access-hd9zj\") pod \"authorino-7498df8756-ww67f\" (UID: \"b48480e3-8a01-4791-9071-7570800d2982\") " pod="kuadrant-system/authorino-7498df8756-ww67f" Apr 16 23:59:26.504317 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:26.504261 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-b5k6m"] Apr 16 23:59:26.506835 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:59:26.506792 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01d3d21e_4a4d_4c30_9f39_43683ba3eb48.slice/crio-c319e9c2d8717a72720ed711192c862794eb8e809820957b8f572ced9937ca45 WatchSource:0}: Error finding container c319e9c2d8717a72720ed711192c862794eb8e809820957b8f572ced9937ca45: Status 404 returned error can't find the container with id c319e9c2d8717a72720ed711192c862794eb8e809820957b8f572ced9937ca45 Apr 16 23:59:26.512945 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:26.512924 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-ww67f" Apr 16 23:59:26.631749 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:26.631726 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-ww67f"] Apr 16 23:59:26.633900 ip-10-0-128-98 kubenswrapper[2562]: W0416 23:59:26.633870 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb48480e3_8a01_4791_9071_7570800d2982.slice/crio-e68adfda4bb4662f666b058251d16c07ccd2a6b06aa146e3cb09546fe235a966 WatchSource:0}: Error finding container e68adfda4bb4662f666b058251d16c07ccd2a6b06aa146e3cb09546fe235a966: Status 404 returned error can't find the container with id e68adfda4bb4662f666b058251d16c07ccd2a6b06aa146e3cb09546fe235a966 Apr 16 23:59:27.107922 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:27.107817 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-b5k6m" event={"ID":"01d3d21e-4a4d-4c30-9f39-43683ba3eb48","Type":"ContainerStarted","Data":"c319e9c2d8717a72720ed711192c862794eb8e809820957b8f572ced9937ca45"} Apr 16 23:59:27.109542 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:27.109495 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-ww67f" event={"ID":"b48480e3-8a01-4791-9071-7570800d2982","Type":"ContainerStarted","Data":"e68adfda4bb4662f666b058251d16c07ccd2a6b06aa146e3cb09546fe235a966"} Apr 16 23:59:30.128003 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:30.127931 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-ww67f" event={"ID":"b48480e3-8a01-4791-9071-7570800d2982","Type":"ContainerStarted","Data":"bcd66b4f10915192f68517e91936156ec1e4b263e2b4eb558fecce940c1ca4e7"} Apr 16 23:59:30.130079 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:30.130048 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-b5k6m" event={"ID":"01d3d21e-4a4d-4c30-9f39-43683ba3eb48","Type":"ContainerStarted","Data":"c616c5ff4f49e0e505b29aed4c52cd6e17c99ddd715be8b95dde29e85cae0900"} Apr 16 23:59:30.147735 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:30.147677 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-ww67f" podStartSLOduration=0.783297194 podStartE2EDuration="4.147656517s" podCreationTimestamp="2026-04-16 23:59:26 +0000 UTC" firstStartedPulling="2026-04-16 23:59:26.635137686 +0000 UTC m=+538.843902335" lastFinishedPulling="2026-04-16 23:59:29.999497007 +0000 UTC m=+542.208261658" observedRunningTime="2026-04-16 23:59:30.146338183 +0000 UTC m=+542.355102853" watchObservedRunningTime="2026-04-16 23:59:30.147656517 +0000 UTC m=+542.356421254" Apr 16 23:59:30.159311 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:30.159258 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-b5k6m" podStartSLOduration=0.656745964 podStartE2EDuration="4.1592418s" podCreationTimestamp="2026-04-16 23:59:26 +0000 UTC" firstStartedPulling="2026-04-16 23:59:26.508071134 +0000 UTC m=+538.716835783" lastFinishedPulling="2026-04-16 23:59:30.010566955 +0000 UTC m=+542.219331619" observedRunningTime="2026-04-16 23:59:30.157817717 +0000 UTC m=+542.366582387" watchObservedRunningTime="2026-04-16 23:59:30.1592418 +0000 UTC m=+542.368006469" Apr 16 23:59:30.180654 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:30.180627 2562 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-b5k6m"] Apr 16 23:59:32.137132 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:32.137090 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-b5k6m" podUID="01d3d21e-4a4d-4c30-9f39-43683ba3eb48" containerName="authorino" containerID="cri-o://c616c5ff4f49e0e505b29aed4c52cd6e17c99ddd715be8b95dde29e85cae0900" gracePeriod=30 Apr 16 23:59:32.383136 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:32.383115 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-b5k6m" Apr 16 23:59:32.486784 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:32.486749 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sls67\" (UniqueName: \"kubernetes.io/projected/01d3d21e-4a4d-4c30-9f39-43683ba3eb48-kube-api-access-sls67\") pod \"01d3d21e-4a4d-4c30-9f39-43683ba3eb48\" (UID: \"01d3d21e-4a4d-4c30-9f39-43683ba3eb48\") " Apr 16 23:59:32.488839 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:32.488808 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01d3d21e-4a4d-4c30-9f39-43683ba3eb48-kube-api-access-sls67" (OuterVolumeSpecName: "kube-api-access-sls67") pod "01d3d21e-4a4d-4c30-9f39-43683ba3eb48" (UID: "01d3d21e-4a4d-4c30-9f39-43683ba3eb48"). InnerVolumeSpecName "kube-api-access-sls67". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:59:32.587665 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:32.587640 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sls67\" (UniqueName: \"kubernetes.io/projected/01d3d21e-4a4d-4c30-9f39-43683ba3eb48-kube-api-access-sls67\") on node \"ip-10-0-128-98.ec2.internal\" DevicePath \"\"" Apr 16 23:59:33.141610 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:33.141570 2562 generic.go:358] "Generic (PLEG): container finished" podID="01d3d21e-4a4d-4c30-9f39-43683ba3eb48" containerID="c616c5ff4f49e0e505b29aed4c52cd6e17c99ddd715be8b95dde29e85cae0900" exitCode=0 Apr 16 23:59:33.141971 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:33.141626 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-b5k6m" Apr 16 23:59:33.141971 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:33.141656 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-b5k6m" event={"ID":"01d3d21e-4a4d-4c30-9f39-43683ba3eb48","Type":"ContainerDied","Data":"c616c5ff4f49e0e505b29aed4c52cd6e17c99ddd715be8b95dde29e85cae0900"} Apr 16 23:59:33.141971 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:33.141696 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-b5k6m" event={"ID":"01d3d21e-4a4d-4c30-9f39-43683ba3eb48","Type":"ContainerDied","Data":"c319e9c2d8717a72720ed711192c862794eb8e809820957b8f572ced9937ca45"} Apr 16 23:59:33.141971 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:33.141715 2562 scope.go:117] "RemoveContainer" containerID="c616c5ff4f49e0e505b29aed4c52cd6e17c99ddd715be8b95dde29e85cae0900" Apr 16 23:59:33.150627 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:33.150612 2562 scope.go:117] "RemoveContainer" containerID="c616c5ff4f49e0e505b29aed4c52cd6e17c99ddd715be8b95dde29e85cae0900" Apr 16 23:59:33.150876 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:59:33.150858 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c616c5ff4f49e0e505b29aed4c52cd6e17c99ddd715be8b95dde29e85cae0900\": container with ID starting with c616c5ff4f49e0e505b29aed4c52cd6e17c99ddd715be8b95dde29e85cae0900 not found: ID does not exist" containerID="c616c5ff4f49e0e505b29aed4c52cd6e17c99ddd715be8b95dde29e85cae0900" Apr 16 23:59:33.150935 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:33.150883 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c616c5ff4f49e0e505b29aed4c52cd6e17c99ddd715be8b95dde29e85cae0900"} err="failed to get container status \"c616c5ff4f49e0e505b29aed4c52cd6e17c99ddd715be8b95dde29e85cae0900\": rpc error: code = NotFound desc = could not find container \"c616c5ff4f49e0e505b29aed4c52cd6e17c99ddd715be8b95dde29e85cae0900\": container with ID starting with c616c5ff4f49e0e505b29aed4c52cd6e17c99ddd715be8b95dde29e85cae0900 not found: ID does not exist" Apr 16 23:59:33.161006 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:33.160985 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-b5k6m"] Apr 16 23:59:33.163948 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:33.163929 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-b5k6m"] Apr 16 23:59:34.353293 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:34.353263 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01d3d21e-4a4d-4c30-9f39-43683ba3eb48" path="/var/lib/kubelet/pods/01d3d21e-4a4d-4c30-9f39-43683ba3eb48/volumes" Apr 16 23:59:56.281354 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:56.281325 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-9d4s2"] Apr 16 23:59:56.281943 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:56.281849 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01d3d21e-4a4d-4c30-9f39-43683ba3eb48" containerName="authorino" Apr 16 23:59:56.281943 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:56.281870 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="01d3d21e-4a4d-4c30-9f39-43683ba3eb48" containerName="authorino" Apr 16 23:59:56.282064 ip-10-0-128-98 kubenswrapper[2562]: I0416 
23:59:56.281974 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="01d3d21e-4a4d-4c30-9f39-43683ba3eb48" containerName="authorino" Apr 16 23:59:56.286832 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:56.286812 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-9d4s2" Apr 16 23:59:56.291926 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:56.291905 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-9d4s2"] Apr 16 23:59:56.465167 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:56.465134 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw28m\" (UniqueName: \"kubernetes.io/projected/30c4adab-ad0b-4f11-9579-bbd061287622-kube-api-access-pw28m\") pod \"authorino-8b475cf9f-9d4s2\" (UID: \"30c4adab-ad0b-4f11-9579-bbd061287622\") " pod="kuadrant-system/authorino-8b475cf9f-9d4s2" Apr 16 23:59:56.543097 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:56.543029 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-9d4s2"] Apr 16 23:59:56.543258 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:59:56.543239 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-pw28m], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-8b475cf9f-9d4s2" podUID="30c4adab-ad0b-4f11-9579-bbd061287622" Apr 16 23:59:56.566349 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:56.566325 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pw28m\" (UniqueName: \"kubernetes.io/projected/30c4adab-ad0b-4f11-9579-bbd061287622-kube-api-access-pw28m\") pod \"authorino-8b475cf9f-9d4s2\" (UID: \"30c4adab-ad0b-4f11-9579-bbd061287622\") " pod="kuadrant-system/authorino-8b475cf9f-9d4s2" Apr 16 23:59:56.579631 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:56.579601 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw28m\" (UniqueName: \"kubernetes.io/projected/30c4adab-ad0b-4f11-9579-bbd061287622-kube-api-access-pw28m\") pod \"authorino-8b475cf9f-9d4s2\" (UID: \"30c4adab-ad0b-4f11-9579-bbd061287622\") " pod="kuadrant-system/authorino-8b475cf9f-9d4s2" Apr 16 23:59:56.587920 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:56.587901 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7cf74bb495-src5m"] Apr 16 23:59:56.591085 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:56.591070 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7cf74bb495-src5m" Apr 16 23:59:56.605970 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:56.605951 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7cf74bb495-src5m"] Apr 16 23:59:56.768300 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:56.768268 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b76wk\" (UniqueName: \"kubernetes.io/projected/fa3b3deb-a649-4f02-969a-37049dfd1a5c-kube-api-access-b76wk\") pod \"authorino-7cf74bb495-src5m\" (UID: \"fa3b3deb-a649-4f02-969a-37049dfd1a5c\") " pod="kuadrant-system/authorino-7cf74bb495-src5m" Apr 16 23:59:56.783246 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:56.783219 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7cf74bb495-src5m"] Apr 16 23:59:56.783411 ip-10-0-128-98 kubenswrapper[2562]: E0416 23:59:56.783391 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-b76wk], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-7cf74bb495-src5m" podUID="fa3b3deb-a649-4f02-969a-37049dfd1a5c" Apr 16 23:59:56.869597 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:56.869535 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b76wk\" (UniqueName: \"kubernetes.io/projected/fa3b3deb-a649-4f02-969a-37049dfd1a5c-kube-api-access-b76wk\") pod \"authorino-7cf74bb495-src5m\" (UID: \"fa3b3deb-a649-4f02-969a-37049dfd1a5c\") " pod="kuadrant-system/authorino-7cf74bb495-src5m" Apr 16 23:59:56.877774 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:56.877745 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b76wk\" (UniqueName: \"kubernetes.io/projected/fa3b3deb-a649-4f02-969a-37049dfd1a5c-kube-api-access-b76wk\") pod \"authorino-7cf74bb495-src5m\" (UID: \"fa3b3deb-a649-4f02-969a-37049dfd1a5c\") " pod="kuadrant-system/authorino-7cf74bb495-src5m" Apr 16 23:59:57.224706 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:57.224675 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7cf74bb495-src5m" Apr 16 23:59:57.224875 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:57.224675 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-9d4s2" Apr 16 23:59:57.229659 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:57.229641 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7cf74bb495-src5m" Apr 16 23:59:57.232761 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:57.232743 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-9d4s2" Apr 16 23:59:57.373930 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:57.373904 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw28m\" (UniqueName: \"kubernetes.io/projected/30c4adab-ad0b-4f11-9579-bbd061287622-kube-api-access-pw28m\") pod \"30c4adab-ad0b-4f11-9579-bbd061287622\" (UID: \"30c4adab-ad0b-4f11-9579-bbd061287622\") " Apr 16 23:59:57.374325 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:57.373954 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b76wk\" (UniqueName: \"kubernetes.io/projected/fa3b3deb-a649-4f02-969a-37049dfd1a5c-kube-api-access-b76wk\") pod \"fa3b3deb-a649-4f02-969a-37049dfd1a5c\" (UID: \"fa3b3deb-a649-4f02-969a-37049dfd1a5c\") " Apr 16 23:59:57.375949 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:57.375928 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30c4adab-ad0b-4f11-9579-bbd061287622-kube-api-access-pw28m" (OuterVolumeSpecName: "kube-api-access-pw28m") pod "30c4adab-ad0b-4f11-9579-bbd061287622" (UID: "30c4adab-ad0b-4f11-9579-bbd061287622"). InnerVolumeSpecName "kube-api-access-pw28m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:59:57.376034 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:57.375961 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa3b3deb-a649-4f02-969a-37049dfd1a5c-kube-api-access-b76wk" (OuterVolumeSpecName: "kube-api-access-b76wk") pod "fa3b3deb-a649-4f02-969a-37049dfd1a5c" (UID: "fa3b3deb-a649-4f02-969a-37049dfd1a5c"). InnerVolumeSpecName "kube-api-access-b76wk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:59:57.475410 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:57.475351 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pw28m\" (UniqueName: \"kubernetes.io/projected/30c4adab-ad0b-4f11-9579-bbd061287622-kube-api-access-pw28m\") on node \"ip-10-0-128-98.ec2.internal\" DevicePath \"\"" Apr 16 23:59:57.475410 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:57.475380 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b76wk\" (UniqueName: \"kubernetes.io/projected/fa3b3deb-a649-4f02-969a-37049dfd1a5c-kube-api-access-b76wk\") on node \"ip-10-0-128-98.ec2.internal\" DevicePath \"\"" Apr 16 23:59:58.228074 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:58.228045 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7cf74bb495-src5m" Apr 16 23:59:58.228259 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:58.228045 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-9d4s2" Apr 16 23:59:58.255549 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:58.255524 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7cf74bb495-src5m"] Apr 16 23:59:58.257731 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:58.257708 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7cf74bb495-src5m"] Apr 16 23:59:58.275328 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:58.275298 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-9d4s2"] Apr 16 23:59:58.276943 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:58.276924 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-9d4s2"] Apr 16 23:59:58.359767 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:58.359739 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30c4adab-ad0b-4f11-9579-bbd061287622" path="/var/lib/kubelet/pods/30c4adab-ad0b-4f11-9579-bbd061287622/volumes" Apr 16 23:59:58.359961 ip-10-0-128-98 kubenswrapper[2562]: I0416 23:59:58.359949 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa3b3deb-a649-4f02-969a-37049dfd1a5c" path="/var/lib/kubelet/pods/fa3b3deb-a649-4f02-969a-37049dfd1a5c/volumes" Apr 17 00:00:00.000045 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:00.000015 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-ww67f"] Apr 17 00:00:00.000444 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:00.000225 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-ww67f" podUID="b48480e3-8a01-4791-9071-7570800d2982" containerName="authorino" containerID="cri-o://bcd66b4f10915192f68517e91936156ec1e4b263e2b4eb558fecce940c1ca4e7" gracePeriod=30 Apr 17 00:00:00.142472 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:00.142436 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29606400-dmsdh"] Apr 17 00:00:00.146547 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:00.146514 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29606400-dmsdh" Apr 17 00:00:00.149015 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:00.148993 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-6fjhg\"" Apr 17 00:00:00.157659 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:00.157635 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606400-dmsdh"] Apr 17 00:00:00.175075 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:00.175051 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29606400-kq2wv"] Apr 17 00:00:00.181706 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:00.181686 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29606400-kq2wv" Apr 17 00:00:00.184359 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:00.184339 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"pruner-dockercfg-xb46j\"" Apr 17 00:00:00.184458 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:00.184338 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"serviceca\"" Apr 17 00:00:00.201416 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:00.201393 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29606400-kq2wv"] Apr 17 00:00:00.235777 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:00.235753 2562 generic.go:358] "Generic (PLEG): container finished" podID="b48480e3-8a01-4791-9071-7570800d2982" containerID="bcd66b4f10915192f68517e91936156ec1e4b263e2b4eb558fecce940c1ca4e7" exitCode=0 Apr 17 00:00:00.235878 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:00.235801 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-ww67f" event={"ID":"b48480e3-8a01-4791-9071-7570800d2982","Type":"ContainerDied","Data":"bcd66b4f10915192f68517e91936156ec1e4b263e2b4eb558fecce940c1ca4e7"} Apr 17 00:00:00.247896 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:00.247881 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-ww67f" Apr 17 00:00:00.298429 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:00.298371 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtvjz\" (UniqueName: \"kubernetes.io/projected/d9028d25-1c06-432c-b3ac-b1bb4670b579-kube-api-access-rtvjz\") pod \"image-pruner-29606400-kq2wv\" (UID: \"d9028d25-1c06-432c-b3ac-b1bb4670b579\") " pod="openshift-image-registry/image-pruner-29606400-kq2wv" Apr 17 00:00:00.298534 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:00.298458 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d9028d25-1c06-432c-b3ac-b1bb4670b579-serviceca\") pod \"image-pruner-29606400-kq2wv\" (UID: \"d9028d25-1c06-432c-b3ac-b1bb4670b579\") " pod="openshift-image-registry/image-pruner-29606400-kq2wv" Apr 17 00:00:00.298534 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:00.298516 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxzwc\" (UniqueName: \"kubernetes.io/projected/6322aaa2-fb86-4b69-b178-717447df7868-kube-api-access-jxzwc\") pod \"maas-api-key-cleanup-29606400-dmsdh\" (UID: \"6322aaa2-fb86-4b69-b178-717447df7868\") " pod="opendatahub/maas-api-key-cleanup-29606400-dmsdh" Apr 17 00:00:00.399020 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:00.398996 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hd9zj\" (UniqueName: \"kubernetes.io/projected/b48480e3-8a01-4791-9071-7570800d2982-kube-api-access-hd9zj\") pod \"b48480e3-8a01-4791-9071-7570800d2982\" (UID: \"b48480e3-8a01-4791-9071-7570800d2982\") " Apr 17 00:00:00.399136 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:00.399101 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d9028d25-1c06-432c-b3ac-b1bb4670b579-serviceca\") pod \"image-pruner-29606400-kq2wv\" (UID: 
\"d9028d25-1c06-432c-b3ac-b1bb4670b579\") " pod="openshift-image-registry/image-pruner-29606400-kq2wv" Apr 17 00:00:00.399215 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:00.399137 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jxzwc\" (UniqueName: \"kubernetes.io/projected/6322aaa2-fb86-4b69-b178-717447df7868-kube-api-access-jxzwc\") pod \"maas-api-key-cleanup-29606400-dmsdh\" (UID: \"6322aaa2-fb86-4b69-b178-717447df7868\") " pod="opendatahub/maas-api-key-cleanup-29606400-dmsdh" Apr 17 00:00:00.399215 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:00.399162 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rtvjz\" (UniqueName: \"kubernetes.io/projected/d9028d25-1c06-432c-b3ac-b1bb4670b579-kube-api-access-rtvjz\") pod \"image-pruner-29606400-kq2wv\" (UID: \"d9028d25-1c06-432c-b3ac-b1bb4670b579\") " pod="openshift-image-registry/image-pruner-29606400-kq2wv" Apr 17 00:00:00.399712 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:00.399692 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d9028d25-1c06-432c-b3ac-b1bb4670b579-serviceca\") pod \"image-pruner-29606400-kq2wv\" (UID: \"d9028d25-1c06-432c-b3ac-b1bb4670b579\") " pod="openshift-image-registry/image-pruner-29606400-kq2wv" Apr 17 00:00:00.400872 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:00.400851 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b48480e3-8a01-4791-9071-7570800d2982-kube-api-access-hd9zj" (OuterVolumeSpecName: "kube-api-access-hd9zj") pod "b48480e3-8a01-4791-9071-7570800d2982" (UID: "b48480e3-8a01-4791-9071-7570800d2982"). InnerVolumeSpecName "kube-api-access-hd9zj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 00:00:00.406175 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:00.406150 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxzwc\" (UniqueName: \"kubernetes.io/projected/6322aaa2-fb86-4b69-b178-717447df7868-kube-api-access-jxzwc\") pod \"maas-api-key-cleanup-29606400-dmsdh\" (UID: \"6322aaa2-fb86-4b69-b178-717447df7868\") " pod="opendatahub/maas-api-key-cleanup-29606400-dmsdh" Apr 17 00:00:00.406278 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:00.406252 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtvjz\" (UniqueName: \"kubernetes.io/projected/d9028d25-1c06-432c-b3ac-b1bb4670b579-kube-api-access-rtvjz\") pod \"image-pruner-29606400-kq2wv\" (UID: \"d9028d25-1c06-432c-b3ac-b1bb4670b579\") " pod="openshift-image-registry/image-pruner-29606400-kq2wv" Apr 17 00:00:00.458945 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:00.458924 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29606400-dmsdh" Apr 17 00:00:00.493710 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:00.493675 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29606400-kq2wv" Apr 17 00:00:00.500829 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:00.500803 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hd9zj\" (UniqueName: \"kubernetes.io/projected/b48480e3-8a01-4791-9071-7570800d2982-kube-api-access-hd9zj\") on node \"ip-10-0-128-98.ec2.internal\" DevicePath \"\"" Apr 17 00:00:00.620099 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:00.620068 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29606400-kq2wv"] Apr 17 00:00:00.623294 ip-10-0-128-98 kubenswrapper[2562]: W0417 00:00:00.623266 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9028d25_1c06_432c_b3ac_b1bb4670b579.slice/crio-fb6bf719d2231dd002266fba9200608c1b4956a4012419e607967e3e6cf2a41c WatchSource:0}: Error finding container fb6bf719d2231dd002266fba9200608c1b4956a4012419e607967e3e6cf2a41c: Status 404 returned error can't find the container with id fb6bf719d2231dd002266fba9200608c1b4956a4012419e607967e3e6cf2a41c Apr 17 00:00:00.789335 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:00.789295 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606400-dmsdh"] Apr 17 00:00:01.240823 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:01.240736 2562 generic.go:358] "Generic (PLEG): container finished" podID="d9028d25-1c06-432c-b3ac-b1bb4670b579" containerID="370f2227b1be5825f97772e908651fbd53fe6f9b75133e6d901f43aaee480f79" exitCode=0 Apr 17 00:00:01.240823 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:01.240810 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29606400-kq2wv" event={"ID":"d9028d25-1c06-432c-b3ac-b1bb4670b579","Type":"ContainerDied","Data":"370f2227b1be5825f97772e908651fbd53fe6f9b75133e6d901f43aaee480f79"} Apr 17 00:00:01.241253 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:01.240839 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29606400-kq2wv" event={"ID":"d9028d25-1c06-432c-b3ac-b1bb4670b579","Type":"ContainerStarted","Data":"fb6bf719d2231dd002266fba9200608c1b4956a4012419e607967e3e6cf2a41c"} Apr 17 00:00:01.241857 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:01.241826 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606400-dmsdh" event={"ID":"6322aaa2-fb86-4b69-b178-717447df7868","Type":"ContainerStarted","Data":"2dc13d9bc4b434ebdb214c27f4ded57842b3d52a875cf9dcd3c681be82daecce"} Apr 17 00:00:01.242820 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:01.242802 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-ww67f" event={"ID":"b48480e3-8a01-4791-9071-7570800d2982","Type":"ContainerDied","Data":"e68adfda4bb4662f666b058251d16c07ccd2a6b06aa146e3cb09546fe235a966"} Apr 17 00:00:01.242820 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:01.242812 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-ww67f" Apr 17 00:00:01.242820 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:01.242829 2562 scope.go:117] "RemoveContainer" containerID="bcd66b4f10915192f68517e91936156ec1e4b263e2b4eb558fecce940c1ca4e7" Apr 17 00:00:01.269036 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:01.269001 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-ww67f"] Apr 17 00:00:01.269914 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:01.269890 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-ww67f"] Apr 17 00:00:02.353625 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:02.353590 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b48480e3-8a01-4791-9071-7570800d2982" path="/var/lib/kubelet/pods/b48480e3-8a01-4791-9071-7570800d2982/volumes" Apr 17 00:00:02.375235 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:02.375215 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29606400-kq2wv" Apr 17 00:00:02.417309 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:02.417278 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d9028d25-1c06-432c-b3ac-b1bb4670b579-serviceca\") pod \"d9028d25-1c06-432c-b3ac-b1bb4670b579\" (UID: \"d9028d25-1c06-432c-b3ac-b1bb4670b579\") " Apr 17 00:00:02.417461 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:02.417319 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtvjz\" (UniqueName: \"kubernetes.io/projected/d9028d25-1c06-432c-b3ac-b1bb4670b579-kube-api-access-rtvjz\") pod \"d9028d25-1c06-432c-b3ac-b1bb4670b579\" (UID: \"d9028d25-1c06-432c-b3ac-b1bb4670b579\") " Apr 17 00:00:02.417632 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:02.417602 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9028d25-1c06-432c-b3ac-b1bb4670b579-serviceca" (OuterVolumeSpecName: "serviceca") pod "d9028d25-1c06-432c-b3ac-b1bb4670b579" (UID: "d9028d25-1c06-432c-b3ac-b1bb4670b579"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 00:00:02.419350 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:02.419328 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9028d25-1c06-432c-b3ac-b1bb4670b579-kube-api-access-rtvjz" (OuterVolumeSpecName: "kube-api-access-rtvjz") pod "d9028d25-1c06-432c-b3ac-b1bb4670b579" (UID: "d9028d25-1c06-432c-b3ac-b1bb4670b579"). InnerVolumeSpecName "kube-api-access-rtvjz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 00:00:02.517943 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:02.517868 2562 reconciler_common.go:299] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d9028d25-1c06-432c-b3ac-b1bb4670b579-serviceca\") on node \"ip-10-0-128-98.ec2.internal\" DevicePath \"\"" Apr 17 00:00:02.517943 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:02.517897 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rtvjz\" (UniqueName: \"kubernetes.io/projected/d9028d25-1c06-432c-b3ac-b1bb4670b579-kube-api-access-rtvjz\") on node \"ip-10-0-128-98.ec2.internal\" DevicePath \"\"" Apr 17 00:00:03.251571 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:03.251537 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29606400-kq2wv" event={"ID":"d9028d25-1c06-432c-b3ac-b1bb4670b579","Type":"ContainerDied","Data":"fb6bf719d2231dd002266fba9200608c1b4956a4012419e607967e3e6cf2a41c"} Apr 17 00:00:03.251571 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:03.251570 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb6bf719d2231dd002266fba9200608c1b4956a4012419e607967e3e6cf2a41c" Apr 17 00:00:03.251766 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:03.251580 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29606400-kq2wv" Apr 17 00:00:28.287217 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:28.287173 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrpzx_5fddaa09-016b-490d-a9ab-9d50ff167b22/console-operator/2.log" Apr 17 00:00:28.288437 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:28.288418 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrpzx_5fddaa09-016b-490d-a9ab-9d50ff167b22/console-operator/2.log" Apr 17 00:00:31.437137 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:31.437105 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-zcj96"] Apr 17 00:00:31.437560 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:31.437509 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b48480e3-8a01-4791-9071-7570800d2982" containerName="authorino" Apr 17 00:00:31.437560 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:31.437524 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="b48480e3-8a01-4791-9071-7570800d2982" containerName="authorino" Apr 17 00:00:31.437560 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:31.437534 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9028d25-1c06-432c-b3ac-b1bb4670b579" containerName="image-pruner" Apr 17 00:00:31.437560 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:31.437540 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9028d25-1c06-432c-b3ac-b1bb4670b579" containerName="image-pruner" Apr 17 00:00:31.437697 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:31.437633 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="d9028d25-1c06-432c-b3ac-b1bb4670b579" containerName="image-pruner" Apr 17 00:00:31.437697 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:31.437650 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="b48480e3-8a01-4791-9071-7570800d2982" containerName="authorino" Apr 17 00:00:31.442120 ip-10-0-128-98 
kubenswrapper[2562]: I0417 00:00:31.442104 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zcj96" Apr 17 00:00:31.444768 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:31.444745 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 17 00:00:31.444768 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:31.444763 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 17 00:00:31.444956 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:31.444747 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-g4992\"" Apr 17 00:00:31.444956 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:31.444795 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 17 00:00:31.449742 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:31.449446 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-zcj96"] Apr 17 00:00:31.543854 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:31.543821 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cd0aa85c-579b-4ee5-9346-65425ddde1b9-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-zcj96\" (UID: \"cd0aa85c-579b-4ee5-9346-65425ddde1b9\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zcj96" Apr 17 00:00:31.544032 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:31.543889 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cd0aa85c-579b-4ee5-9346-65425ddde1b9-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-zcj96\" (UID: \"cd0aa85c-579b-4ee5-9346-65425ddde1b9\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zcj96" Apr 17 00:00:31.544032 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:31.543924 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cd0aa85c-579b-4ee5-9346-65425ddde1b9-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-zcj96\" (UID: \"cd0aa85c-579b-4ee5-9346-65425ddde1b9\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zcj96" Apr 17 00:00:31.544032 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:31.543959 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cd0aa85c-579b-4ee5-9346-65425ddde1b9-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-zcj96\" (UID: \"cd0aa85c-579b-4ee5-9346-65425ddde1b9\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zcj96" Apr 17 00:00:31.544281 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:31.544034 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cd0aa85c-579b-4ee5-9346-65425ddde1b9-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-zcj96\" (UID: \"cd0aa85c-579b-4ee5-9346-65425ddde1b9\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zcj96" Apr 17 00:00:31.544281 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:31.544070 2562 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pz9q\" (UniqueName: \"kubernetes.io/projected/cd0aa85c-579b-4ee5-9346-65425ddde1b9-kube-api-access-6pz9q\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-zcj96\" (UID: \"cd0aa85c-579b-4ee5-9346-65425ddde1b9\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zcj96" Apr 17 00:00:31.644830 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:31.644792 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cd0aa85c-579b-4ee5-9346-65425ddde1b9-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-zcj96\" (UID: \"cd0aa85c-579b-4ee5-9346-65425ddde1b9\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zcj96" Apr 17 00:00:31.644830 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:31.644835 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cd0aa85c-579b-4ee5-9346-65425ddde1b9-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-zcj96\" (UID: \"cd0aa85c-579b-4ee5-9346-65425ddde1b9\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zcj96" Apr 17 00:00:31.645052 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:31.644871 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cd0aa85c-579b-4ee5-9346-65425ddde1b9-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-zcj96\" (UID: \"cd0aa85c-579b-4ee5-9346-65425ddde1b9\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zcj96" Apr 17 00:00:31.645052 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:31.644922 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cd0aa85c-579b-4ee5-9346-65425ddde1b9-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-zcj96\" (UID: \"cd0aa85c-579b-4ee5-9346-65425ddde1b9\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zcj96" Apr 17 00:00:31.645052 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:31.644958 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6pz9q\" (UniqueName: \"kubernetes.io/projected/cd0aa85c-579b-4ee5-9346-65425ddde1b9-kube-api-access-6pz9q\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-zcj96\" (UID: \"cd0aa85c-579b-4ee5-9346-65425ddde1b9\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zcj96" Apr 17 00:00:31.645052 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:31.645022 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cd0aa85c-579b-4ee5-9346-65425ddde1b9-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-zcj96\" (UID: \"cd0aa85c-579b-4ee5-9346-65425ddde1b9\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zcj96" Apr 17 00:00:31.645310 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:31.645288 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cd0aa85c-579b-4ee5-9346-65425ddde1b9-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-zcj96\" (UID: \"cd0aa85c-579b-4ee5-9346-65425ddde1b9\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zcj96" Apr 17 00:00:31.645370 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:31.645315 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cd0aa85c-579b-4ee5-9346-65425ddde1b9-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-zcj96\" (UID: \"cd0aa85c-579b-4ee5-9346-65425ddde1b9\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zcj96" Apr 17 00:00:31.645370 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:31.645343 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cd0aa85c-579b-4ee5-9346-65425ddde1b9-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-zcj96\" (UID: \"cd0aa85c-579b-4ee5-9346-65425ddde1b9\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zcj96" Apr 17 00:00:31.647177 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:31.647148 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cd0aa85c-579b-4ee5-9346-65425ddde1b9-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-zcj96\" (UID: \"cd0aa85c-579b-4ee5-9346-65425ddde1b9\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zcj96" Apr 17 00:00:31.647408 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:31.647393 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cd0aa85c-579b-4ee5-9346-65425ddde1b9-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-zcj96\" (UID: \"cd0aa85c-579b-4ee5-9346-65425ddde1b9\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zcj96" Apr 17 00:00:31.651640 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:31.651620 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pz9q\" (UniqueName: \"kubernetes.io/projected/cd0aa85c-579b-4ee5-9346-65425ddde1b9-kube-api-access-6pz9q\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-zcj96\" (UID: \"cd0aa85c-579b-4ee5-9346-65425ddde1b9\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zcj96" Apr 17 00:00:31.753069 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:31.752993 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zcj96" Apr 17 00:00:31.875397 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:31.875374 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-zcj96"] Apr 17 00:00:31.877650 ip-10-0-128-98 kubenswrapper[2562]: W0417 00:00:31.877623 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd0aa85c_579b_4ee5_9346_65425ddde1b9.slice/crio-376e5ba0dbe52e825b296e4d694c4492e514bbd5e432dc24317976e383cd2477 WatchSource:0}: Error finding container 376e5ba0dbe52e825b296e4d694c4492e514bbd5e432dc24317976e383cd2477: Status 404 returned error can't find the container with id 376e5ba0dbe52e825b296e4d694c4492e514bbd5e432dc24317976e383cd2477 Apr 17 00:00:32.354996 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:32.354960 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zcj96" event={"ID":"cd0aa85c-579b-4ee5-9346-65425ddde1b9","Type":"ContainerStarted","Data":"376e5ba0dbe52e825b296e4d694c4492e514bbd5e432dc24317976e383cd2477"} Apr 17 00:00:38.379794 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:38.379700 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zcj96" event={"ID":"cd0aa85c-579b-4ee5-9346-65425ddde1b9","Type":"ContainerStarted","Data":"6093a2bb2607c98bae8ea6b64bfc1fd17d5997d4b359cb00d7512c2f19c2644e"} Apr 17 00:00:44.398973 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:44.398939 2562 generic.go:358] "Generic (PLEG): container finished" podID="cd0aa85c-579b-4ee5-9346-65425ddde1b9" containerID="6093a2bb2607c98bae8ea6b64bfc1fd17d5997d4b359cb00d7512c2f19c2644e" exitCode=0 Apr 17 00:00:44.399416 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:44.399018 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zcj96" event={"ID":"cd0aa85c-579b-4ee5-9346-65425ddde1b9","Type":"ContainerDied","Data":"6093a2bb2607c98bae8ea6b64bfc1fd17d5997d4b359cb00d7512c2f19c2644e"} Apr 17 00:00:50.422389 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:50.422281 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zcj96" event={"ID":"cd0aa85c-579b-4ee5-9346-65425ddde1b9","Type":"ContainerStarted","Data":"d7f61435a2484d41e86aae9cdfe353ae9d22e029b85f9694df2c15e8e4e3db62"} Apr 17 00:00:50.433405 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:50.422495 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zcj96" Apr 17 00:00:50.439873 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:00:50.439828 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-zcj96" podStartSLOduration=1.5210556610000001 podStartE2EDuration="19.439815011s" podCreationTimestamp="2026-04-17 00:00:31 +0000 UTC" firstStartedPulling="2026-04-17 00:00:31.879335805 +0000 UTC m=+604.088100453" lastFinishedPulling="2026-04-17 00:00:49.798095151 +0000 UTC m=+622.006859803" observedRunningTime="2026-04-17 00:00:50.4393065 +0000 UTC m=+622.648071182" watchObservedRunningTime="2026-04-17 00:00:50.439815011 +0000 UTC m=+622.648579681" Apr 17 00:01:00.008525 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:00.008486 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606400-dmsdh"] Apr 
Apr 17 00:01:02.315379 ip-10-0-128-98 kubenswrapper[2562]: E0417 00:01:02.315277 2562 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: copying system image from manifest list: reading blob sha256:b1ed13c5ef0ac6dbcd255a5c1be9e3c9c2903872aa4ae5fa877850a48fdaee26: fetching blob: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" image="registry.redhat.io/ubi9/ubi-minimal:9.7"
Apr 17 00:01:02.315547 ip-10-0-128-98 kubenswrapper[2562]: E0417 00:01:02.315489 2562 kuberuntime_manager.go:1358] "Unhandled Error" err=<
Apr 17 00:01:02.315547 ip-10-0-128-98 kubenswrapper[2562]: container &Container{Name:cleanup,Image:registry.redhat.io/ubi9/ubi-minimal:9.7,Command:[/bin/sh -c curl -sf -X POST http://maas-api:8080/internal/v1/api-keys/cleanup
Apr 17 00:01:02.315547 ip-10-0-128-98 kubenswrapper[2562]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{33554432 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{16777216 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jxzwc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod maas-api-key-cleanup-29606400-dmsdh_opendatahub(6322aaa2-fb86-4b69-b178-717447df7868): ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: reading blob sha256:b1ed13c5ef0ac6dbcd255a5c1be9e3c9c2903872aa4ae5fa877850a48fdaee26: fetching blob: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image
Apr 17 00:01:02.315547 ip-10-0-128-98 kubenswrapper[2562]: > logger="UnhandledError"
Apr 17 00:01:02.316645 ip-10-0-128-98 kubenswrapper[2562]: E0417 00:01:02.316613 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: copying system image from manifest list: reading blob sha256:b1ed13c5ef0ac6dbcd255a5c1be9e3c9c2903872aa4ae5fa877850a48fdaee26: fetching blob: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="opendatahub/maas-api-key-cleanup-29606400-dmsdh" podUID="6322aaa2-fb86-4b69-b178-717447df7868"
Apr 17 00:01:02.585943 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:02.585922 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29606400-dmsdh"
Apr 17 00:01:02.607971 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:02.607949 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxzwc\" (UniqueName: \"kubernetes.io/projected/6322aaa2-fb86-4b69-b178-717447df7868-kube-api-access-jxzwc\") pod \"6322aaa2-fb86-4b69-b178-717447df7868\" (UID: \"6322aaa2-fb86-4b69-b178-717447df7868\") "
Apr 17 00:01:02.609855 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:02.609832 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6322aaa2-fb86-4b69-b178-717447df7868-kube-api-access-jxzwc" (OuterVolumeSpecName: "kube-api-access-jxzwc") pod "6322aaa2-fb86-4b69-b178-717447df7868" (UID: "6322aaa2-fb86-4b69-b178-717447df7868"). InnerVolumeSpecName "kube-api-access-jxzwc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 00:01:02.708507 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:02.708471 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jxzwc\" (UniqueName: \"kubernetes.io/projected/6322aaa2-fb86-4b69-b178-717447df7868-kube-api-access-jxzwc\") on node \"ip-10-0-128-98.ec2.internal\" DevicePath \"\""
Apr 17 00:01:03.466929 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:03.466890 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606400-dmsdh" event={"ID":"6322aaa2-fb86-4b69-b178-717447df7868","Type":"ContainerDied","Data":"2dc13d9bc4b434ebdb214c27f4ded57842b3d52a875cf9dcd3c681be82daecce"}
Apr 17 00:01:03.466929 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:03.466917 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29606400-dmsdh"
Apr 17 00:01:03.499260 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:03.499229 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606400-dmsdh"]
Apr 17 00:01:03.501014 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:03.500994 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606400-dmsdh"]
Apr 17 00:01:04.240048 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.239974 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw"]
Apr 17 00:01:04.299098 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.299060 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw"]
Apr 17 00:01:04.299270 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.299150 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw"
Apr 17 00:01:04.301684 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.301663 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\""
Apr 17 00:01:04.319934 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.319911 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1e73630f-a6ce-4465-9737-70bb0108a35f-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw\" (UID: \"1e73630f-a6ce-4465-9737-70bb0108a35f\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw"
Apr 17 00:01:04.320056 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.319946 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh4js\" (UniqueName: \"kubernetes.io/projected/1e73630f-a6ce-4465-9737-70bb0108a35f-kube-api-access-qh4js\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw\" (UID: \"1e73630f-a6ce-4465-9737-70bb0108a35f\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw"
Apr 17 00:01:04.320056 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.320017 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1e73630f-a6ce-4465-9737-70bb0108a35f-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw\" (UID: \"1e73630f-a6ce-4465-9737-70bb0108a35f\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw"
Apr 17 00:01:04.320172 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.320078 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1e73630f-a6ce-4465-9737-70bb0108a35f-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw\" (UID: \"1e73630f-a6ce-4465-9737-70bb0108a35f\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw"
Apr 17 00:01:04.320172 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.320125 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1e73630f-a6ce-4465-9737-70bb0108a35f-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw\" (UID: \"1e73630f-a6ce-4465-9737-70bb0108a35f\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw"
Apr 17 00:01:04.320298 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.320176 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1e73630f-a6ce-4465-9737-70bb0108a35f-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw\" (UID: \"1e73630f-a6ce-4465-9737-70bb0108a35f\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw"
Apr 17 00:01:04.352710 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.352684 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6322aaa2-fb86-4b69-b178-717447df7868" path="/var/lib/kubelet/pods/6322aaa2-fb86-4b69-b178-717447df7868/volumes"
Apr 17 00:01:04.421012 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.420983 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1e73630f-a6ce-4465-9737-70bb0108a35f-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw\" (UID: \"1e73630f-a6ce-4465-9737-70bb0108a35f\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw"
\"kubernetes.io/secret/1e73630f-a6ce-4465-9737-70bb0108a35f-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw\" (UID: \"1e73630f-a6ce-4465-9737-70bb0108a35f\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw" Apr 17 00:01:04.421169 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.421025 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1e73630f-a6ce-4465-9737-70bb0108a35f-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw\" (UID: \"1e73630f-a6ce-4465-9737-70bb0108a35f\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw" Apr 17 00:01:04.421169 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.421065 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1e73630f-a6ce-4465-9737-70bb0108a35f-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw\" (UID: \"1e73630f-a6ce-4465-9737-70bb0108a35f\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw" Apr 17 00:01:04.421169 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.421084 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qh4js\" (UniqueName: \"kubernetes.io/projected/1e73630f-a6ce-4465-9737-70bb0108a35f-kube-api-access-qh4js\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw\" (UID: \"1e73630f-a6ce-4465-9737-70bb0108a35f\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw" Apr 17 00:01:04.421169 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.421110 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1e73630f-a6ce-4465-9737-70bb0108a35f-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw\" (UID: \"1e73630f-a6ce-4465-9737-70bb0108a35f\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw" Apr 17 00:01:04.421169 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.421145 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1e73630f-a6ce-4465-9737-70bb0108a35f-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw\" (UID: \"1e73630f-a6ce-4465-9737-70bb0108a35f\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw" Apr 17 00:01:04.421583 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.421556 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1e73630f-a6ce-4465-9737-70bb0108a35f-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw\" (UID: \"1e73630f-a6ce-4465-9737-70bb0108a35f\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw" Apr 17 00:01:04.421674 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.421593 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1e73630f-a6ce-4465-9737-70bb0108a35f-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw\" (UID: \"1e73630f-a6ce-4465-9737-70bb0108a35f\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw" Apr 17 00:01:04.421674 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.421621 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/1e73630f-a6ce-4465-9737-70bb0108a35f-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw\" (UID: \"1e73630f-a6ce-4465-9737-70bb0108a35f\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw" Apr 17 00:01:04.423401 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.423377 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1e73630f-a6ce-4465-9737-70bb0108a35f-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw\" (UID: \"1e73630f-a6ce-4465-9737-70bb0108a35f\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw" Apr 17 00:01:04.423523 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.423506 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1e73630f-a6ce-4465-9737-70bb0108a35f-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw\" (UID: \"1e73630f-a6ce-4465-9737-70bb0108a35f\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw" Apr 17 00:01:04.435035 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.435006 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh4js\" (UniqueName: \"kubernetes.io/projected/1e73630f-a6ce-4465-9737-70bb0108a35f-kube-api-access-qh4js\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw\" (UID: \"1e73630f-a6ce-4465-9737-70bb0108a35f\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw" Apr 17 00:01:04.440792 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.440762 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-87xjb"] Apr 17 00:01:04.452665 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.452628 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-87xjb" Apr 17 00:01:04.454805 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.454777 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-87xjb"] Apr 17 00:01:04.454899 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.454857 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 17 00:01:04.522130 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.522041 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/40fa06a7-6f3f-49c6-9c7f-015774a95220-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-87xjb\" (UID: \"40fa06a7-6f3f-49c6-9c7f-015774a95220\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-87xjb" Apr 17 00:01:04.522130 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.522082 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/40fa06a7-6f3f-49c6-9c7f-015774a95220-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-87xjb\" (UID: \"40fa06a7-6f3f-49c6-9c7f-015774a95220\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-87xjb" Apr 17 00:01:04.522360 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.522150 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/40fa06a7-6f3f-49c6-9c7f-015774a95220-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-87xjb\" (UID: \"40fa06a7-6f3f-49c6-9c7f-015774a95220\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-87xjb" Apr 17 00:01:04.522360 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.522264 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/40fa06a7-6f3f-49c6-9c7f-015774a95220-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-87xjb\" (UID: \"40fa06a7-6f3f-49c6-9c7f-015774a95220\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-87xjb" Apr 17 00:01:04.522360 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.522285 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swf2j\" (UniqueName: \"kubernetes.io/projected/40fa06a7-6f3f-49c6-9c7f-015774a95220-kube-api-access-swf2j\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-87xjb\" (UID: \"40fa06a7-6f3f-49c6-9c7f-015774a95220\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-87xjb" Apr 17 00:01:04.522360 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.522313 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/40fa06a7-6f3f-49c6-9c7f-015774a95220-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-87xjb\" (UID: \"40fa06a7-6f3f-49c6-9c7f-015774a95220\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-87xjb" Apr 17 00:01:04.609107 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.609073 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw" Apr 17 00:01:04.623379 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.623347 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/40fa06a7-6f3f-49c6-9c7f-015774a95220-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-87xjb\" (UID: \"40fa06a7-6f3f-49c6-9c7f-015774a95220\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-87xjb" Apr 17 00:01:04.623505 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.623401 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/40fa06a7-6f3f-49c6-9c7f-015774a95220-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-87xjb\" (UID: \"40fa06a7-6f3f-49c6-9c7f-015774a95220\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-87xjb" Apr 17 00:01:04.623505 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.623444 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/40fa06a7-6f3f-49c6-9c7f-015774a95220-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-87xjb\" (UID: \"40fa06a7-6f3f-49c6-9c7f-015774a95220\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-87xjb" Apr 17 00:01:04.623635 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.623511 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/40fa06a7-6f3f-49c6-9c7f-015774a95220-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-87xjb\" (UID: \"40fa06a7-6f3f-49c6-9c7f-015774a95220\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-87xjb" Apr 17 00:01:04.623635 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.623536 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-swf2j\" (UniqueName: \"kubernetes.io/projected/40fa06a7-6f3f-49c6-9c7f-015774a95220-kube-api-access-swf2j\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-87xjb\" (UID: \"40fa06a7-6f3f-49c6-9c7f-015774a95220\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-87xjb" Apr 17 00:01:04.623635 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.623581 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/40fa06a7-6f3f-49c6-9c7f-015774a95220-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-87xjb\" (UID: \"40fa06a7-6f3f-49c6-9c7f-015774a95220\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-87xjb" Apr 17 00:01:04.623834 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.623813 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/40fa06a7-6f3f-49c6-9c7f-015774a95220-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-87xjb\" (UID: \"40fa06a7-6f3f-49c6-9c7f-015774a95220\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-87xjb" Apr 17 00:01:04.624235 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.624212 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/40fa06a7-6f3f-49c6-9c7f-015774a95220-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-87xjb\" (UID: \"40fa06a7-6f3f-49c6-9c7f-015774a95220\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-87xjb" Apr 17 00:01:04.624361 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.624341 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/40fa06a7-6f3f-49c6-9c7f-015774a95220-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-87xjb\" (UID: \"40fa06a7-6f3f-49c6-9c7f-015774a95220\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-87xjb" Apr 17 00:01:04.626146 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.626125 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/40fa06a7-6f3f-49c6-9c7f-015774a95220-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-87xjb\" (UID: \"40fa06a7-6f3f-49c6-9c7f-015774a95220\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-87xjb" Apr 17 00:01:04.626247 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.626160 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/40fa06a7-6f3f-49c6-9c7f-015774a95220-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-87xjb\" (UID: \"40fa06a7-6f3f-49c6-9c7f-015774a95220\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-87xjb" Apr 17 00:01:04.630595 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.630573 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-swf2j\" (UniqueName: \"kubernetes.io/projected/40fa06a7-6f3f-49c6-9c7f-015774a95220-kube-api-access-swf2j\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-87xjb\" (UID: \"40fa06a7-6f3f-49c6-9c7f-015774a95220\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-87xjb" Apr 17 00:01:04.726830 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.726803 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw"] Apr 17 00:01:04.728871 ip-10-0-128-98 kubenswrapper[2562]: W0417 00:01:04.728846 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e73630f_a6ce_4465_9737_70bb0108a35f.slice/crio-8cdc06645196f8631ff52b64a47941700f5fae6c1aa695de6f549855bc45d31e WatchSource:0}: Error finding container 8cdc06645196f8631ff52b64a47941700f5fae6c1aa695de6f549855bc45d31e: Status 404 returned error can't find the container with id 8cdc06645196f8631ff52b64a47941700f5fae6c1aa695de6f549855bc45d31e Apr 17 00:01:04.730545 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.730530 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 00:01:04.764884 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.764861 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-87xjb" Apr 17 00:01:04.890609 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:04.890585 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-87xjb"] Apr 17 00:01:04.892096 ip-10-0-128-98 kubenswrapper[2562]: W0417 00:01:04.892069 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40fa06a7_6f3f_49c6_9c7f_015774a95220.slice/crio-7504c66ce71817853e8a75d2297f0717f5eb6728fc39d4208d292a55299e66cd WatchSource:0}: Error finding container 7504c66ce71817853e8a75d2297f0717f5eb6728fc39d4208d292a55299e66cd: Status 404 returned error can't find the container with id 7504c66ce71817853e8a75d2297f0717f5eb6728fc39d4208d292a55299e66cd Apr 17 00:01:05.475275 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:05.475221 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-87xjb" event={"ID":"40fa06a7-6f3f-49c6-9c7f-015774a95220","Type":"ContainerStarted","Data":"be6c456cc8b0c76279121b52729316b479a10368778832cad5415a83a4256151"} Apr 17 00:01:05.475275 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:05.475270 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-87xjb" event={"ID":"40fa06a7-6f3f-49c6-9c7f-015774a95220","Type":"ContainerStarted","Data":"7504c66ce71817853e8a75d2297f0717f5eb6728fc39d4208d292a55299e66cd"} Apr 17 00:01:05.476702 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:05.476678 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw" event={"ID":"1e73630f-a6ce-4465-9737-70bb0108a35f","Type":"ContainerStarted","Data":"f54d2f6b37036731617b26a06cd3d55b80b8fd04b8e7bf4e2753baca2a91aa26"} Apr 17 00:01:05.476702 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:05.476704 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw" event={"ID":"1e73630f-a6ce-4465-9737-70bb0108a35f","Type":"ContainerStarted","Data":"8cdc06645196f8631ff52b64a47941700f5fae6c1aa695de6f549855bc45d31e"} Apr 17 00:01:10.495918 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:10.495832 2562 generic.go:358] "Generic (PLEG): container finished" podID="1e73630f-a6ce-4465-9737-70bb0108a35f" containerID="f54d2f6b37036731617b26a06cd3d55b80b8fd04b8e7bf4e2753baca2a91aa26" exitCode=0 Apr 17 00:01:10.495918 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:10.495892 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw" event={"ID":"1e73630f-a6ce-4465-9737-70bb0108a35f","Type":"ContainerDied","Data":"f54d2f6b37036731617b26a06cd3d55b80b8fd04b8e7bf4e2753baca2a91aa26"} Apr 17 00:01:11.504705 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:11.504665 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw" event={"ID":"1e73630f-a6ce-4465-9737-70bb0108a35f","Type":"ContainerStarted","Data":"d3d2701cc69d8e1360ff0cf7b620f3c384ff4eb912e347151a9baf8034e8c7a7"} Apr 17 00:01:11.505167 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:11.504987 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw" Apr 17 00:01:11.522772 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:11.522724 2562 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw" podStartSLOduration=7.349941524 podStartE2EDuration="7.522709876s" podCreationTimestamp="2026-04-17 00:01:04 +0000 UTC" firstStartedPulling="2026-04-17 00:01:10.496692756 +0000 UTC m=+642.705457405" lastFinishedPulling="2026-04-17 00:01:10.669461105 +0000 UTC m=+642.878225757" observedRunningTime="2026-04-17 00:01:11.520488002 +0000 UTC m=+643.729252671" watchObservedRunningTime="2026-04-17 00:01:11.522709876 +0000 UTC m=+643.731474546" Apr 17 00:01:13.513047 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:13.513015 2562 generic.go:358] "Generic (PLEG): container finished" podID="40fa06a7-6f3f-49c6-9c7f-015774a95220" containerID="be6c456cc8b0c76279121b52729316b479a10368778832cad5415a83a4256151" exitCode=0 Apr 17 00:01:13.513402 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:13.513054 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-87xjb" event={"ID":"40fa06a7-6f3f-49c6-9c7f-015774a95220","Type":"ContainerDied","Data":"be6c456cc8b0c76279121b52729316b479a10368778832cad5415a83a4256151"} Apr 17 00:01:14.518414 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:14.518380 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-87xjb" event={"ID":"40fa06a7-6f3f-49c6-9c7f-015774a95220","Type":"ContainerStarted","Data":"1c69bb10868d8907473c25b06e2c01b72b268e67325790feabf603230a64d253"} Apr 17 00:01:14.518877 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:14.518586 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-87xjb" Apr 17 00:01:14.534563 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:14.534498 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-87xjb" podStartSLOduration=10.374050481 podStartE2EDuration="10.534484903s" podCreationTimestamp="2026-04-17 00:01:04 +0000 UTC" firstStartedPulling="2026-04-17 00:01:13.513680712 +0000 UTC m=+645.722445360" lastFinishedPulling="2026-04-17 00:01:13.674115134 +0000 UTC m=+645.882879782" observedRunningTime="2026-04-17 00:01:14.533325798 +0000 UTC m=+646.742090467" watchObservedRunningTime="2026-04-17 00:01:14.534484903 +0000 UTC m=+646.743249572" Apr 17 00:01:22.521685 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:22.521653 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw" Apr 17 00:01:25.534253 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:01:25.534221 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-87xjb" Apr 17 00:05:28.313350 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:05:28.313269 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrpzx_5fddaa09-016b-490d-a9ab-9d50ff167b22/console-operator/2.log" Apr 17 00:05:28.316156 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:05:28.316135 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrpzx_5fddaa09-016b-490d-a9ab-9d50ff167b22/console-operator/2.log" Apr 17 00:10:28.342343 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:10:28.342314 2562 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrpzx_5fddaa09-016b-490d-a9ab-9d50ff167b22/console-operator/2.log" Apr 17 00:10:28.347397 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:10:28.347375 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrpzx_5fddaa09-016b-490d-a9ab-9d50ff167b22/console-operator/2.log" Apr 17 00:15:28.368177 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:15:28.368140 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrpzx_5fddaa09-016b-490d-a9ab-9d50ff167b22/console-operator/2.log" Apr 17 00:15:28.374716 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:15:28.374689 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrpzx_5fddaa09-016b-490d-a9ab-9d50ff167b22/console-operator/2.log" Apr 17 00:20:28.393146 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:20:28.393117 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrpzx_5fddaa09-016b-490d-a9ab-9d50ff167b22/console-operator/2.log" Apr 17 00:20:28.400764 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:20:28.400742 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrpzx_5fddaa09-016b-490d-a9ab-9d50ff167b22/console-operator/2.log" Apr 17 00:24:31.891511 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:31.891483 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-bf54d8685-5pzhh_dabfacec-d089-4662-a3f0-dfad7cab3b53/manager/0.log" Apr 17 00:24:34.280971 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:34.280945 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-zhlq6_4048cf65-31fc-4b21-869b-07bdf03f9acb/manager/0.log" Apr 17 00:24:34.837399 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:34.837370 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-5cd78b4564-6xccd_2b5911e9-0274-4fa8-b0b5-12d942461233/kube-auth-proxy/0.log" Apr 17 00:24:35.730080 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:35.730047 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-6d5965695-zcj96_cd0aa85c-579b-4ee5-9346-65425ddde1b9/storage-initializer/0.log" Apr 17 00:24:35.737459 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:35.737442 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-6d5965695-zcj96_cd0aa85c-579b-4ee5-9346-65425ddde1b9/main/0.log" Apr 17 00:24:35.959184 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:35.959157 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw_1e73630f-a6ce-4465-9737-70bb0108a35f/main/0.log" Apr 17 00:24:35.965325 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:35.965305 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-7889cd8c78-flwlw_1e73630f-a6ce-4465-9737-70bb0108a35f/storage-initializer/0.log" Apr 17 00:24:36.074691 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:36.074634 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-f5df4587b-87xjb_40fa06a7-6f3f-49c6-9c7f-015774a95220/storage-initializer/0.log" 
Apr 17 00:24:36.081840 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:36.081817 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-f5df4587b-87xjb_40fa06a7-6f3f-49c6-9c7f-015774a95220/main/0.log"
Apr 17 00:24:42.718694 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:42.718665 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-rnrxc_78ecb6de-eebd-4a1c-a6d7-206ea9999103/global-pull-secret-syncer/0.log"
Apr 17 00:24:42.827443 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:42.827414 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-h2hdz_e6f655b8-f886-471e-9fd2-4685907346a7/konnectivity-agent/0.log"
Apr 17 00:24:42.871254 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:42.871231 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-98.ec2.internal_1143ec080bc3618a70cd0c3fccc6942e/haproxy/0.log"
Apr 17 00:24:47.661992 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:47.661914 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-zhlq6_4048cf65-31fc-4b21-869b-07bdf03f9acb/manager/0.log"
Apr 17 00:24:49.368556 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:49.368486 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-xcgwl_f349e78a-8ba4-44d0-86d8-b96de7477daa/kube-state-metrics/0.log"
Apr 17 00:24:49.398798 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:49.398774 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-xcgwl_f349e78a-8ba4-44d0-86d8-b96de7477daa/kube-rbac-proxy-main/0.log"
Apr 17 00:24:49.425617 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:49.425595 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-xcgwl_f349e78a-8ba4-44d0-86d8-b96de7477daa/kube-rbac-proxy-self/0.log"
Apr 17 00:24:49.517181 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:49.517154 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bf25v_d6890379-66f0-4714-b0f4-da3de91cbdc8/node-exporter/0.log"
Apr 17 00:24:49.538817 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:49.538795 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bf25v_d6890379-66f0-4714-b0f4-da3de91cbdc8/kube-rbac-proxy/0.log"
Apr 17 00:24:49.562880 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:49.562860 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bf25v_d6890379-66f0-4714-b0f4-da3de91cbdc8/init-textfile/0.log"
Apr 17 00:24:49.740585 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:49.740565 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-c4nhb_75c3d93e-0ccd-45ed-87ab-1abfec9fc614/kube-rbac-proxy-main/0.log"
Apr 17 00:24:49.769111 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:49.769091 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-c4nhb_75c3d93e-0ccd-45ed-87ab-1abfec9fc614/kube-rbac-proxy-self/0.log"
Apr 17 00:24:49.796750 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:49.796727 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-c4nhb_75c3d93e-0ccd-45ed-87ab-1abfec9fc614/openshift-state-metrics/0.log"
Apr 17 00:24:49.841684 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:49.841662 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_12db5959-3135-4d08-a456-a00777f84b9e/prometheus/0.log"
Apr 17 00:24:49.862048 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:49.862027 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_12db5959-3135-4d08-a456-a00777f84b9e/config-reloader/0.log"
Apr 17 00:24:49.904236 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:49.904216 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_12db5959-3135-4d08-a456-a00777f84b9e/thanos-sidecar/0.log"
Apr 17 00:24:49.926288 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:49.926267 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_12db5959-3135-4d08-a456-a00777f84b9e/kube-rbac-proxy-web/0.log"
Apr 17 00:24:49.948904 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:49.948889 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_12db5959-3135-4d08-a456-a00777f84b9e/kube-rbac-proxy/0.log"
Apr 17 00:24:49.971400 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:49.971378 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_12db5959-3135-4d08-a456-a00777f84b9e/kube-rbac-proxy-thanos/0.log"
Apr 17 00:24:49.993024 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:49.992973 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_12db5959-3135-4d08-a456-a00777f84b9e/init-config-reloader/0.log"
Apr 17 00:24:50.022620 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:50.022600 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-dh4k6_04c6d30a-a073-440c-bd31-f046f4e9c445/prometheus-operator/0.log"
Apr 17 00:24:50.048419 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:50.048399 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-dh4k6_04c6d30a-a073-440c-bd31-f046f4e9c445/kube-rbac-proxy/0.log"
Apr 17 00:24:50.100715 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:50.100677 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-59ff69d868-kzfwh_d4d50529-d814-4e89-9eaf-fae37ca01719/telemeter-client/0.log"
Apr 17 00:24:50.122598 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:50.122573 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-59ff69d868-kzfwh_d4d50529-d814-4e89-9eaf-fae37ca01719/reload/0.log"
Apr 17 00:24:50.144773 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:50.144748 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-59ff69d868-kzfwh_d4d50529-d814-4e89-9eaf-fae37ca01719/kube-rbac-proxy/0.log"
Apr 17 00:24:50.178557 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:50.178520 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-55898d7b96-zltwd_d6a1a371-ed23-4e03-8408-d1ea20b07e24/thanos-query/0.log"
Apr 17 00:24:50.198794 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:50.198771 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-55898d7b96-zltwd_d6a1a371-ed23-4e03-8408-d1ea20b07e24/kube-rbac-proxy-web/0.log"
Apr 17 00:24:50.228399 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:50.228378 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-55898d7b96-zltwd_d6a1a371-ed23-4e03-8408-d1ea20b07e24/kube-rbac-proxy/0.log"
Apr 17 00:24:50.244741 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:50.244685 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-55898d7b96-zltwd_d6a1a371-ed23-4e03-8408-d1ea20b07e24/prom-label-proxy/0.log"
Apr 17 00:24:50.266938 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:50.266906 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-55898d7b96-zltwd_d6a1a371-ed23-4e03-8408-d1ea20b07e24/kube-rbac-proxy-rules/0.log"
Apr 17 00:24:50.288693 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:50.288676 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-55898d7b96-zltwd_d6a1a371-ed23-4e03-8408-d1ea20b07e24/kube-rbac-proxy-metrics/0.log"
Apr 17 00:24:51.618912 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:51.618875 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vl859/perf-node-gather-daemonset-5vlfg"]
Apr 17 00:24:51.622462 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:51.622438 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vl859/perf-node-gather-daemonset-5vlfg"
Apr 17 00:24:51.624718 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:51.624695 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vl859\"/\"openshift-service-ca.crt\""
Apr 17 00:24:51.625412 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:51.625392 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vl859\"/\"kube-root-ca.crt\""
Apr 17 00:24:51.625504 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:51.625454 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-vl859\"/\"default-dockercfg-cqxln\""
Apr 17 00:24:51.628103 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:51.628083 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vl859/perf-node-gather-daemonset-5vlfg"]
Apr 17 00:24:51.653463 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:51.653441 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3d2d92c4-66bf-40f7-b262-636e95a41f22-lib-modules\") pod \"perf-node-gather-daemonset-5vlfg\" (UID: \"3d2d92c4-66bf-40f7-b262-636e95a41f22\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-5vlfg"
Apr 17 00:24:51.653553 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:51.653480 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzvq4\" (UniqueName: \"kubernetes.io/projected/3d2d92c4-66bf-40f7-b262-636e95a41f22-kube-api-access-jzvq4\") pod \"perf-node-gather-daemonset-5vlfg\" (UID: \"3d2d92c4-66bf-40f7-b262-636e95a41f22\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-5vlfg"
Apr 17 00:24:51.653553 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:51.653503 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3d2d92c4-66bf-40f7-b262-636e95a41f22-podres\") pod \"perf-node-gather-daemonset-5vlfg\" (UID: \"3d2d92c4-66bf-40f7-b262-636e95a41f22\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-5vlfg"
Apr 17 00:24:51.653628 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:51.653557 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3d2d92c4-66bf-40f7-b262-636e95a41f22-sys\") pod \"perf-node-gather-daemonset-5vlfg\" (UID: \"3d2d92c4-66bf-40f7-b262-636e95a41f22\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-5vlfg"
Apr 17 00:24:51.653662 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:51.653624 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3d2d92c4-66bf-40f7-b262-636e95a41f22-proc\") pod \"perf-node-gather-daemonset-5vlfg\" (UID: \"3d2d92c4-66bf-40f7-b262-636e95a41f22\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-5vlfg"
Apr 17 00:24:51.754648 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:51.754619 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3d2d92c4-66bf-40f7-b262-636e95a41f22-proc\") pod \"perf-node-gather-daemonset-5vlfg\" (UID: \"3d2d92c4-66bf-40f7-b262-636e95a41f22\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-5vlfg"
Apr 17 00:24:51.754753 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:51.754674 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3d2d92c4-66bf-40f7-b262-636e95a41f22-lib-modules\") pod \"perf-node-gather-daemonset-5vlfg\" (UID: \"3d2d92c4-66bf-40f7-b262-636e95a41f22\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-5vlfg"
Apr 17 00:24:51.754753 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:51.754698 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jzvq4\" (UniqueName: \"kubernetes.io/projected/3d2d92c4-66bf-40f7-b262-636e95a41f22-kube-api-access-jzvq4\") pod \"perf-node-gather-daemonset-5vlfg\" (UID: \"3d2d92c4-66bf-40f7-b262-636e95a41f22\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-5vlfg"
Apr 17 00:24:51.754753 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:51.754704 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3d2d92c4-66bf-40f7-b262-636e95a41f22-proc\") pod \"perf-node-gather-daemonset-5vlfg\" (UID: \"3d2d92c4-66bf-40f7-b262-636e95a41f22\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-5vlfg"
Apr 17 00:24:51.754753 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:51.754714 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3d2d92c4-66bf-40f7-b262-636e95a41f22-podres\") pod \"perf-node-gather-daemonset-5vlfg\" (UID: \"3d2d92c4-66bf-40f7-b262-636e95a41f22\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-5vlfg"
Apr 17 00:24:51.754922 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:51.754792 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3d2d92c4-66bf-40f7-b262-636e95a41f22-sys\") pod \"perf-node-gather-daemonset-5vlfg\" (UID: \"3d2d92c4-66bf-40f7-b262-636e95a41f22\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-5vlfg"
Apr 17 00:24:51.754922 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:51.754831 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3d2d92c4-66bf-40f7-b262-636e95a41f22-lib-modules\") pod \"perf-node-gather-daemonset-5vlfg\" (UID: \"3d2d92c4-66bf-40f7-b262-636e95a41f22\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-5vlfg"
Apr 17 00:24:51.754922 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:51.754823 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3d2d92c4-66bf-40f7-b262-636e95a41f22-podres\") pod \"perf-node-gather-daemonset-5vlfg\" (UID: \"3d2d92c4-66bf-40f7-b262-636e95a41f22\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-5vlfg"
Apr 17 00:24:51.754922 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:51.754876 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3d2d92c4-66bf-40f7-b262-636e95a41f22-sys\") pod \"perf-node-gather-daemonset-5vlfg\" (UID: \"3d2d92c4-66bf-40f7-b262-636e95a41f22\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-5vlfg"
Apr 17 00:24:51.762058 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:51.762031 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzvq4\" (UniqueName: \"kubernetes.io/projected/3d2d92c4-66bf-40f7-b262-636e95a41f22-kube-api-access-jzvq4\") pod \"perf-node-gather-daemonset-5vlfg\" (UID: \"3d2d92c4-66bf-40f7-b262-636e95a41f22\") " pod="openshift-must-gather-vl859/perf-node-gather-daemonset-5vlfg"
Apr 17 00:24:51.888247 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:51.888148 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrpzx_5fddaa09-016b-490d-a9ab-9d50ff167b22/console-operator/2.log"
Apr 17 00:24:51.892973 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:51.892956 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrpzx_5fddaa09-016b-490d-a9ab-9d50ff167b22/console-operator/3.log"
Apr 17 00:24:51.933198 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:51.933165 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vl859/perf-node-gather-daemonset-5vlfg"
Apr 17 00:24:52.050867 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:52.050842 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vl859/perf-node-gather-daemonset-5vlfg"]
Apr 17 00:24:52.052990 ip-10-0-128-98 kubenswrapper[2562]: W0417 00:24:52.052959 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3d2d92c4_66bf_40f7_b262_636e95a41f22.slice/crio-da0a2800fcd1a2dc1a97845134d91d5629020417afd1f3d475b0cc37958b285a WatchSource:0}: Error finding container da0a2800fcd1a2dc1a97845134d91d5629020417afd1f3d475b0cc37958b285a: Status 404 returned error can't find the container with id da0a2800fcd1a2dc1a97845134d91d5629020417afd1f3d475b0cc37958b285a
Apr 17 00:24:52.054729 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:52.054708 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 00:24:52.333478 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:52.333444 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vl859/perf-node-gather-daemonset-5vlfg" event={"ID":"3d2d92c4-66bf-40f7-b262-636e95a41f22","Type":"ContainerStarted","Data":"0c7a6800bd78d5e16eeb702395e29dc237a63154755507223d9d56b5ff48b703"}
Apr 17 00:24:52.333478 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:52.333479 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vl859/perf-node-gather-daemonset-5vlfg" event={"ID":"3d2d92c4-66bf-40f7-b262-636e95a41f22","Type":"ContainerStarted","Data":"da0a2800fcd1a2dc1a97845134d91d5629020417afd1f3d475b0cc37958b285a"}
Apr 17 00:24:52.333715 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:52.333509 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-vl859/perf-node-gather-daemonset-5vlfg"
Apr 17 00:24:52.349484 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:52.349441 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vl859/perf-node-gather-daemonset-5vlfg" podStartSLOduration=1.349428782 podStartE2EDuration="1.349428782s" podCreationTimestamp="2026-04-17 00:24:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 00:24:52.346914127 +0000 UTC m=+2064.555678794" watchObservedRunningTime="2026-04-17 00:24:52.349428782 +0000 UTC m=+2064.558193484"
Apr 17 00:24:53.641329 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:53.641305 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-dxzbw_66c9f5e2-4d93-4cf6-96de-a4ade7175b77/dns/0.log"
Apr 17 00:24:53.662086 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:53.662065 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-dxzbw_66c9f5e2-4d93-4cf6-96de-a4ade7175b77/kube-rbac-proxy/0.log"
Apr 17 00:24:53.785684 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:53.785651 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mhvhh_e55c5371-047f-4464-8977-501dfa689dd1/dns-node-resolver/0.log"
Apr 17 00:24:54.242860 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:54.242831 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-pruner-29606400-kq2wv_d9028d25-1c06-432c-b3ac-b1bb4670b579/image-pruner/0.log"
Apr 17 00:24:54.341010 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:54.340982 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-hb7s9_ac3d5de2-ce49-4327-89f8-f4642c3bc2f3/node-ca/0.log"
Apr 17 00:24:55.239021 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:55.238981 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-5cd78b4564-6xccd_2b5911e9-0274-4fa8-b0b5-12d942461233/kube-auth-proxy/0.log"
Apr 17 00:24:55.860753 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:55.860702 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-495hb_6d46b010-8236-429d-af09-8fb8c4618d50/serve-healthcheck-canary/0.log"
Apr 17 00:24:56.387508 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:56.387480 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-prmqz_e44cc99f-6cf4-4c00-ac07-e08e682c6db6/insights-operator/0.log"
Apr 17 00:24:56.388991 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:56.388975 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-prmqz_e44cc99f-6cf4-4c00-ac07-e08e682c6db6/insights-operator/1.log"
Apr 17 00:24:56.526130 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:56.526105 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-cqkvw_53632d89-82ac-405b-8bbf-09bc404147e0/kube-rbac-proxy/0.log"
Apr 17 00:24:56.547244 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:56.547204 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-cqkvw_53632d89-82ac-405b-8bbf-09bc404147e0/exporter/0.log"
Apr 17 00:24:56.567383 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:56.567356 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-cqkvw_53632d89-82ac-405b-8bbf-09bc404147e0/extractor/0.log"
Apr 17 00:24:58.347971 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:58.347946 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-vl859/perf-node-gather-daemonset-5vlfg"
Apr 17 00:24:58.493755 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:58.493714 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-bf54d8685-5pzhh_dabfacec-d089-4662-a3f0-dfad7cab3b53/manager/0.log"
Apr 17 00:24:59.681579 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:59.681552 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-6b6988ccb7-qs2zh_d01bbe12-b55d-461e-8495-1be44d06bbd3/manager/0.log"
Apr 17 00:24:59.732989 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:24:59.732966 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-4p47t_f3068172-bd48-41a9-82aa-a6114fc0fd73/openshift-lws-operator/0.log"
Apr 17 00:25:04.161157 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:25:04.161125 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-v8hbv_26f2616f-10db-4f0c-8b82-86c22caa6d59/migrator/0.log"
Apr 17 00:25:04.181951 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:25:04.181921 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-v8hbv_26f2616f-10db-4f0c-8b82-86c22caa6d59/graceful-termination/0.log"
Apr 17 00:25:04.547611 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:25:04.547582 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-c7mwx_c2e5d2fe-09a0-4110-9c80-2ae937a2a115/kube-storage-version-migrator-operator/1.log"
Apr 17 00:25:04.548408 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:25:04.548390 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-c7mwx_c2e5d2fe-09a0-4110-9c80-2ae937a2a115/kube-storage-version-migrator-operator/0.log"
Apr 17 00:25:05.776084 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:25:05.776055 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kwxk9_987f3171-af5b-41d5-91e1-d31f667a8755/kube-multus-additional-cni-plugins/0.log"
Apr 17 00:25:05.798906 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:25:05.798882 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kwxk9_987f3171-af5b-41d5-91e1-d31f667a8755/egress-router-binary-copy/0.log"
Apr 17 00:25:05.820618 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:25:05.820594 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kwxk9_987f3171-af5b-41d5-91e1-d31f667a8755/cni-plugins/0.log"
Apr 17 00:25:05.841825 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:25:05.841801 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kwxk9_987f3171-af5b-41d5-91e1-d31f667a8755/bond-cni-plugin/0.log"
Apr 17 00:25:05.863250 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:25:05.863222 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kwxk9_987f3171-af5b-41d5-91e1-d31f667a8755/routeoverride-cni/0.log"
Apr 17 00:25:05.884438 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:25:05.884413 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kwxk9_987f3171-af5b-41d5-91e1-d31f667a8755/whereabouts-cni-bincopy/0.log"
Apr 17 00:25:05.905818 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:25:05.905797 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kwxk9_987f3171-af5b-41d5-91e1-d31f667a8755/whereabouts-cni/0.log"
Apr 17 00:25:06.104251 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:25:06.104172 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mxms2_e3f26c03-4fef-458e-8cf1-6d18222c3545/kube-multus/0.log"
Apr 17 00:25:06.164483 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:25:06.164459 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-f6rhw_86301427-2e66-4c3a-ab22-55ef8ddc5580/network-metrics-daemon/0.log"
Apr 17 00:25:06.183915 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:25:06.183898 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-f6rhw_86301427-2e66-4c3a-ab22-55ef8ddc5580/kube-rbac-proxy/0.log"
Apr 17 00:25:07.258851 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:25:07.258824 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-frzl2_94b51208-78d1-423e-a0fd-88a34e175744/ovn-controller/0.log"
Apr 17 00:25:07.288712 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:25:07.288689 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-frzl2_94b51208-78d1-423e-a0fd-88a34e175744/ovn-acl-logging/0.log"
Apr 17 00:25:07.307173 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:25:07.307155 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-frzl2_94b51208-78d1-423e-a0fd-88a34e175744/kube-rbac-proxy-node/0.log"
Apr 17 00:25:07.327964 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:25:07.327942 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-frzl2_94b51208-78d1-423e-a0fd-88a34e175744/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 00:25:07.349082 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:25:07.349066 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-frzl2_94b51208-78d1-423e-a0fd-88a34e175744/northd/0.log"
Apr 17 00:25:07.371838 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:25:07.371821 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-frzl2_94b51208-78d1-423e-a0fd-88a34e175744/nbdb/0.log"
Apr 17 00:25:07.393050 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:25:07.393031 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-frzl2_94b51208-78d1-423e-a0fd-88a34e175744/sbdb/0.log"
Apr 17 00:25:07.483262 ip-10-0-128-98 kubenswrapper[2562]: I0417 00:25:07.483242 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-frzl2_94b51208-78d1-423e-a0fd-88a34e175744/ovnkube-controller/0.log"