Apr 23 16:32:29.126509 ip-10-0-128-102 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 23 16:32:29.126537 ip-10-0-128-102 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 23 16:32:29.126546 ip-10-0-128-102 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 23 16:32:29.126787 ip-10-0-128-102 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 23 16:32:39.364617 ip-10-0-128-102 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 23 16:32:39.364634 ip-10-0-128-102 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 1756be821be842cf89b5979353f98480 --
Apr 23 16:35:14.528942 ip-10-0-128-102 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 16:35:15.039119 ip-10-0-128-102 kubenswrapper[2569]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 16:35:15.039119 ip-10-0-128-102 kubenswrapper[2569]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 16:35:15.039119 ip-10-0-128-102 kubenswrapper[2569]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 16:35:15.039119 ip-10-0-128-102 kubenswrapper[2569]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 16:35:15.039119 ip-10-0-128-102 kubenswrapper[2569]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 16:35:15.039846 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.039187 2569 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
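The 16:32 start failures are environmental rather than configuration errors: the unit's EnvironmentFile and its crio.service dependency did not exist yet on that boot, so systemd marked the service failed with result 'resources'; both are present by the 16:35 boot. The deprecation warnings above all point at the same fix: move the flag values into the file passed to --config (/etc/kubernetes/kubelet.conf, per the FLAG dump below). A minimal KubeletConfiguration sketch of the config-file equivalents, reusing values from that FLAG dump; the evictionHard threshold is illustrative (the kubelet default), not taken from this log, and --pod-infra-container-image has no config-file field (for CRI-O it would instead move to pause_image in the runtime's own config):

    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    # config-file equivalent of --container-runtime-endpoint=/var/run/crio/crio.sock
    containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
    # config-file equivalent of --volume-plugin-dir
    volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
    # config-file equivalent of --system-reserved=cpu=500m,ephemeral-storage=1Gi,memory=1Gi
    systemReserved:
      cpu: 500m
      ephemeral-storage: 1Gi
      memory: 1Gi
    # --minimum-container-ttl-duration is dropped in favor of eviction thresholds;
    # this value is illustrative, not taken from this log
    evictionHard:
      memory.available: 100Mi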
Apr 23 16:35:15.042646 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042632 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:35:15.042646 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042646 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:35:15.042719 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042652 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 16:35:15.042719 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042667 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 16:35:15.042719 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042671 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:35:15.042719 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042675 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:35:15.042719 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042678 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:35:15.042719 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042681 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:35:15.042719 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042684 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:35:15.042719 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042687 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:35:15.042719 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042689 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:35:15.042719 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042692 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:35:15.042719 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042695 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:35:15.042719 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042698 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:35:15.042719 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042700 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:35:15.042719 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042703 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:35:15.042719 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042705 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:35:15.042719 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042708 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:35:15.042719 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042711 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:35:15.042719 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042714 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:35:15.042719 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042717 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:35:15.043172 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042720 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:35:15.043172 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042723 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:35:15.043172 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042726 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:35:15.043172 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042730 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:35:15.043172 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042734 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:35:15.043172 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042737 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:35:15.043172 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042739 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:35:15.043172 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042742 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:35:15.043172 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042745 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:35:15.043172 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042748 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:35:15.043172 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042750 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:35:15.043172 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042754 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:35:15.043172 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042757 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:35:15.043172 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042761 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:35:15.043172 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042763 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:35:15.043172 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042766 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:35:15.043172 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042769 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:35:15.043172 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042771 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:35:15.043172 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042774 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:35:15.043626 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042776 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:35:15.043626 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042779 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:35:15.043626 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042781 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:35:15.043626 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042784 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:35:15.043626 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042786 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:35:15.043626 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042789 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:35:15.043626 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042793 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:35:15.043626 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042795 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:35:15.043626 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042798 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:35:15.043626 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042800 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:35:15.043626 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042803 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:35:15.043626 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042805 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:35:15.043626 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042808 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:35:15.043626 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042810 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:35:15.043626 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042814 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:35:15.043626 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042817 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:35:15.043626 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042819 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:35:15.043626 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042822 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:35:15.043626 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042824 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:35:15.043626 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042827 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:35:15.044135 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042830 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:35:15.044135 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042832 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:35:15.044135 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042835 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:35:15.044135 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042837 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:35:15.044135 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042841 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:35:15.044135 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042843 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:35:15.044135 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042846 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:35:15.044135 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042848 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:35:15.044135 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042851 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:35:15.044135 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042853 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:35:15.044135 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042856 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:35:15.044135 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042858 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:35:15.044135 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042861 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:35:15.044135 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042864 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:35:15.044135 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042867 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:35:15.044135 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042869 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:35:15.044135 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042872 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:35:15.044135 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042874 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:35:15.044135 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042877 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:35:15.044135 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042879 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:35:15.044615 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042882 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:35:15.044615 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042884 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:35:15.044615 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042887 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:35:15.044615 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042889 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:35:15.044615 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042892 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:35:15.044615 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.042894 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:35:15.044615 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043293 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:35:15.044615 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043298 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:35:15.044615 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043302 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:35:15.044615 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043305 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:35:15.044615 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043308 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:35:15.044615 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043312 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:35:15.044615 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043314 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:35:15.044615 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043317 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:35:15.044615 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043319 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:35:15.044615 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043322 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:35:15.044615 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043324 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:35:15.044615 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043327 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:35:15.044615 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043330 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:35:15.044615 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043332 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:35:15.045115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043335 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:35:15.045115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043338 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:35:15.045115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043341 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:35:15.045115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043344 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:35:15.045115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043348 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:35:15.045115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043351 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:35:15.045115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043354 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:35:15.045115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043357 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:35:15.045115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043360 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:35:15.045115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043362 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:35:15.045115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043365 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:35:15.045115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043367 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:35:15.045115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043371 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 16:35:15.045115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043375 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:35:15.045115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043378 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:35:15.045115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043381 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:35:15.045115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043384 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:35:15.045115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043387 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:35:15.045115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043390 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:35:15.045619 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043393 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 16:35:15.045619 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043396 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:35:15.045619 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043398 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:35:15.045619 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043401 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:35:15.045619 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043403 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:35:15.045619 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043406 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:35:15.045619 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043409 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:35:15.045619 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043411 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:35:15.045619 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043414 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:35:15.045619 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043416 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:35:15.045619 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043419 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:35:15.045619 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043421 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:35:15.045619 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043424 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:35:15.045619 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043427 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:35:15.045619 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043429 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:35:15.045619 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043432 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:35:15.045619 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043435 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:35:15.045619 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043437 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:35:15.045619 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043440 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:35:15.045619 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043443 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:35:15.046115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043445 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:35:15.046115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043448 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:35:15.046115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043450 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:35:15.046115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043453 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:35:15.046115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043456 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:35:15.046115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043458 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:35:15.046115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043461 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:35:15.046115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043463 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:35:15.046115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043466 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:35:15.046115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043468 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:35:15.046115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043471 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:35:15.046115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043474 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:35:15.046115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043476 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:35:15.046115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043479 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:35:15.046115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043481 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:35:15.046115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043484 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:35:15.046115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043486 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:35:15.046115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043489 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:35:15.046115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043492 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:35:15.046115 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043494 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:35:15.046604 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043496 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:35:15.046604 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043499 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:35:15.046604 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043502 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:35:15.046604 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043504 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:35:15.046604 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043507 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:35:15.046604 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043509 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:35:15.046604 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043512 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:35:15.046604 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043515 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:35:15.046604 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043518 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:35:15.046604 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043521 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:35:15.046604 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043523 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:35:15.046604 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043526 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:35:15.046604 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.043530 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
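The same set of unrecognized-gate warnings appears in multiple passes (klog timestamps .0426xx, .0432xx, and again at .0451xx after the FLAG dump below), so the gate map is evidently parsed more than once during startup. The names are OpenShift cluster-level feature gates written into the kubelet's featureGates config; the kubelet's own gate table only contains upstream Kubernetes gates, and this build logs a warning per unknown name rather than failing. A hedged sketch of how such a map could look in /etc/kubernetes/kubelet.conf, where the upstream names and values are taken from the accepted feature-gates line at the end of this log, while the values for the OpenShift-only names are placeholders:

    featureGates:
      # upstream Kubernetes gates: recognized and applied (see the
      # feature_gate.go:384 "feature gates:" line at the end of the log)
      NodeSwap: false
      ImageVolume: true
      ServiceAccountTokenNodeBinding: true  # accepted, but logged as already-GA
      KMSv1: true                           # accepted, but logged as deprecated
      # OpenShift-only gates: absent from the kubelet's gate table, so each
      # produces one "feature_gate.go:328] unrecognized feature gate" warning
      # per parse pass (placeholder values, not from this log)
      GatewayAPI: true
      PinnedImages: true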
Apr 23 16:35:15.046604 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044538 2569 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 16:35:15.046604 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044548 2569 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 16:35:15.046604 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044556 2569 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 16:35:15.046604 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044561 2569 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 16:35:15.046604 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044566 2569 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 16:35:15.046604 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044569 2569 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 16:35:15.046604 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044574 2569 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 16:35:15.046604 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044579 2569 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 16:35:15.047134 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044582 2569 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 16:35:15.047134 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044585 2569 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 16:35:15.047134 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044588 2569 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 16:35:15.047134 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044592 2569 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 16:35:15.047134 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044595 2569 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 16:35:15.047134 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044598 2569 flags.go:64] FLAG: --cgroup-root=""
Apr 23 16:35:15.047134 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044601 2569 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 16:35:15.047134 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044604 2569 flags.go:64] FLAG: --client-ca-file=""
Apr 23 16:35:15.047134 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044607 2569 flags.go:64] FLAG: --cloud-config=""
Apr 23 16:35:15.047134 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044610 2569 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 16:35:15.047134 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044613 2569 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 16:35:15.047134 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044617 2569 flags.go:64] FLAG: --cluster-domain=""
Apr 23 16:35:15.047134 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044620 2569 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 16:35:15.047134 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044623 2569 flags.go:64] FLAG: --config-dir=""
Apr 23 16:35:15.047134 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044626 2569 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 16:35:15.047134 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044630 2569 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 16:35:15.047134 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044634 2569 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 16:35:15.047134 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044640 2569 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 16:35:15.047134 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044643 2569 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 16:35:15.047134 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044646 2569 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 16:35:15.047134 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044650 2569 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 16:35:15.047134 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044653 2569 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 16:35:15.047134 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044671 2569 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 16:35:15.047134 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044676 2569 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 16:35:15.047134 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044679 2569 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 16:35:15.047734 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044683 2569 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 16:35:15.047734 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044686 2569 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 16:35:15.047734 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044689 2569 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 16:35:15.047734 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044693 2569 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 16:35:15.047734 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044696 2569 flags.go:64] FLAG: --enable-server="true"
Apr 23 16:35:15.047734 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044699 2569 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 16:35:15.047734 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044704 2569 flags.go:64] FLAG: --event-burst="100"
Apr 23 16:35:15.047734 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044707 2569 flags.go:64] FLAG: --event-qps="50"
Apr 23 16:35:15.047734 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044710 2569 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 16:35:15.047734 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044713 2569 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 16:35:15.047734 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044716 2569 flags.go:64] FLAG: --eviction-hard=""
Apr 23 16:35:15.047734 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044720 2569 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 16:35:15.047734 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044723 2569 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 16:35:15.047734 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044726 2569 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 16:35:15.047734 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044729 2569 flags.go:64] FLAG: --eviction-soft=""
Apr 23 16:35:15.047734 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044732 2569 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 16:35:15.047734 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044735 2569 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 16:35:15.047734 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044738 2569 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 16:35:15.047734 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044741 2569 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 16:35:15.047734 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044744 2569 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 16:35:15.047734 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044747 2569 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 16:35:15.047734 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044750 2569 flags.go:64] FLAG: --feature-gates=""
Apr 23 16:35:15.047734 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044754 2569 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 23 16:35:15.047734 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044757 2569 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 23 16:35:15.047734 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044761 2569 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 23 16:35:15.048333 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044765 2569 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 23 16:35:15.048333 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044768 2569 flags.go:64] FLAG: --healthz-port="10248"
Apr 23 16:35:15.048333 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044771 2569 flags.go:64] FLAG: --help="false"
Apr 23 16:35:15.048333 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044774 2569 flags.go:64] FLAG: --hostname-override="ip-10-0-128-102.ec2.internal"
Apr 23 16:35:15.048333 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044777 2569 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 23 16:35:15.048333 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044780 2569 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 23 16:35:15.048333 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044783 2569 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 23 16:35:15.048333 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044787 2569 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 23 16:35:15.048333 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044791 2569 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 23 16:35:15.048333 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044794 2569 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 23 16:35:15.048333 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044797 2569 flags.go:64] FLAG: --image-service-endpoint=""
Apr 23 16:35:15.048333 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044800 2569 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 23 16:35:15.048333 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044803 2569 flags.go:64] FLAG: --kube-api-burst="100"
Apr 23 16:35:15.048333 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044806 2569 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 23 16:35:15.048333 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044809 2569 flags.go:64] FLAG: --kube-api-qps="50"
Apr 23 16:35:15.048333 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044812 2569 flags.go:64] FLAG: --kube-reserved=""
Apr 23 16:35:15.048333 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044815 2569 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 23 16:35:15.048333 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044818 2569 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 23 16:35:15.048333 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044821 2569 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 23 16:35:15.048333 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044824 2569 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 23 16:35:15.048333 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044826 2569 flags.go:64] FLAG: --lock-file=""
Apr 23 16:35:15.048333 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044829 2569 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 23 16:35:15.048333 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044832 2569 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 23 16:35:15.048333 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044835 2569 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 23 16:35:15.048924 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044840 2569 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 23 16:35:15.048924 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044843 2569 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 23 16:35:15.048924 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044846 2569 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 23 16:35:15.048924 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044849 2569 flags.go:64] FLAG: --logging-format="text"
Apr 23 16:35:15.048924 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044852 2569 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 23 16:35:15.048924 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044855 2569 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 23 16:35:15.048924 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044858 2569 flags.go:64] FLAG: --manifest-url=""
Apr 23 16:35:15.048924 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044862 2569 flags.go:64] FLAG: --manifest-url-header=""
Apr 23 16:35:15.048924 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044866 2569 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 23 16:35:15.048924 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044869 2569 flags.go:64] FLAG: --max-open-files="1000000"
Apr 23 16:35:15.048924 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044874 2569 flags.go:64] FLAG: --max-pods="110"
Apr 23 16:35:15.048924 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044877 2569 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 23 16:35:15.048924 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044880 2569 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 23 16:35:15.048924 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044883 2569 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 23 16:35:15.048924 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044886 2569 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 23 16:35:15.048924 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044890 2569 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 23 16:35:15.048924 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044893 2569 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 23 16:35:15.048924 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044896 2569 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 23 16:35:15.048924 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044904 2569 flags.go:64] FLAG: --node-status-max-images="50"
Apr 23 16:35:15.048924 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044908 2569 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 23 16:35:15.048924 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044911 2569 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 23 16:35:15.048924 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044914 2569 flags.go:64] FLAG: --pod-cidr=""
Apr 23 16:35:15.048924 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044917 2569 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 23 16:35:15.049548 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044923 2569 flags.go:64] FLAG: --pod-manifest-path=""
Apr 23 16:35:15.049548 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044926 2569 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 23 16:35:15.049548 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044929 2569 flags.go:64] FLAG: --pods-per-core="0"
Apr 23 16:35:15.049548 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044932 2569 flags.go:64] FLAG: --port="10250"
Apr 23 16:35:15.049548 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044936 2569 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 23 16:35:15.049548 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044939 2569 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-08a4c839d565df00f"
Apr 23 16:35:15.049548 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044942 2569 flags.go:64] FLAG: --qos-reserved=""
Apr 23 16:35:15.049548 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044945 2569 flags.go:64] FLAG: --read-only-port="10255"
Apr 23 16:35:15.049548 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044948 2569 flags.go:64] FLAG: --register-node="true"
Apr 23 16:35:15.049548 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044951 2569 flags.go:64] FLAG: --register-schedulable="true"
Apr 23 16:35:15.049548 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044954 2569 flags.go:64] FLAG: --register-with-taints=""
Apr 23 16:35:15.049548 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044957 2569 flags.go:64] FLAG: --registry-burst="10"
Apr 23 16:35:15.049548 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044960 2569 flags.go:64] FLAG: --registry-qps="5"
Apr 23 16:35:15.049548 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044963 2569 flags.go:64] FLAG: --reserved-cpus=""
Apr 23 16:35:15.049548 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044966 2569 flags.go:64] FLAG: --reserved-memory=""
Apr 23 16:35:15.049548 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044969 2569 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 23 16:35:15.049548 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044974 2569 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 23 16:35:15.049548 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044977 2569 flags.go:64] FLAG: --rotate-certificates="false"
Apr 23 16:35:15.049548 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044980 2569 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 23 16:35:15.049548 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044983 2569 flags.go:64] FLAG: --runonce="false"
Apr 23 16:35:15.049548 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044986 2569 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 23 16:35:15.049548 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044989 2569 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 23 16:35:15.049548 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044992 2569 flags.go:64] FLAG: --seccomp-default="false"
Apr 23 16:35:15.049548 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044996 2569 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 23 16:35:15.049548 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.044998 2569 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 23 16:35:15.049548 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.045006 2569 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 23 16:35:15.050222 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.045009 2569 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 23 16:35:15.050222 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.045013 2569 flags.go:64] FLAG: --storage-driver-password="root"
Apr 23 16:35:15.050222 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.045016 2569 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 23 16:35:15.050222 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.045019 2569 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 23 16:35:15.050222 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.045022 2569 flags.go:64] FLAG: --storage-driver-user="root"
Apr 23 16:35:15.050222 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.045025 2569 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 23 16:35:15.050222 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.045028 2569 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 23 16:35:15.050222 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.045031 2569 flags.go:64] FLAG: --system-cgroups=""
Apr 23 16:35:15.050222 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.045034 2569 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 23 16:35:15.050222 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.045040 2569 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 23 16:35:15.050222 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.045043 2569 flags.go:64] FLAG: --tls-cert-file=""
Apr 23 16:35:15.050222 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.045045 2569 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 23 16:35:15.050222 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.045049 2569 flags.go:64] FLAG: --tls-min-version=""
Apr 23 16:35:15.050222 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.045052 2569 flags.go:64] FLAG: --tls-private-key-file=""
Apr 23 16:35:15.050222 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.045055 2569 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 23 16:35:15.050222 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.045057 2569 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 23 16:35:15.050222 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.045060 2569 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 23 16:35:15.050222 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.045063 2569 flags.go:64] FLAG: --v="2"
Apr 23 16:35:15.050222 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.045068 2569 flags.go:64] FLAG: --version="false"
Apr 23 16:35:15.050222 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.045072 2569 flags.go:64] FLAG: --vmodule=""
Apr 23 16:35:15.050222 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.045076 2569 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 23 16:35:15.050222 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.045079 2569 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 23 16:35:15.050222 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045175 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:35:15.050222 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045178 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:35:15.050849 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045181 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:35:15.050849 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045186 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 16:35:15.050849 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045189 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:35:15.050849 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045193 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:35:15.050849 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045196 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:35:15.050849 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045199 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:35:15.050849 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045202 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:35:15.050849 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045205 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:35:15.050849 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045208 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:35:15.050849 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045211 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:35:15.050849 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045214 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:35:15.050849 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045217 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:35:15.050849 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045219 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:35:15.050849 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045222 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:35:15.050849 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045225 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:35:15.050849 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045227 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:35:15.050849 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045230 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:35:15.050849 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045233 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:35:15.050849 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045235 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:35:15.051487 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045238 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:35:15.051487 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045241 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:35:15.051487 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045243 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:35:15.051487 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045247 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:35:15.051487 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045250 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:35:15.051487 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045253 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:35:15.051487 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045255 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:35:15.051487 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045258 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:35:15.051487 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045260 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:35:15.051487 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045263 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:35:15.051487 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045265 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:35:15.051487 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045268 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:35:15.051487 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045271 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:35:15.051487 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045273 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:35:15.051487 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045276 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:35:15.051487 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045278 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:35:15.051487 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045281 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:35:15.051487 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045283 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:35:15.051487 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045286 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:35:15.052319 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045288 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:35:15.052319 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045291 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:35:15.052319 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045297 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:35:15.052319 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045301 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:35:15.052319 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045303 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:35:15.052319 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045306 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:35:15.052319 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045309 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:35:15.052319 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045311 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:35:15.052319 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045314 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:35:15.052319 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045317 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:35:15.052319 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045320 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:35:15.052319 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045322 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:35:15.052319 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045325 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:35:15.052319 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045328 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:35:15.052319 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045331 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:35:15.052319 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045333 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:35:15.052319 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045336 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:35:15.052319 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045339 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:35:15.052319 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045341 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:35:15.052319 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045344 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:35:15.053202 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045346 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:35:15.053202 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045349 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 16:35:15.053202 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045351 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:35:15.053202 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045354 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:35:15.053202 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045357 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:35:15.053202 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045359 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:35:15.053202 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045362 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:35:15.053202 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045364 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:35:15.053202 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045366 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:35:15.053202 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045369 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:35:15.053202 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045372 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:35:15.053202 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045374 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:35:15.053202 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045377 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:35:15.053202 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045379 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:35:15.053202 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045383 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:35:15.053202 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045385 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:35:15.053202 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045402 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:35:15.053202 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045405 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:35:15.053202 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045407 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:35:15.053202 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045410 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:35:15.054091 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045413 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:35:15.054091 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045415 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:35:15.054091 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045418 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:35:15.054091 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045420 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:35:15.054091 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045423 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:35:15.054091 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.045426 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:35:15.054091 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.046150 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true
Apr 23 16:35:15.054091 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.053274 2569 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 23 16:35:15.054091 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.053294 2569 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 23 16:35:15.054091 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053364 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:35:15.054091 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053372 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:35:15.054091 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053378 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:35:15.054091 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053383 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:35:15.054091 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053388 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:35:15.054091 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053393 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:35:15.054091 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053397 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:35:15.054753 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053403 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:35:15.054753 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053407 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:35:15.054753 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053411 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:35:15.054753 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053416 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:35:15.054753 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053420 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:35:15.054753 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053424 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:35:15.054753 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053429 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:35:15.054753 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053433 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:35:15.054753 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053438 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:35:15.054753 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053442 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:35:15.054753 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053447 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:35:15.054753 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053451 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:35:15.054753 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053456 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:35:15.054753 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053460 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:35:15.054753 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053464 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:35:15.054753 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053468 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:35:15.054753 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053473 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:35:15.054753 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053477 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:35:15.054753 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053481 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:35:15.055224 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053488 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 16:35:15.055224 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053494 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:35:15.055224 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053500 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:35:15.055224 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053505 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:35:15.055224 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053510 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:35:15.055224 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053517 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:35:15.055224 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053523 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:35:15.055224 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053528 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:35:15.055224 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053533 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:35:15.055224 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053538 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:35:15.055224 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053542 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:35:15.055224 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053546 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:35:15.055224 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053550 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:35:15.055224 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053555 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:35:15.055224 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053559 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:35:15.055224 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053563 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:35:15.055224 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053567 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:35:15.055224 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053571 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:35:15.055224 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053576 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:35:15.055839 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053581 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:35:15.055839 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053585 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:35:15.055839 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053589 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:35:15.055839 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053593 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:35:15.055839 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053597 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:35:15.055839 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053602 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:35:15.055839 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053606 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:35:15.055839 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053610 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:35:15.055839 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053614 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:35:15.055839 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053618 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:35:15.055839 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053623 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:35:15.055839 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053627 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:35:15.055839 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053631 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:35:15.055839 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053635 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:35:15.055839 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053639 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:35:15.055839 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053644 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:35:15.055839 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053648 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:35:15.055839 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053652 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:35:15.055839 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053677 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:35:15.055839 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053683 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:35:15.056616 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053687 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:35:15.056616 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053691 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:35:15.056616 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053695 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:35:15.056616 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053700 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:35:15.056616 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053704 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 16:35:15.056616 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053708 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:35:15.056616 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053711 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:35:15.056616 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053716 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:35:15.056616 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053722 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:35:15.056616 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053728 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:35:15.056616 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053732 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:35:15.056616 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053737 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:35:15.056616 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053741 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:35:15.056616 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053745 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:35:15.056616 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053749 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:35:15.056616 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053753 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:35:15.056616 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053757 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:35:15.056616 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053761 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:35:15.056616 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053765 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:35:15.057432 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053769 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:35:15.057432 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053773 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:35:15.057432 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.053782 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 16:35:15.057432 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053953 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:35:15.057432 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053962 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:35:15.057432 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053967 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:35:15.057432 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053972 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 16:35:15.057432 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053976 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:35:15.057432 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053981 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:35:15.057432 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053985 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:35:15.057432 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053990 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:35:15.057432 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053994 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:35:15.057432 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.053999 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:35:15.057432 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054004 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:35:15.057432 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054008 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:35:15.057432 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054013 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:35:15.057987 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054017 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:35:15.057987 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054022 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:35:15.057987 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054026 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:35:15.057987 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054030 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:35:15.057987 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054036 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:35:15.057987 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054040 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:35:15.057987 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054044 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:35:15.057987 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054048 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:35:15.057987 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054052 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:35:15.057987 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054057 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:35:15.057987 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054061 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:35:15.057987 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054065 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:35:15.057987 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054069 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:35:15.057987 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054074 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:35:15.057987 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054078 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:35:15.057987 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054083 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:35:15.057987 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054087 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:35:15.057987 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054091 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:35:15.057987 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054095 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:35:15.057987 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054099 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:35:15.058510 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054103 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:35:15.058510 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054107 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:35:15.058510 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054111 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:35:15.058510 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054115 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:35:15.058510 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054119 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:35:15.058510 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054123 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:35:15.058510 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054130 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 16:35:15.058510 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054137 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:35:15.058510 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054142 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:35:15.058510 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054147 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:35:15.058510 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054153 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:35:15.058510 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054157 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:35:15.058510 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054162 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:35:15.058510 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054166 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:35:15.058510 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054171 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:35:15.058510 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054175 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:35:15.058510 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054179 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:35:15.058510 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054183 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:35:15.058510 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054188 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:35:15.058510 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054192 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:35:15.059050 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054196 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:35:15.059050 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054200 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:35:15.059050 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054204 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:35:15.059050 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054210 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:35:15.059050 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054215 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:35:15.059050 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054219 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:35:15.059050 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054223 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:35:15.059050 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054227 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:35:15.059050 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054232 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:35:15.059050 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054236 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:35:15.059050 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054240 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:35:15.059050 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054244 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:35:15.059050 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054248 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:35:15.059050 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054252 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:35:15.059050 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054257 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:35:15.059050 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054261 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:35:15.059050 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054265 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:35:15.059050 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054270 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:35:15.059050 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054274 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:35:15.059618 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054278 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:35:15.059618 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054282 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:35:15.059618 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054286 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:35:15.059618 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054290 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:35:15.059618 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054296 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:35:15.059618 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054300 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:35:15.059618 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054304 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:35:15.059618 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054308 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:35:15.059618 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054313 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:35:15.059618 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054317 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:35:15.059618 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054321 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:35:15.059618 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054325 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:35:15.059618 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054329 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:35:15.059618 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:15.054334 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:35:15.059618 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.054343 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 16:35:15.059618 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.055335 2569 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 23 16:35:15.060076 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.059069 2569 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 23 16:35:15.060245 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.060230 2569 server.go:1019] "Starting client certificate rotation"
Apr 23 16:35:15.060347 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.060331 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 16:35:15.060919 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.060908 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 16:35:15.091723 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.091704 2569 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 16:35:15.097303 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.097173 2569 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 16:35:15.111877 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.111855 2569 log.go:25] "Validated CRI v1 runtime API"
Apr 23 16:35:15.117981 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.117967 2569 log.go:25] "Validated CRI v1 image API"
Apr 23 16:35:15.119160 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.119144 2569 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 23 16:35:15.126271 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.126245 2569 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 7df1b3f0-7388-4415-a223-33d6dabc34f5:/dev/nvme0n1p4 bf8f5068-6c03-4655-ab1b-4da240ab984b:/dev/nvme0n1p3]
Apr 23 16:35:15.126338 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.126267 2569 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 23 16:35:15.126827 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.126811 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 16:35:15.132346 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.132235 2569 manager.go:217] Machine: {Timestamp:2026-04-23 16:35:15.129991122 +0000 UTC m=+0.467254078 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099111 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec23ca254293cee191d05680660154c1 SystemUUID:ec23ca25-4293-cee1-91d0-5680660154c1 BootID:1756be82-1be8-42cf-89b5-979353f98480 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:8f:1e:09:29:13 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:8f:1e:09:29:13 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:d2:4a:44:54:44:d1 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 16:35:15.132346 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.132337 2569 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 23 16:35:15.132465 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.132419 2569 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 16:35:15.133598 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.133573 2569 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 16:35:15.133762 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.133600 2569 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-102.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 23 16:35:15.133805 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.133770 2569 topology_manager.go:138] "Creating topology manager with none policy"
Apr 23 16:35:15.133805 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.133779 2569 container_manager_linux.go:306] "Creating device plugin manager"
Apr 23 16:35:15.133805 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.133792 2569 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 16:35:15.134521 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.134511 2569 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 16:35:15.135745 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.135735 2569 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 16:35:15.135865 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.135856 2569 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 23 16:35:15.138725 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.138716 2569 kubelet.go:491] "Attempting to sync node with API server"
Apr 23 16:35:15.138766 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.138734 2569 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 23 16:35:15.138766 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.138746 2569 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 23 16:35:15.138766 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.138755 2569 kubelet.go:397] "Adding apiserver pod source"
Apr 23 16:35:15.138766 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.138763 2569 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 23 16:35:15.140004 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.139986 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 16:35:15.140091 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.140015 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 16:35:15.143605 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.143588 2569 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 23 16:35:15.145876 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.145860 2569 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 23 16:35:15.147627 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.147609 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 23 16:35:15.147705 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.147674 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 23 16:35:15.147705 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.147682 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 23 16:35:15.147705 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.147687 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 23 16:35:15.147705 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.147693 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 23 16:35:15.147705 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.147698 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 23 16:35:15.147705 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.147704 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 23 16:35:15.147889 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.147709 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 23 16:35:15.147889 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.147717 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 23 16:35:15.147889 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.147723 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 23 16:35:15.147889 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.147736 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 23 16:35:15.147889 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.147744 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 23 16:35:15.148771 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.148762 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 23 16:35:15.148771 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.148771 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 23 16:35:15.152182 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.152168 2569 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 23 16:35:15.152261 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.152202 2569 server.go:1295] "Started kubelet"
Apr 23 16:35:15.152319 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.152293 2569 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 23 16:35:15.152411 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.152358 2569 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 23 16:35:15.152463 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.152448 2569 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 23 16:35:15.153217 ip-10-0-128-102 systemd[1]: Started Kubernetes Kubelet.
Apr 23 16:35:15.159775 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.159751 2569 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 23 16:35:15.160211 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.160190 2569 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-102.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 16:35:15.160698 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:15.160649 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 23 16:35:15.160839 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:15.160796 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-102.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 23 16:35:15.161346 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.161328 2569 server.go:317] "Adding debug handlers to kubelet server"
Apr 23 16:35:15.164976 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:15.164956 2569 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 23 16:35:15.168789 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.168771 2569 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 23 16:35:15.168883 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.168791 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 23 16:35:15.169545 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.169528 2569 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 23 16:35:15.169616 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.169556 2569 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 23 16:35:15.169616 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.169527 2569 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 23 16:35:15.169741 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:15.168584 2569 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-102.ec2.internal.18a909a104f8f2aa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-102.ec2.internal,UID:ip-10-0-128-102.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-128-102.ec2.internal,},FirstTimestamp:2026-04-23 16:35:15.152179882 +0000 UTC m=+0.489442837,LastTimestamp:2026-04-23 16:35:15.152179882 +0000 UTC m=+0.489442837,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-102.ec2.internal,}"
Apr 23 16:35:15.169810 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:15.169787 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-102.ec2.internal\" not found"
Apr 23 16:35:15.170145 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.170127 2569 factory.go:55] Registering systemd factory
Apr 23 16:35:15.170249 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.170204 2569 factory.go:223] Registration of the systemd container factory successfully
Apr 23 16:35:15.170539 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.170520 2569 reconstruct.go:97] "Volume reconstruction finished"
Apr 23 16:35:15.170539 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.170537 2569 reconciler.go:26] "Reconciler: start to sync state"
Apr 23 16:35:15.170792 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.170716 2569 factory.go:153] Registering CRI-O factory
Apr 23 16:35:15.170792 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.170728 2569 factory.go:223] Registration of the crio container factory successfully
Apr 23 16:35:15.170792 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.170777 2569 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 23 16:35:15.170892 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.170807 2569 factory.go:103] Registering Raw factory
Apr 23 16:35:15.170892 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.170817 2569 manager.go:1196] Started watching for new ooms in manager
Apr 23 16:35:15.171260 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.171247 2569 manager.go:319] Starting recovery of all containers
Apr 23 16:35:15.171870 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:15.171840 2569 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-128-102.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 23 16:35:15.172045 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:15.172023 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 23 16:35:15.181527 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.181514 2569 manager.go:324] Recovery completed
Apr 23 16:35:15.184221 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.184203 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-tfjx8"
Apr 23 16:35:15.185386 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.185374 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:35:15.187480 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.187463 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-102.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 16:35:15.187571 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.187490 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-102.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 16:35:15.187571 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.187502 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-102.ec2.internal" event="NodeHasSufficientPID"
Apr 23 16:35:15.187994 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.187978 2569 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 23 16:35:15.187994 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.187993 2569 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 23 16:35:15.188102 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.188010 2569 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 16:35:15.189821 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:15.189744 2569 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-102.ec2.internal.18a909a107138a14 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-102.ec2.internal,UID:ip-10-0-128-102.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-128-102.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-128-102.ec2.internal,},FirstTimestamp:2026-04-23 16:35:15.187477012 +0000 UTC m=+0.524739967,LastTimestamp:2026-04-23 16:35:15.187477012 +0000 UTC m=+0.524739967,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-102.ec2.internal,}"
Apr 23 16:35:15.190487 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.190475 2569 policy_none.go:49] "None policy: Start"
Apr 23 16:35:15.190530 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.190496 2569 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 23 16:35:15.190530 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.190506 2569 state_mem.go:35] "Initializing new in-memory state store"
Apr 23 16:35:15.191214 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.191197 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-tfjx8"
Apr 23 16:35:15.221212 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.221195 2569 manager.go:341] "Starting Device Plugin manager"
Apr 23 16:35:15.235270 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:15.221245 2569 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 23 16:35:15.235270 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.221259 2569 server.go:85] "Starting device plugin registration server"
Apr 23 16:35:15.235270 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.221493 2569 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 23 16:35:15.235270 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.221505 2569 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 23 16:35:15.235270 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.221584 2569 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 23 16:35:15.235270 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.221652 2569 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 23 16:35:15.235270 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.221678 2569 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 23 16:35:15.235270 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:15.222404 2569 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 23 16:35:15.235270 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:15.222432 2569 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-102.ec2.internal\" not found"
Apr 23 16:35:15.297768 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.297709 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 23 16:35:15.298957 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.298930 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 23 16:35:15.298957 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.298961 2569 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 23 16:35:15.299102 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.298984 2569 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 23 16:35:15.299102 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.298993 2569 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 23 16:35:15.299102 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:15.299037 2569 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 23 16:35:15.303503 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.303482 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 16:35:15.322559 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.322539 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:35:15.323528 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.323513 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-102.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 16:35:15.323614 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.323545 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-102.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 16:35:15.323614 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.323560 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-102.ec2.internal" event="NodeHasSufficientPID"
Apr 23 16:35:15.323614 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.323589 2569 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-102.ec2.internal"
Apr 23 16:35:15.334794 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.334772 2569 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-102.ec2.internal"
Apr 23 16:35:15.334794 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:15.334793 2569 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-102.ec2.internal\": node \"ip-10-0-128-102.ec2.internal\" not found"
Apr 23 16:35:15.356845 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:15.356823 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-102.ec2.internal\" not found"
Apr 23 16:35:15.399877 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.399844 2569 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-102.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-128-102.ec2.internal"]
Apr 23 16:35:15.399982 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.399912 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:35:15.400753 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.400737 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-102.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 16:35:15.400826 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.400769 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-102.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 16:35:15.400826 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.400779 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-102.ec2.internal" event="NodeHasSufficientPID"
Apr 23 16:35:15.401853 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.401841 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:35:15.401990 ip-10-0-128-102
kubenswrapper[2569]: I0423 16:35:15.401977 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-102.ec2.internal" Apr 23 16:35:15.402055 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.402006 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 16:35:15.402496 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.402482 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-102.ec2.internal" event="NodeHasSufficientMemory" Apr 23 16:35:15.402496 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.402490 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-102.ec2.internal" event="NodeHasSufficientMemory" Apr 23 16:35:15.402600 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.402502 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-102.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 16:35:15.402600 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.402514 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-102.ec2.internal" event="NodeHasSufficientPID" Apr 23 16:35:15.402600 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.402514 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-102.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 16:35:15.402600 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.402560 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-102.ec2.internal" event="NodeHasSufficientPID" Apr 23 16:35:15.403732 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.403715 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-102.ec2.internal" Apr 23 16:35:15.403812 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.403741 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 16:35:15.404440 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.404425 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-102.ec2.internal" event="NodeHasSufficientMemory" Apr 23 16:35:15.404506 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.404455 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-102.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 16:35:15.404506 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.404468 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-102.ec2.internal" event="NodeHasSufficientPID" Apr 23 16:35:15.428209 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:15.428190 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-102.ec2.internal\" not found" node="ip-10-0-128-102.ec2.internal" Apr 23 16:35:15.432567 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:15.432549 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-102.ec2.internal\" not found" node="ip-10-0-128-102.ec2.internal" Apr 23 16:35:15.457001 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:15.456979 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-102.ec2.internal\" not found" Apr 23 16:35:15.471590 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.471566 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/05adcf3a2d254ff254dca89fd1c3545d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-102.ec2.internal\" (UID: \"05adcf3a2d254ff254dca89fd1c3545d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-102.ec2.internal" Apr 23 16:35:15.471693 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.471594 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/05adcf3a2d254ff254dca89fd1c3545d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-102.ec2.internal\" (UID: \"05adcf3a2d254ff254dca89fd1c3545d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-102.ec2.internal" Apr 23 16:35:15.471693 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.471612 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a0c005bc2e76bc3454363e5204d0f408-config\") pod \"kube-apiserver-proxy-ip-10-0-128-102.ec2.internal\" (UID: \"a0c005bc2e76bc3454363e5204d0f408\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-102.ec2.internal" Apr 23 16:35:15.557371 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:15.557302 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-102.ec2.internal\" not found" Apr 23 16:35:15.572775 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.572753 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/05adcf3a2d254ff254dca89fd1c3545d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-102.ec2.internal\" (UID: \"05adcf3a2d254ff254dca89fd1c3545d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-102.ec2.internal" Apr 23 16:35:15.572830 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.572782 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/05adcf3a2d254ff254dca89fd1c3545d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-102.ec2.internal\" (UID: \"05adcf3a2d254ff254dca89fd1c3545d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-102.ec2.internal" Apr 23 16:35:15.572830 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.572799 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a0c005bc2e76bc3454363e5204d0f408-config\") pod \"kube-apiserver-proxy-ip-10-0-128-102.ec2.internal\" (UID: \"a0c005bc2e76bc3454363e5204d0f408\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-102.ec2.internal" Apr 23 16:35:15.572895 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.572860 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a0c005bc2e76bc3454363e5204d0f408-config\") pod \"kube-apiserver-proxy-ip-10-0-128-102.ec2.internal\" (UID: \"a0c005bc2e76bc3454363e5204d0f408\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-102.ec2.internal" Apr 23 16:35:15.572895 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.572881 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/05adcf3a2d254ff254dca89fd1c3545d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-102.ec2.internal\" (UID: \"05adcf3a2d254ff254dca89fd1c3545d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-102.ec2.internal" Apr 23 16:35:15.572954 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.572860 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/05adcf3a2d254ff254dca89fd1c3545d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-102.ec2.internal\" (UID: \"05adcf3a2d254ff254dca89fd1c3545d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-102.ec2.internal" Apr 23 16:35:15.658215 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:15.658174 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-102.ec2.internal\" not found" Apr 23 16:35:15.730768 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.730724 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-102.ec2.internal" Apr 23 16:35:15.735302 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:15.735284 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-102.ec2.internal" Apr 23 16:35:15.758982 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:15.758956 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-102.ec2.internal\" not found" Apr 23 16:35:15.859565 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:15.859538 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-102.ec2.internal\" not found" Apr 23 16:35:15.960030 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:15.960004 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-102.ec2.internal\" not found" Apr 23 16:35:16.059704 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.059672 2569 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 23 16:35:16.060335 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.059833 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 23 16:35:16.060754 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:16.060735 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-102.ec2.internal\" not found" Apr 23 16:35:16.093608 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.093581 2569 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 16:35:16.139175 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.139126 2569 apiserver.go:52] "Watching apiserver" Apr 23 16:35:16.148305 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.148287 2569 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 23 16:35:16.150172 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.150153 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jlhpn","openshift-cluster-node-tuning-operator/tuned-s4jdv","openshift-dns/node-resolver-fgwnc","openshift-image-registry/node-ca-pmktz","openshift-multus/multus-8slg2","openshift-multus/network-metrics-daemon-k7n97","openshift-network-operator/iptables-alerter-6xw6f","openshift-multus/multus-additional-cni-plugins-z8jnl","openshift-network-diagnostics/network-check-target-ktsss","openshift-ovn-kubernetes/ovnkube-node-ddr2h","kube-system/konnectivity-agent-dnws5"] Apr 23 16:35:16.152399 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.152381 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jlhpn" Apr 23 16:35:16.153213 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.153194 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-s4jdv" Apr 23 16:35:16.153322 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.153299 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-fgwnc" Apr 23 16:35:16.154547 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.154521 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-pmktz" Apr 23 16:35:16.154838 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.154812 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-hlwxr\"" Apr 23 16:35:16.154947 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.154928 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-cjg42\"" Apr 23 16:35:16.155015 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.154996 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 23 16:35:16.155304 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.155285 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 23 16:35:16.155755 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.155738 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.155914 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.155899 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 23 16:35:16.156127 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.156108 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 23 16:35:16.156584 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.156564 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 23 16:35:16.156653 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.156628 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 23 16:35:16.156858 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.156843 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 23 16:35:16.156972 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.156955 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 23 16:35:16.157171 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.157154 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-dwlr4\"" Apr 23 16:35:16.157232 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.157205 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7n97" Apr 23 16:35:16.157315 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:16.157290 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k7n97" podUID="1eabe990-f610-4e94-8a89-7cff1c9a6a23" Apr 23 16:35:16.158493 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.158475 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-6xw6f" Apr 23 16:35:16.159555 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.159540 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-z8jnl" Apr 23 16:35:16.160107 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.160091 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 23 16:35:16.160194 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.160143 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-zd6rx\"" Apr 23 16:35:16.160476 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.160461 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ktsss" Apr 23 16:35:16.160539 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:16.160514 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ktsss" podUID="1ba421e3-97e2-473e-a145-bf072f2b9393" Apr 23 16:35:16.160631 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.160595 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-vggrm\"" Apr 23 16:35:16.160690 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.160624 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 23 16:35:16.161138 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.161123 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 23 16:35:16.161506 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.161492 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 23 16:35:16.161714 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.161700 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.161781 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.161708 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 23 16:35:16.162536 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.162521 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 23 16:35:16.162881 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.162868 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-dnws5" Apr 23 16:35:16.163543 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.163513 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 23 16:35:16.163631 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.163546 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 23 16:35:16.164161 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.164144 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 23 16:35:16.164241 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.164231 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-rndf6\"" Apr 23 16:35:16.164398 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.164384 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 23 16:35:16.164840 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.164823 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 23 16:35:16.164921 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.164905 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 23 16:35:16.165413 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.165397 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 23 16:35:16.165579 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.165563 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 23 16:35:16.165954 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.165929 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 23 16:35:16.166011 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.165996 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 23 16:35:16.166180 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.166166 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 23 16:35:16.166281 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.166268 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 23 16:35:16.166545 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.166530 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-b995b\"" Apr 23 16:35:16.166838 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.166756 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-75hxj\"" Apr 23 16:35:16.166838 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.166794 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-nlksj\"" Apr 23 16:35:16.166984 
ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.166848 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 23 16:35:16.168896 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.168881 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 23 16:35:16.169265 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.169253 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-102.ec2.internal" Apr 23 16:35:16.170451 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.170434 2569 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 23 16:35:16.176775 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.176757 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svdfq\" (UniqueName: \"kubernetes.io/projected/1ba421e3-97e2-473e-a145-bf072f2b9393-kube-api-access-svdfq\") pod \"network-check-target-ktsss\" (UID: \"1ba421e3-97e2-473e-a145-bf072f2b9393\") " pod="openshift-network-diagnostics/network-check-target-ktsss" Apr 23 16:35:16.176853 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.176782 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6f9db1de-039e-440a-ad1a-5e30719153d9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jlhpn\" (UID: \"6f9db1de-039e-440a-ad1a-5e30719153d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jlhpn" Apr 23 16:35:16.176853 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.176798 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-lib-modules\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv" Apr 23 16:35:16.176853 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.176825 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-os-release\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.176956 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.176859 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqbzt\" (UniqueName: \"kubernetes.io/projected/64f2e8d8-0a24-4b00-a66e-91dd67594081-kube-api-access-rqbzt\") pod \"multus-additional-cni-plugins-z8jnl\" (UID: \"64f2e8d8-0a24-4b00-a66e-91dd67594081\") " pod="openshift-multus/multus-additional-cni-plugins-z8jnl" Apr 23 16:35:16.176956 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.176875 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a063c6b7-80d4-45d7-815d-88d94693a0b1-ovnkube-script-lib\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.176956 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.176892 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-4hcc4\" (UniqueName: \"kubernetes.io/projected/a063c6b7-80d4-45d7-815d-88d94693a0b1-kube-api-access-4hcc4\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.176956 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.176913 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-etc-sysconfig\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv" Apr 23 16:35:16.176956 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.176950 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-sys\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv" Apr 23 16:35:16.177110 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.176989 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/63340da2-b7c8-4798-a1ed-d8a80bf900b6-multus-daemon-config\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.177110 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.177035 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-etc-kubernetes\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.177110 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.177062 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-etc-modprobe-d\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv" Apr 23 16:35:16.177110 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.177076 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-var-lib-kubelet\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv" Apr 23 16:35:16.177110 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.177100 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/919d79c7-8b2d-41ad-b0ba-bf48e8815841-host\") pod \"node-ca-pmktz\" (UID: \"919d79c7-8b2d-41ad-b0ba-bf48e8815841\") " pod="openshift-image-registry/node-ca-pmktz" Apr 23 16:35:16.177298 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.177118 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54sps\" (UniqueName: \"kubernetes.io/projected/919d79c7-8b2d-41ad-b0ba-bf48e8815841-kube-api-access-54sps\") pod \"node-ca-pmktz\" (UID: \"919d79c7-8b2d-41ad-b0ba-bf48e8815841\") " 
pod="openshift-image-registry/node-ca-pmktz" Apr 23 16:35:16.177298 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.177134 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/64f2e8d8-0a24-4b00-a66e-91dd67594081-cnibin\") pod \"multus-additional-cni-plugins-z8jnl\" (UID: \"64f2e8d8-0a24-4b00-a66e-91dd67594081\") " pod="openshift-multus/multus-additional-cni-plugins-z8jnl" Apr 23 16:35:16.177298 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.177150 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.177298 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.177178 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2v5x\" (UniqueName: \"kubernetes.io/projected/6f9db1de-039e-440a-ad1a-5e30719153d9-kube-api-access-s2v5x\") pod \"aws-ebs-csi-driver-node-jlhpn\" (UID: \"6f9db1de-039e-440a-ad1a-5e30719153d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jlhpn" Apr 23 16:35:16.177298 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.177228 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-host-var-lib-cni-bin\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.177298 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.177260 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/64f2e8d8-0a24-4b00-a66e-91dd67594081-system-cni-dir\") pod \"multus-additional-cni-plugins-z8jnl\" (UID: \"64f2e8d8-0a24-4b00-a66e-91dd67594081\") " pod="openshift-multus/multus-additional-cni-plugins-z8jnl" Apr 23 16:35:16.177732 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.177718 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-host-kubelet\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.177796 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.177745 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6f9db1de-039e-440a-ad1a-5e30719153d9-etc-selinux\") pod \"aws-ebs-csi-driver-node-jlhpn\" (UID: \"6f9db1de-039e-440a-ad1a-5e30719153d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jlhpn" Apr 23 16:35:16.177796 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.177768 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-etc-sysctl-conf\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " 
pod="openshift-cluster-node-tuning-operator/tuned-s4jdv" Apr 23 16:35:16.177889 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.177791 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-etc-tuned\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv" Apr 23 16:35:16.177889 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.177821 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6f9db1de-039e-440a-ad1a-5e30719153d9-sys-fs\") pod \"aws-ebs-csi-driver-node-jlhpn\" (UID: \"6f9db1de-039e-440a-ad1a-5e30719153d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jlhpn" Apr 23 16:35:16.177889 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.177843 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-etc-systemd\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv" Apr 23 16:35:16.177889 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.177875 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-host-run-multus-certs\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.178014 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.177895 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt8h5\" (UniqueName: \"kubernetes.io/projected/2c38e0bd-62a0-43a1-b0e2-05e9ca0f084b-kube-api-access-pt8h5\") pod \"iptables-alerter-6xw6f\" (UID: \"2c38e0bd-62a0-43a1-b0e2-05e9ca0f084b\") " pod="openshift-network-operator/iptables-alerter-6xw6f" Apr 23 16:35:16.178014 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.177912 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/486e65b1-cb27-4533-8ab9-9a91c79c58b1-tmp-dir\") pod \"node-resolver-fgwnc\" (UID: \"486e65b1-cb27-4533-8ab9-9a91c79c58b1\") " pod="openshift-dns/node-resolver-fgwnc" Apr 23 16:35:16.178014 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.177928 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2d2s\" (UniqueName: \"kubernetes.io/projected/486e65b1-cb27-4533-8ab9-9a91c79c58b1-kube-api-access-t2d2s\") pod \"node-resolver-fgwnc\" (UID: \"486e65b1-cb27-4533-8ab9-9a91c79c58b1\") " pod="openshift-dns/node-resolver-fgwnc" Apr 23 16:35:16.178014 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.177943 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6f9db1de-039e-440a-ad1a-5e30719153d9-device-dir\") pod \"aws-ebs-csi-driver-node-jlhpn\" (UID: \"6f9db1de-039e-440a-ad1a-5e30719153d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jlhpn" Apr 23 16:35:16.178014 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.177957 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-etc-kubernetes\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv" Apr 23 16:35:16.178014 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.177970 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-host-var-lib-cni-multus\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.178014 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.177986 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-etc-openvswitch\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.178014 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178008 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a063c6b7-80d4-45d7-815d-88d94693a0b1-ovn-node-metrics-cert\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.178300 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178030 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6f9db1de-039e-440a-ad1a-5e30719153d9-registration-dir\") pod \"aws-ebs-csi-driver-node-jlhpn\" (UID: \"6f9db1de-039e-440a-ad1a-5e30719153d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jlhpn" Apr 23 16:35:16.178300 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178050 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-run\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv" Apr 23 16:35:16.178300 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178065 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-host-run-netns\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.178300 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178082 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/64f2e8d8-0a24-4b00-a66e-91dd67594081-tuning-conf-dir\") pod \"multus-additional-cni-plugins-z8jnl\" (UID: \"64f2e8d8-0a24-4b00-a66e-91dd67594081\") " pod="openshift-multus/multus-additional-cni-plugins-z8jnl" Apr 23 16:35:16.178300 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178104 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/64f2e8d8-0a24-4b00-a66e-91dd67594081-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-z8jnl\" (UID: \"64f2e8d8-0a24-4b00-a66e-91dd67594081\") " pod="openshift-multus/multus-additional-cni-plugins-z8jnl" Apr 23 16:35:16.178300 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178126 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-var-lib-openvswitch\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.178300 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178149 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-host-run-ovn-kubernetes\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.178300 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178168 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6f9db1de-039e-440a-ad1a-5e30719153d9-socket-dir\") pod \"aws-ebs-csi-driver-node-jlhpn\" (UID: \"6f9db1de-039e-440a-ad1a-5e30719153d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jlhpn" Apr 23 16:35:16.178300 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178183 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d626\" (UniqueName: \"kubernetes.io/projected/63340da2-b7c8-4798-a1ed-d8a80bf900b6-kube-api-access-4d626\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.178300 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178201 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2c38e0bd-62a0-43a1-b0e2-05e9ca0f084b-iptables-alerter-script\") pod \"iptables-alerter-6xw6f\" (UID: \"2c38e0bd-62a0-43a1-b0e2-05e9ca0f084b\") " pod="openshift-network-operator/iptables-alerter-6xw6f" Apr 23 16:35:16.178300 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178233 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/64f2e8d8-0a24-4b00-a66e-91dd67594081-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-z8jnl\" (UID: \"64f2e8d8-0a24-4b00-a66e-91dd67594081\") " pod="openshift-multus/multus-additional-cni-plugins-z8jnl" Apr 23 16:35:16.178300 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178262 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/486e65b1-cb27-4533-8ab9-9a91c79c58b1-hosts-file\") pod \"node-resolver-fgwnc\" (UID: \"486e65b1-cb27-4533-8ab9-9a91c79c58b1\") " pod="openshift-dns/node-resolver-fgwnc" Apr 23 16:35:16.178300 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178288 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/3d17f781-cff6-4f98-92ab-d090568476a4-agent-certs\") pod \"konnectivity-agent-dnws5\" (UID: \"3d17f781-cff6-4f98-92ab-d090568476a4\") " pod="kube-system/konnectivity-agent-dnws5" Apr 23 16:35:16.178804 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178318 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbqzz\" (UniqueName: \"kubernetes.io/projected/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-kube-api-access-tbqzz\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv" Apr 23 16:35:16.178804 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178345 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/63340da2-b7c8-4798-a1ed-d8a80bf900b6-cni-binary-copy\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.178804 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178383 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2c38e0bd-62a0-43a1-b0e2-05e9ca0f084b-host-slash\") pod \"iptables-alerter-6xw6f\" (UID: \"2c38e0bd-62a0-43a1-b0e2-05e9ca0f084b\") " pod="openshift-network-operator/iptables-alerter-6xw6f" Apr 23 16:35:16.178804 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178439 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-systemd-units\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.178804 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178468 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-run-openvswitch\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.178804 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178503 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-node-log\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.178804 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178523 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-cnibin\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.178804 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178542 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-run-systemd\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.178804 
ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178569 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-host-cni-bin\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.178804 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178588 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-tmp\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv" Apr 23 16:35:16.178804 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178608 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/919d79c7-8b2d-41ad-b0ba-bf48e8815841-serviceca\") pod \"node-ca-pmktz\" (UID: \"919d79c7-8b2d-41ad-b0ba-bf48e8815841\") " pod="openshift-image-registry/node-ca-pmktz" Apr 23 16:35:16.178804 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178620 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-hostroot\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.178804 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178649 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-host-run-netns\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.178804 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178697 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1eabe990-f610-4e94-8a89-7cff1c9a6a23-metrics-certs\") pod \"network-metrics-daemon-k7n97\" (UID: \"1eabe990-f610-4e94-8a89-7cff1c9a6a23\") " pod="openshift-multus/network-metrics-daemon-k7n97" Apr 23 16:35:16.178804 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178721 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfs4x\" (UniqueName: \"kubernetes.io/projected/1eabe990-f610-4e94-8a89-7cff1c9a6a23-kube-api-access-qfs4x\") pod \"network-metrics-daemon-k7n97\" (UID: \"1eabe990-f610-4e94-8a89-7cff1c9a6a23\") " pod="openshift-multus/network-metrics-daemon-k7n97" Apr 23 16:35:16.178804 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178742 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-multus-socket-dir-parent\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.178804 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178755 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-host-run-k8s-cni-cncf-io\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.179379 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178770 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-run-ovn\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.179379 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178823 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-log-socket\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.179379 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178851 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a063c6b7-80d4-45d7-815d-88d94693a0b1-env-overrides\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.179379 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178867 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-host-slash\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.179379 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178887 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-host-cni-netd\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.179379 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178902 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a063c6b7-80d4-45d7-815d-88d94693a0b1-ovnkube-config\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.179379 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178916 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-etc-sysctl-d\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv" Apr 23 16:35:16.179379 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178932 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-host-var-lib-kubelet\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.179379 
Apr 23 16:35:16.179379 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178953 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/64f2e8d8-0a24-4b00-a66e-91dd67594081-os-release\") pod \"multus-additional-cni-plugins-z8jnl\" (UID: \"64f2e8d8-0a24-4b00-a66e-91dd67594081\") " pod="openshift-multus/multus-additional-cni-plugins-z8jnl"
Apr 23 16:35:16.179379 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178977 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3d17f781-cff6-4f98-92ab-d090568476a4-konnectivity-ca\") pod \"konnectivity-agent-dnws5\" (UID: \"3d17f781-cff6-4f98-92ab-d090568476a4\") " pod="kube-system/konnectivity-agent-dnws5"
Apr 23 16:35:16.179379 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.178994 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-host\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv"
Apr 23 16:35:16.179379 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.179012 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-system-cni-dir\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2"
Apr 23 16:35:16.179379 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.179028 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-multus-cni-dir\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2"
Apr 23 16:35:16.179379 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.179066 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-multus-conf-dir\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2"
Apr 23 16:35:16.179379 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.179104 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/64f2e8d8-0a24-4b00-a66e-91dd67594081-cni-binary-copy\") pod \"multus-additional-cni-plugins-z8jnl\" (UID: \"64f2e8d8-0a24-4b00-a66e-91dd67594081\") " pod="openshift-multus/multus-additional-cni-plugins-z8jnl"
Apr 23 16:35:16.179892 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.179877 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 16:35:16.186352 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.186329 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-102.ec2.internal"]
Apr 23 16:35:16.187673 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.186624 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 16:35:16.187673 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.186738 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-102.ec2.internal"
Apr 23 16:35:16.193745 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.193697 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 16:30:15 +0000 UTC" deadline="2028-01-31 15:03:27.745996258 +0000 UTC"
Apr 23 16:35:16.193745 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.193745 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15550h28m11.552254763s"
Apr 23 16:35:16.199904 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.199879 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-128-102.ec2.internal"]
Apr 23 16:35:16.199981 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.199974 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 16:35:16.206485 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.206467 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-6xc7d"
Apr 23 16:35:16.215773 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.215752 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-6xc7d"
Apr 23 16:35:16.280256 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280232 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-host-var-lib-cni-bin\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2"
Apr 23 16:35:16.280418 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280261 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/64f2e8d8-0a24-4b00-a66e-91dd67594081-system-cni-dir\") pod \"multus-additional-cni-plugins-z8jnl\" (UID: \"64f2e8d8-0a24-4b00-a66e-91dd67594081\") " pod="openshift-multus/multus-additional-cni-plugins-z8jnl"
Apr 23 16:35:16.280418 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280280 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-host-kubelet\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h"
Apr 23 16:35:16.280418 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280303 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6f9db1de-039e-440a-ad1a-5e30719153d9-etc-selinux\") pod \"aws-ebs-csi-driver-node-jlhpn\" (UID: \"6f9db1de-039e-440a-ad1a-5e30719153d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jlhpn"
Apr 23 16:35:16.280418 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280326 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-etc-sysctl-conf\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv"
Apr 23 16:35:16.280418 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280326 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-host-var-lib-cni-bin\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2"
Apr 23 16:35:16.280418 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280337 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/64f2e8d8-0a24-4b00-a66e-91dd67594081-system-cni-dir\") pod \"multus-additional-cni-plugins-z8jnl\" (UID: \"64f2e8d8-0a24-4b00-a66e-91dd67594081\") " pod="openshift-multus/multus-additional-cni-plugins-z8jnl"
Apr 23 16:35:16.280418 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280348 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-etc-tuned\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv"
Apr 23 16:35:16.280418 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280362 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-host-kubelet\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h"
Apr 23 16:35:16.280418 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280373 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6f9db1de-039e-440a-ad1a-5e30719153d9-sys-fs\") pod \"aws-ebs-csi-driver-node-jlhpn\" (UID: \"6f9db1de-039e-440a-ad1a-5e30719153d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jlhpn"
Apr 23 16:35:16.280418 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280399 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-etc-systemd\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv"
Apr 23 16:35:16.280418 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280399 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6f9db1de-039e-440a-ad1a-5e30719153d9-etc-selinux\") pod \"aws-ebs-csi-driver-node-jlhpn\" (UID: \"6f9db1de-039e-440a-ad1a-5e30719153d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jlhpn"
Apr 23 16:35:16.280862 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280443 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6f9db1de-039e-440a-ad1a-5e30719153d9-sys-fs\") pod \"aws-ebs-csi-driver-node-jlhpn\" (UID: \"6f9db1de-039e-440a-ad1a-5e30719153d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jlhpn"
Apr 23 16:35:16.280862 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280464 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-etc-systemd\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv"
Apr 23 16:35:16.280862 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280471 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-host-run-multus-certs\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2"
Apr 23 16:35:16.280862 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280480 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-etc-sysctl-conf\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv"
Apr 23 16:35:16.280862 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280495 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pt8h5\" (UniqueName: \"kubernetes.io/projected/2c38e0bd-62a0-43a1-b0e2-05e9ca0f084b-kube-api-access-pt8h5\") pod \"iptables-alerter-6xw6f\" (UID: \"2c38e0bd-62a0-43a1-b0e2-05e9ca0f084b\") " pod="openshift-network-operator/iptables-alerter-6xw6f"
Apr 23 16:35:16.280862 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280533 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/486e65b1-cb27-4533-8ab9-9a91c79c58b1-tmp-dir\") pod \"node-resolver-fgwnc\" (UID: \"486e65b1-cb27-4533-8ab9-9a91c79c58b1\") " pod="openshift-dns/node-resolver-fgwnc"
Apr 23 16:35:16.280862 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280559 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t2d2s\" (UniqueName: \"kubernetes.io/projected/486e65b1-cb27-4533-8ab9-9a91c79c58b1-kube-api-access-t2d2s\") pod \"node-resolver-fgwnc\" (UID: \"486e65b1-cb27-4533-8ab9-9a91c79c58b1\") " pod="openshift-dns/node-resolver-fgwnc"
Apr 23 16:35:16.280862 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280559 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-host-run-multus-certs\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2"
Apr 23 16:35:16.280862 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280583 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6f9db1de-039e-440a-ad1a-5e30719153d9-device-dir\") pod \"aws-ebs-csi-driver-node-jlhpn\" (UID: \"6f9db1de-039e-440a-ad1a-5e30719153d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jlhpn"
Apr 23 16:35:16.280862 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280631 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6f9db1de-039e-440a-ad1a-5e30719153d9-device-dir\") pod \"aws-ebs-csi-driver-node-jlhpn\" (UID: \"6f9db1de-039e-440a-ad1a-5e30719153d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jlhpn"
Apr 23 16:35:16.280862 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280692 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-etc-kubernetes\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv"
Apr 23 16:35:16.280862 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280729 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-host-var-lib-cni-multus\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2"
Apr 23 16:35:16.280862 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280743 2569 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 23 16:35:16.280862 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280758 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-etc-openvswitch\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h"
Apr 23 16:35:16.280862 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280786 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a063c6b7-80d4-45d7-815d-88d94693a0b1-ovn-node-metrics-cert\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h"
Apr 23 16:35:16.280862 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280801 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-etc-kubernetes\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv"
Apr 23 16:35:16.280862 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280814 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6f9db1de-039e-440a-ad1a-5e30719153d9-registration-dir\") pod \"aws-ebs-csi-driver-node-jlhpn\" (UID: \"6f9db1de-039e-440a-ad1a-5e30719153d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jlhpn"
Apr 23 16:35:16.280862 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280840 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-run\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv"
Apr 23 16:35:16.281777 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280867 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-host-run-netns\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2"
Apr 23 16:35:16.281777 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280872 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-host-var-lib-cni-multus\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2"
Apr 23 16:35:16.281777 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280871 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/486e65b1-cb27-4533-8ab9-9a91c79c58b1-tmp-dir\") pod \"node-resolver-fgwnc\" (UID: \"486e65b1-cb27-4533-8ab9-9a91c79c58b1\") " pod="openshift-dns/node-resolver-fgwnc"
Apr 23 16:35:16.281777 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280894 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/64f2e8d8-0a24-4b00-a66e-91dd67594081-tuning-conf-dir\") pod \"multus-additional-cni-plugins-z8jnl\" (UID: \"64f2e8d8-0a24-4b00-a66e-91dd67594081\") " pod="openshift-multus/multus-additional-cni-plugins-z8jnl"
Apr 23 16:35:16.281777 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280906 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6f9db1de-039e-440a-ad1a-5e30719153d9-registration-dir\") pod \"aws-ebs-csi-driver-node-jlhpn\" (UID: \"6f9db1de-039e-440a-ad1a-5e30719153d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jlhpn"
Apr 23 16:35:16.281777 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280920 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/64f2e8d8-0a24-4b00-a66e-91dd67594081-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-z8jnl\" (UID: \"64f2e8d8-0a24-4b00-a66e-91dd67594081\") " pod="openshift-multus/multus-additional-cni-plugins-z8jnl"
Apr 23 16:35:16.281777 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280940 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-run\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv"
Apr 23 16:35:16.281777 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280948 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-var-lib-openvswitch\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h"
Apr 23 16:35:16.281777 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.280970 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-etc-openvswitch\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h"
Apr 23 16:35:16.281777 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.281085 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-host-run-netns\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2"
pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.281777 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.281162 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/64f2e8d8-0a24-4b00-a66e-91dd67594081-tuning-conf-dir\") pod \"multus-additional-cni-plugins-z8jnl\" (UID: \"64f2e8d8-0a24-4b00-a66e-91dd67594081\") " pod="openshift-multus/multus-additional-cni-plugins-z8jnl" Apr 23 16:35:16.281777 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.281245 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-var-lib-openvswitch\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.281777 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.281288 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-host-run-ovn-kubernetes\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.281777 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.281317 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6f9db1de-039e-440a-ad1a-5e30719153d9-socket-dir\") pod \"aws-ebs-csi-driver-node-jlhpn\" (UID: \"6f9db1de-039e-440a-ad1a-5e30719153d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jlhpn" Apr 23 16:35:16.281777 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.281341 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4d626\" (UniqueName: \"kubernetes.io/projected/63340da2-b7c8-4798-a1ed-d8a80bf900b6-kube-api-access-4d626\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.281777 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.281366 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2c38e0bd-62a0-43a1-b0e2-05e9ca0f084b-iptables-alerter-script\") pod \"iptables-alerter-6xw6f\" (UID: \"2c38e0bd-62a0-43a1-b0e2-05e9ca0f084b\") " pod="openshift-network-operator/iptables-alerter-6xw6f" Apr 23 16:35:16.281777 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.281391 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/64f2e8d8-0a24-4b00-a66e-91dd67594081-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-z8jnl\" (UID: \"64f2e8d8-0a24-4b00-a66e-91dd67594081\") " pod="openshift-multus/multus-additional-cni-plugins-z8jnl" Apr 23 16:35:16.282617 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.281419 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/486e65b1-cb27-4533-8ab9-9a91c79c58b1-hosts-file\") pod \"node-resolver-fgwnc\" (UID: \"486e65b1-cb27-4533-8ab9-9a91c79c58b1\") " pod="openshift-dns/node-resolver-fgwnc" Apr 23 16:35:16.282617 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.281446 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"agent-certs\" (UniqueName: \"kubernetes.io/secret/3d17f781-cff6-4f98-92ab-d090568476a4-agent-certs\") pod \"konnectivity-agent-dnws5\" (UID: \"3d17f781-cff6-4f98-92ab-d090568476a4\") " pod="kube-system/konnectivity-agent-dnws5" Apr 23 16:35:16.282617 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.281482 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tbqzz\" (UniqueName: \"kubernetes.io/projected/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-kube-api-access-tbqzz\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv" Apr 23 16:35:16.282617 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.281507 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/63340da2-b7c8-4798-a1ed-d8a80bf900b6-cni-binary-copy\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.282617 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.281531 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2c38e0bd-62a0-43a1-b0e2-05e9ca0f084b-host-slash\") pod \"iptables-alerter-6xw6f\" (UID: \"2c38e0bd-62a0-43a1-b0e2-05e9ca0f084b\") " pod="openshift-network-operator/iptables-alerter-6xw6f" Apr 23 16:35:16.282617 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.281555 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-systemd-units\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.282617 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.281579 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-run-openvswitch\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.282617 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.281605 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-node-log\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.282617 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.281610 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/64f2e8d8-0a24-4b00-a66e-91dd67594081-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-z8jnl\" (UID: \"64f2e8d8-0a24-4b00-a66e-91dd67594081\") " pod="openshift-multus/multus-additional-cni-plugins-z8jnl" Apr 23 16:35:16.282617 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.281675 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-node-log\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.282617 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.281688 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-cnibin\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.282617 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.281756 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-cnibin\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.282617 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.281778 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/486e65b1-cb27-4533-8ab9-9a91c79c58b1-hosts-file\") pod \"node-resolver-fgwnc\" (UID: \"486e65b1-cb27-4533-8ab9-9a91c79c58b1\") " pod="openshift-dns/node-resolver-fgwnc" Apr 23 16:35:16.282617 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.281719 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-run-systemd\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.282617 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.281816 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-host-cni-bin\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.282617 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.281841 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-tmp\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv" Apr 23 16:35:16.282617 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.281863 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/919d79c7-8b2d-41ad-b0ba-bf48e8815841-serviceca\") pod \"node-ca-pmktz\" (UID: \"919d79c7-8b2d-41ad-b0ba-bf48e8815841\") " pod="openshift-image-registry/node-ca-pmktz" Apr 23 16:35:16.282617 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.281867 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-host-run-ovn-kubernetes\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.283466 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.281917 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-hostroot\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.283466 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.281969 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" 
(UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-host-run-netns\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.283466 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.281983 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6f9db1de-039e-440a-ad1a-5e30719153d9-socket-dir\") pod \"aws-ebs-csi-driver-node-jlhpn\" (UID: \"6f9db1de-039e-440a-ad1a-5e30719153d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jlhpn" Apr 23 16:35:16.283466 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.281997 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1eabe990-f610-4e94-8a89-7cff1c9a6a23-metrics-certs\") pod \"network-metrics-daemon-k7n97\" (UID: \"1eabe990-f610-4e94-8a89-7cff1c9a6a23\") " pod="openshift-multus/network-metrics-daemon-k7n97" Apr 23 16:35:16.283466 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.282035 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-run-systemd\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.283466 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.282063 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qfs4x\" (UniqueName: \"kubernetes.io/projected/1eabe990-f610-4e94-8a89-7cff1c9a6a23-kube-api-access-qfs4x\") pod \"network-metrics-daemon-k7n97\" (UID: \"1eabe990-f610-4e94-8a89-7cff1c9a6a23\") " pod="openshift-multus/network-metrics-daemon-k7n97" Apr 23 16:35:16.283466 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:16.282093 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:16.283466 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.282093 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-multus-socket-dir-parent\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.283466 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.282124 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-host-run-k8s-cni-cncf-io\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.283466 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.282148 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-run-ovn\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.283466 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.282182 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-run-ovn\") pod 
\"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.283466 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:16.282192 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1eabe990-f610-4e94-8a89-7cff1c9a6a23-metrics-certs podName:1eabe990-f610-4e94-8a89-7cff1c9a6a23 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:16.782148748 +0000 UTC m=+2.119411712 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1eabe990-f610-4e94-8a89-7cff1c9a6a23-metrics-certs") pod "network-metrics-daemon-k7n97" (UID: "1eabe990-f610-4e94-8a89-7cff1c9a6a23") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:16.283466 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.282237 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-host-cni-bin\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.283466 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.282259 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2c38e0bd-62a0-43a1-b0e2-05e9ca0f084b-iptables-alerter-script\") pod \"iptables-alerter-6xw6f\" (UID: \"2c38e0bd-62a0-43a1-b0e2-05e9ca0f084b\") " pod="openshift-network-operator/iptables-alerter-6xw6f" Apr 23 16:35:16.283466 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.282427 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/64f2e8d8-0a24-4b00-a66e-91dd67594081-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-z8jnl\" (UID: \"64f2e8d8-0a24-4b00-a66e-91dd67594081\") " pod="openshift-multus/multus-additional-cni-plugins-z8jnl" Apr 23 16:35:16.283466 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.282488 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-hostroot\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.283466 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.282540 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-systemd-units\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.284313 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.282645 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/919d79c7-8b2d-41ad-b0ba-bf48e8815841-serviceca\") pod \"node-ca-pmktz\" (UID: \"919d79c7-8b2d-41ad-b0ba-bf48e8815841\") " pod="openshift-image-registry/node-ca-pmktz" Apr 23 16:35:16.284313 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.282898 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-run-openvswitch\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.284313 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.282956 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-multus-socket-dir-parent\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.284313 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.283165 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-host-run-netns\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.284313 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.283219 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/63340da2-b7c8-4798-a1ed-d8a80bf900b6-cni-binary-copy\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.284313 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.283250 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-log-socket\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.284313 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.283275 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2c38e0bd-62a0-43a1-b0e2-05e9ca0f084b-host-slash\") pod \"iptables-alerter-6xw6f\" (UID: \"2c38e0bd-62a0-43a1-b0e2-05e9ca0f084b\") " pod="openshift-network-operator/iptables-alerter-6xw6f" Apr 23 16:35:16.284313 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.283277 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a063c6b7-80d4-45d7-815d-88d94693a0b1-env-overrides\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.284313 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.283308 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-host-slash\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.284313 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.283318 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-log-socket\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.284313 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.283335 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-host-cni-netd\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.284313 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.283356 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-host-run-k8s-cni-cncf-io\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.284313 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.283359 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a063c6b7-80d4-45d7-815d-88d94693a0b1-ovnkube-config\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.284313 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.283389 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-etc-sysctl-d\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv" Apr 23 16:35:16.284313 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.283415 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-host-var-lib-kubelet\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.284313 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.283441 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/64f2e8d8-0a24-4b00-a66e-91dd67594081-os-release\") pod \"multus-additional-cni-plugins-z8jnl\" (UID: \"64f2e8d8-0a24-4b00-a66e-91dd67594081\") " pod="openshift-multus/multus-additional-cni-plugins-z8jnl" Apr 23 16:35:16.284313 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.283466 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3d17f781-cff6-4f98-92ab-d090568476a4-konnectivity-ca\") pod \"konnectivity-agent-dnws5\" (UID: \"3d17f781-cff6-4f98-92ab-d090568476a4\") " pod="kube-system/konnectivity-agent-dnws5" Apr 23 16:35:16.284313 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.283490 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-host\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv" Apr 23 16:35:16.285110 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.283506 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-etc-sysctl-d\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv" Apr 23 16:35:16.285110 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.284133 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a063c6b7-80d4-45d7-815d-88d94693a0b1-ovnkube-config\") pod \"ovnkube-node-ddr2h\" (UID: 
\"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.285110 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.284178 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-system-cni-dir\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.285110 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.284227 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-multus-cni-dir\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.285110 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.284252 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-multus-conf-dir\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.285110 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.284286 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/64f2e8d8-0a24-4b00-a66e-91dd67594081-cni-binary-copy\") pod \"multus-additional-cni-plugins-z8jnl\" (UID: \"64f2e8d8-0a24-4b00-a66e-91dd67594081\") " pod="openshift-multus/multus-additional-cni-plugins-z8jnl" Apr 23 16:35:16.285110 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.284318 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svdfq\" (UniqueName: \"kubernetes.io/projected/1ba421e3-97e2-473e-a145-bf072f2b9393-kube-api-access-svdfq\") pod \"network-check-target-ktsss\" (UID: \"1ba421e3-97e2-473e-a145-bf072f2b9393\") " pod="openshift-network-diagnostics/network-check-target-ktsss" Apr 23 16:35:16.285110 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.284348 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6f9db1de-039e-440a-ad1a-5e30719153d9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jlhpn\" (UID: \"6f9db1de-039e-440a-ad1a-5e30719153d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jlhpn" Apr 23 16:35:16.285110 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.284378 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-lib-modules\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv" Apr 23 16:35:16.285110 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.284412 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-system-cni-dir\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.285110 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.284436 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-host-var-lib-kubelet\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.285110 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.284463 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-os-release\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.285110 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.284471 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-etc-tuned\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv" Apr 23 16:35:16.285110 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.284525 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/64f2e8d8-0a24-4b00-a66e-91dd67594081-os-release\") pod \"multus-additional-cni-plugins-z8jnl\" (UID: \"64f2e8d8-0a24-4b00-a66e-91dd67594081\") " pod="openshift-multus/multus-additional-cni-plugins-z8jnl" Apr 23 16:35:16.285110 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.284785 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-tmp\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv" Apr 23 16:35:16.285110 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.284403 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-os-release\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.285110 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.284833 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6f9db1de-039e-440a-ad1a-5e30719153d9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jlhpn\" (UID: \"6f9db1de-039e-440a-ad1a-5e30719153d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jlhpn" Apr 23 16:35:16.285110 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.284875 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rqbzt\" (UniqueName: \"kubernetes.io/projected/64f2e8d8-0a24-4b00-a66e-91dd67594081-kube-api-access-rqbzt\") pod \"multus-additional-cni-plugins-z8jnl\" (UID: \"64f2e8d8-0a24-4b00-a66e-91dd67594081\") " pod="openshift-multus/multus-additional-cni-plugins-z8jnl" Apr 23 16:35:16.285922 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.284884 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-host-cni-netd\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.285922 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.284921 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" 
(UniqueName: \"kubernetes.io/configmap/a063c6b7-80d4-45d7-815d-88d94693a0b1-ovnkube-script-lib\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.285922 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.284950 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/64f2e8d8-0a24-4b00-a66e-91dd67594081-cni-binary-copy\") pod \"multus-additional-cni-plugins-z8jnl\" (UID: \"64f2e8d8-0a24-4b00-a66e-91dd67594081\") " pod="openshift-multus/multus-additional-cni-plugins-z8jnl" Apr 23 16:35:16.285922 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.284958 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hcc4\" (UniqueName: \"kubernetes.io/projected/a063c6b7-80d4-45d7-815d-88d94693a0b1-kube-api-access-4hcc4\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.285922 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.285003 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-etc-sysconfig\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv" Apr 23 16:35:16.285922 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.285037 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-sys\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv" Apr 23 16:35:16.285922 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.285073 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/63340da2-b7c8-4798-a1ed-d8a80bf900b6-multus-daemon-config\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.285922 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.285103 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-etc-kubernetes\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.285922 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.285140 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-etc-modprobe-d\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv" Apr 23 16:35:16.285922 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.285164 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3d17f781-cff6-4f98-92ab-d090568476a4-agent-certs\") pod \"konnectivity-agent-dnws5\" (UID: \"3d17f781-cff6-4f98-92ab-d090568476a4\") " pod="kube-system/konnectivity-agent-dnws5" Apr 23 16:35:16.285922 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.285196 2569 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-etc-sysconfig\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv" Apr 23 16:35:16.285922 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.285246 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-var-lib-kubelet\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv" Apr 23 16:35:16.285922 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.285283 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/919d79c7-8b2d-41ad-b0ba-bf48e8815841-host\") pod \"node-ca-pmktz\" (UID: \"919d79c7-8b2d-41ad-b0ba-bf48e8815841\") " pod="openshift-image-registry/node-ca-pmktz" Apr 23 16:35:16.285922 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.285311 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-54sps\" (UniqueName: \"kubernetes.io/projected/919d79c7-8b2d-41ad-b0ba-bf48e8815841-kube-api-access-54sps\") pod \"node-ca-pmktz\" (UID: \"919d79c7-8b2d-41ad-b0ba-bf48e8815841\") " pod="openshift-image-registry/node-ca-pmktz" Apr 23 16:35:16.285922 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.285346 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/64f2e8d8-0a24-4b00-a66e-91dd67594081-cnibin\") pod \"multus-additional-cni-plugins-z8jnl\" (UID: \"64f2e8d8-0a24-4b00-a66e-91dd67594081\") " pod="openshift-multus/multus-additional-cni-plugins-z8jnl" Apr 23 16:35:16.285922 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.285355 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-etc-modprobe-d\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv" Apr 23 16:35:16.285922 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.285381 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.286683 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.285406 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-host-slash\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.286683 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.285428 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/919d79c7-8b2d-41ad-b0ba-bf48e8815841-host\") pod \"node-ca-pmktz\" (UID: \"919d79c7-8b2d-41ad-b0ba-bf48e8815841\") " pod="openshift-image-registry/node-ca-pmktz" Apr 23 16:35:16.286683 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.285453 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-etc-kubernetes\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.286683 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.285473 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s2v5x\" (UniqueName: \"kubernetes.io/projected/6f9db1de-039e-440a-ad1a-5e30719153d9-kube-api-access-s2v5x\") pod \"aws-ebs-csi-driver-node-jlhpn\" (UID: \"6f9db1de-039e-440a-ad1a-5e30719153d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jlhpn" Apr 23 16:35:16.286683 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.285710 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/63340da2-b7c8-4798-a1ed-d8a80bf900b6-multus-daemon-config\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.286683 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.285724 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-var-lib-kubelet\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv" Apr 23 16:35:16.286683 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.285789 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-sys\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv" Apr 23 16:35:16.286683 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.285842 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-host\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv" Apr 23 16:35:16.286683 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.285859 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a063c6b7-80d4-45d7-815d-88d94693a0b1-env-overrides\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.286683 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.285925 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a063c6b7-80d4-45d7-815d-88d94693a0b1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.286683 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.285953 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a063c6b7-80d4-45d7-815d-88d94693a0b1-ovn-node-metrics-cert\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.286683 ip-10-0-128-102 
kubenswrapper[2569]: I0423 16:35:16.285956 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-lib-modules\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv" Apr 23 16:35:16.286683 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.285988 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-multus-conf-dir\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.286683 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.285789 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/64f2e8d8-0a24-4b00-a66e-91dd67594081-cnibin\") pod \"multus-additional-cni-plugins-z8jnl\" (UID: \"64f2e8d8-0a24-4b00-a66e-91dd67594081\") " pod="openshift-multus/multus-additional-cni-plugins-z8jnl" Apr 23 16:35:16.286683 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.286053 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/63340da2-b7c8-4798-a1ed-d8a80bf900b6-multus-cni-dir\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.286683 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.286204 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a063c6b7-80d4-45d7-815d-88d94693a0b1-ovnkube-script-lib\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:16.286683 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.286351 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3d17f781-cff6-4f98-92ab-d090568476a4-konnectivity-ca\") pod \"konnectivity-agent-dnws5\" (UID: \"3d17f781-cff6-4f98-92ab-d090568476a4\") " pod="kube-system/konnectivity-agent-dnws5" Apr 23 16:35:16.290062 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.290036 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2d2s\" (UniqueName: \"kubernetes.io/projected/486e65b1-cb27-4533-8ab9-9a91c79c58b1-kube-api-access-t2d2s\") pod \"node-resolver-fgwnc\" (UID: \"486e65b1-cb27-4533-8ab9-9a91c79c58b1\") " pod="openshift-dns/node-resolver-fgwnc" Apr 23 16:35:16.290462 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.290440 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d626\" (UniqueName: \"kubernetes.io/projected/63340da2-b7c8-4798-a1ed-d8a80bf900b6-kube-api-access-4d626\") pod \"multus-8slg2\" (UID: \"63340da2-b7c8-4798-a1ed-d8a80bf900b6\") " pod="openshift-multus/multus-8slg2" Apr 23 16:35:16.290543 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.290534 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt8h5\" (UniqueName: \"kubernetes.io/projected/2c38e0bd-62a0-43a1-b0e2-05e9ca0f084b-kube-api-access-pt8h5\") pod \"iptables-alerter-6xw6f\" (UID: \"2c38e0bd-62a0-43a1-b0e2-05e9ca0f084b\") " pod="openshift-network-operator/iptables-alerter-6xw6f" Apr 23 16:35:16.291515 ip-10-0-128-102 
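The paired records above are the kubelet volume manager at work: reconciler_common.go logs "operationExecutor.MountVolume started" when a volume demanded by the pod specs (the desired state of world) is not yet in the actual state of world, and operation_generator.go logs "MountVolume.SetUp succeeded" once the mount completes. A minimal sketch of that desired-vs-actual reconcile pattern, with invented types (this is not the kubelet's real code):

    package main

    import "fmt"

    // reconciler tracks which volumes pods want mounted (desired) versus
    // which are actually mounted on the node (actual).
    type reconciler struct {
    	desired map[string]bool
    	actual  map[string]bool
    }

    // reconcile issues one SetUp per missing volume, logging the same
    // started/succeeded pair seen in the journal. The real kubelet runs
    // each SetUp asynchronously through an operation executor.
    func (r *reconciler) reconcile(setUp func(vol string) error) {
    	for vol := range r.desired {
    		if r.actual[vol] {
    			continue // already mounted; nothing to do
    		}
    		fmt.Printf("operationExecutor.MountVolume started for volume %q\n", vol)
    		if err := setUp(vol); err != nil {
    			fmt.Printf("MountVolume.SetUp failed for volume %q: %v\n", vol, err)
    			continue // left unmounted; a later pass retries with backoff
    		}
    		r.actual[vol] = true
    		fmt.Printf("MountVolume.SetUp succeeded for volume %q\n", vol)
    	}
    }

    func main() {
    	r := &reconciler{
    		desired: map[string]bool{"os-release": true, "etc-tuned": true},
    		actual:  map[string]bool{},
    	}
    	r.reconcile(func(string) error { return nil })
    }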
Apr 23 16:35:16.291515 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.291495 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbqzz\" (UniqueName: \"kubernetes.io/projected/14a8d0f4-f62f-4c14-8b41-bf2f7476215d-kube-api-access-tbqzz\") pod \"tuned-s4jdv\" (UID: \"14a8d0f4-f62f-4c14-8b41-bf2f7476215d\") " pod="openshift-cluster-node-tuning-operator/tuned-s4jdv"
Apr 23 16:35:16.292021 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.292001 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfs4x\" (UniqueName: \"kubernetes.io/projected/1eabe990-f610-4e94-8a89-7cff1c9a6a23-kube-api-access-qfs4x\") pod \"network-metrics-daemon-k7n97\" (UID: \"1eabe990-f610-4e94-8a89-7cff1c9a6a23\") " pod="openshift-multus/network-metrics-daemon-k7n97"
Apr 23 16:35:16.296743 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:16.296726 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 16:35:16.296743 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:16.296744 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 16:35:16.296844 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:16.296753 2569 projected.go:194] Error preparing data for projected volume kube-api-access-svdfq for pod openshift-network-diagnostics/network-check-target-ktsss: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:16.296844 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:16.296827 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1ba421e3-97e2-473e-a145-bf072f2b9393-kube-api-access-svdfq podName:1ba421e3-97e2-473e-a145-bf072f2b9393 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:16.796815367 +0000 UTC m=+2.134078338 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-svdfq" (UniqueName: "kubernetes.io/projected/1ba421e3-97e2-473e-a145-bf072f2b9393-kube-api-access-svdfq") pod "network-check-target-ktsss" (UID: "1ba421e3-97e2-473e-a145-bf072f2b9393") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
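These first failures are expected this early in boot. A projected service-account volume is assembled from objects in the pod's own namespace (the kube-root-ca.crt and openshift-service-ca.crt configmaps plus a token), and the kubelet refuses to build it until those objects are registered in its cache for an admitted pod; the openshift-network-diagnostics pod is not that far along yet. Note the retry schedule in the nestedpendingoperations record: durationBeforeRetry starts at 500ms and, as the later entries show, doubles through 1s, 2s, 4s, 8s, and 16s. A sketch of that exponential backoff (the roughly two-minute cap is an assumption, not visible in this log):

    package main

    import (
    	"fmt"
    	"time"
    )

    // durationBeforeRetry doubles per consecutive failure, matching the
    // 500ms -> 1s -> 2s -> 4s -> 8s -> 16s progression in the journal.
    func durationBeforeRetry(failures int) time.Duration {
    	const base = 500 * time.Millisecond
    	const maxDelay = 2*time.Minute + 2*time.Second // assumed cap
    	d := base << uint(failures)
    	if d <= 0 || d > maxDelay { // d <= 0 guards shift overflow
    		return maxDelay
    	}
    	return d
    }

    func main() {
    	for i := 0; i < 7; i++ {
    		fmt.Println("retry", i, "after", durationBeforeRetry(i))
    	}
    }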
Apr 23 16:35:16.299858 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.299828 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2v5x\" (UniqueName: \"kubernetes.io/projected/6f9db1de-039e-440a-ad1a-5e30719153d9-kube-api-access-s2v5x\") pod \"aws-ebs-csi-driver-node-jlhpn\" (UID: \"6f9db1de-039e-440a-ad1a-5e30719153d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jlhpn"
Apr 23 16:35:16.300399 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.300376 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hcc4\" (UniqueName: \"kubernetes.io/projected/a063c6b7-80d4-45d7-815d-88d94693a0b1-kube-api-access-4hcc4\") pod \"ovnkube-node-ddr2h\" (UID: \"a063c6b7-80d4-45d7-815d-88d94693a0b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h"
Apr 23 16:35:16.300675 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.300608 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqbzt\" (UniqueName: \"kubernetes.io/projected/64f2e8d8-0a24-4b00-a66e-91dd67594081-kube-api-access-rqbzt\") pod \"multus-additional-cni-plugins-z8jnl\" (UID: \"64f2e8d8-0a24-4b00-a66e-91dd67594081\") " pod="openshift-multus/multus-additional-cni-plugins-z8jnl"
Apr 23 16:35:16.301236 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.301214 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-54sps\" (UniqueName: \"kubernetes.io/projected/919d79c7-8b2d-41ad-b0ba-bf48e8815841-kube-api-access-54sps\") pod \"node-ca-pmktz\" (UID: \"919d79c7-8b2d-41ad-b0ba-bf48e8815841\") " pod="openshift-image-registry/node-ca-pmktz"
Apr 23 16:35:16.302641 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.302628 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h"
Apr 23 16:35:16.321527 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.321513 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-dnws5"
Apr 23 16:35:16.348300 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:16.348147 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d17f781_cff6_4f98_92ab_d090568476a4.slice/crio-9a15fcf7bf4f7b779b1e6a3d2c4ab1b135b8493fc6a5f8f620d870cbf806f6fc WatchSource:0}: Error finding container 9a15fcf7bf4f7b779b1e6a3d2c4ab1b135b8493fc6a5f8f620d870cbf806f6fc: Status 404 returned error can't find the container with id 9a15fcf7bf4f7b779b1e6a3d2c4ab1b135b8493fc6a5f8f620d870cbf806f6fc
Apr 23 16:35:16.348689 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:16.348655 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0c005bc2e76bc3454363e5204d0f408.slice/crio-ef8862bff1a7cf116092453f7a47793349cdba2b14beb6505be70c998dae5d29 WatchSource:0}: Error finding container ef8862bff1a7cf116092453f7a47793349cdba2b14beb6505be70c998dae5d29: Status 404 returned error can't find the container with id ef8862bff1a7cf116092453f7a47793349cdba2b14beb6505be70c998dae5d29
Apr 23 16:35:16.352982 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.352969 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 16:35:16.357651 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:16.357628 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05adcf3a2d254ff254dca89fd1c3545d.slice/crio-4c56619dbf716a8ca7d04d0c839f0ff187cb77cec1c800ab624ede23e6b13a2d WatchSource:0}: Error finding container 4c56619dbf716a8ca7d04d0c839f0ff187cb77cec1c800ab624ede23e6b13a2d: Status 404 returned error can't find the container with id 4c56619dbf716a8ca7d04d0c839f0ff187cb77cec1c800ab624ede23e6b13a2d
Apr 23 16:35:16.396571 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.396500 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 16:35:16.432277 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.432252 2569 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 16:35:16.484200 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.484159 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jlhpn"
Apr 23 16:35:16.490032 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:16.490003 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f9db1de_039e_440a_ad1a_5e30719153d9.slice/crio-41ddbb10f3be0a2ffee2eb0bae741d0efe53aa8006c69a953e538fd110c55104 WatchSource:0}: Error finding container 41ddbb10f3be0a2ffee2eb0bae741d0efe53aa8006c69a953e538fd110c55104: Status 404 returned error can't find the container with id 41ddbb10f3be0a2ffee2eb0bae741d0efe53aa8006c69a953e538fd110c55104
Apr 23 16:35:16.500248 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.500231 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-s4jdv"
Apr 23 16:35:16.505486 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:16.505456 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14a8d0f4_f62f_4c14_8b41_bf2f7476215d.slice/crio-24016ffb5d05043318adcd29812bbb7ce7613671ad7ddeb622bb45fd11a14e95 WatchSource:0}: Error finding container 24016ffb5d05043318adcd29812bbb7ce7613671ad7ddeb622bb45fd11a14e95: Status 404 returned error can't find the container with id 24016ffb5d05043318adcd29812bbb7ce7613671ad7ddeb622bb45fd11a14e95
Apr 23 16:35:16.520745 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.520722 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-fgwnc"
Apr 23 16:35:16.526614 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:16.526593 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod486e65b1_cb27_4533_8ab9_9a91c79c58b1.slice/crio-b0195966bcdebc82806c92245eab1a28b6ed0fbe320b315e1c07bcfd4bbc8999 WatchSource:0}: Error finding container b0195966bcdebc82806c92245eab1a28b6ed0fbe320b315e1c07bcfd4bbc8999: Status 404 returned error can't find the container with id b0195966bcdebc82806c92245eab1a28b6ed0fbe320b315e1c07bcfd4bbc8999
Apr 23 16:35:16.536235 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.536217 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-pmktz"
Apr 23 16:35:16.542233 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:16.542207 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod919d79c7_8b2d_41ad_b0ba_bf48e8815841.slice/crio-d9fa505c78cd95083ad7f354717602c7e835645e8f3823134c35fdec2d20e0af WatchSource:0}: Error finding container d9fa505c78cd95083ad7f354717602c7e835645e8f3823134c35fdec2d20e0af: Status 404 returned error can't find the container with id d9fa505c78cd95083ad7f354717602c7e835645e8f3823134c35fdec2d20e0af
Apr 23 16:35:16.551750 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.551734 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8slg2"
Apr 23 16:35:16.557004 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:16.556980 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63340da2_b7c8_4798_a1ed_d8a80bf900b6.slice/crio-6b50d05adfe514f89bf76e249191f797614eeb124bc87b0b670815b3d20fe0f5 WatchSource:0}: Error finding container 6b50d05adfe514f89bf76e249191f797614eeb124bc87b0b670815b3d20fe0f5: Status 404 returned error can't find the container with id 6b50d05adfe514f89bf76e249191f797614eeb124bc87b0b670815b3d20fe0f5
Apr 23 16:35:16.569669 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.569641 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-6xw6f"
Apr 23 16:35:16.576728 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:16.576707 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c38e0bd_62a0_43a1_b0e2_05e9ca0f084b.slice/crio-5ada4f55b4552e593f856d451df441463fd396fbb3775032b369d4ad97369a01 WatchSource:0}: Error finding container 5ada4f55b4552e593f856d451df441463fd396fbb3775032b369d4ad97369a01: Status 404 returned error can't find the container with id 5ada4f55b4552e593f856d451df441463fd396fbb3775032b369d4ad97369a01
Apr 23 16:35:16.586721 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.586702 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-z8jnl"
Apr 23 16:35:16.591894 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:16.591875 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64f2e8d8_0a24_4b00_a66e_91dd67594081.slice/crio-508045c7cc46b8586a963c8e8b9b2fee3ebe18bc3ed20e3394ca1638d92e1fad WatchSource:0}: Error finding container 508045c7cc46b8586a963c8e8b9b2fee3ebe18bc3ed20e3394ca1638d92e1fad: Status 404 returned error can't find the container with id 508045c7cc46b8586a963c8e8b9b2fee3ebe18bc3ed20e3394ca1638d92e1fad
Apr 23 16:35:16.647815 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:35:16.647758 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda063c6b7_80d4_45d7_815d_88d94693a0b1.slice/crio-58e8443405f0f4628ca884018fb2beacccaf0c8a1d45c8e500e283f333a55dfb WatchSource:0}: Error finding container 58e8443405f0f4628ca884018fb2beacccaf0c8a1d45c8e500e283f333a55dfb: Status 404 returned error can't find the container with id 58e8443405f0f4628ca884018fb2beacccaf0c8a1d45c8e500e283f333a55dfb
Apr 23 16:35:16.789677 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.789623 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1eabe990-f610-4e94-8a89-7cff1c9a6a23-metrics-certs\") pod \"network-metrics-daemon-k7n97\" (UID: \"1eabe990-f610-4e94-8a89-7cff1c9a6a23\") " pod="openshift-multus/network-metrics-daemon-k7n97"
Apr 23 16:35:16.789848 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:16.789811 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:16.789906 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:16.789866 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1eabe990-f610-4e94-8a89-7cff1c9a6a23-metrics-certs podName:1eabe990-f610-4e94-8a89-7cff1c9a6a23 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:17.789852075 +0000 UTC m=+3.127115017 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1eabe990-f610-4e94-8a89-7cff1c9a6a23-metrics-certs") pod "network-metrics-daemon-k7n97" (UID: "1eabe990-f610-4e94-8a89-7cff1c9a6a23") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:16.891156 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:16.890527 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svdfq\" (UniqueName: \"kubernetes.io/projected/1ba421e3-97e2-473e-a145-bf072f2b9393-kube-api-access-svdfq\") pod \"network-check-target-ktsss\" (UID: \"1ba421e3-97e2-473e-a145-bf072f2b9393\") " pod="openshift-network-diagnostics/network-check-target-ktsss"
Apr 23 16:35:16.891156 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:16.890712 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 16:35:16.891156 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:16.890732 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 16:35:16.891156 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:16.890745 2569 projected.go:194] Error preparing data for projected volume kube-api-access-svdfq for pod openshift-network-diagnostics/network-check-target-ktsss: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:16.891156 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:16.890803 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1ba421e3-97e2-473e-a145-bf072f2b9393-kube-api-access-svdfq podName:1ba421e3-97e2-473e-a145-bf072f2b9393 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:17.890784165 +0000 UTC m=+3.228047111 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-svdfq" (UniqueName: "kubernetes.io/projected/1ba421e3-97e2-473e-a145-bf072f2b9393-kube-api-access-svdfq") pod "network-check-target-ktsss" (UID: "1ba421e3-97e2-473e-a145-bf072f2b9393") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:17.216728 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:17.216432 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 16:30:16 +0000 UTC" deadline="2028-01-29 00:28:37.543435346 +0000 UTC"
Apr 23 16:35:17.216728 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:17.216468 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15487h53m20.326971948s"
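The certificate_manager pair just above is the kubelet-serving certificate rotation scheduler: it picks a rotation deadline at a random point in the tail of the certificate's validity and sleeps until then, and because the jitter is recomputed on each pass, the later entry at 16:35:18 below shows a different deadline for the same expiration. A sketch of that calculation, assuming client-go's 70-90% jitter window and an issue time two years before the logged expiration (the issue time is not in this log):

    package main

    import (
    	"fmt"
    	"math/rand"
    	"time"
    )

    // nextRotationDeadline returns a uniformly random point between 70% and
    // 90% of the certificate's validity period.
    func nextRotationDeadline(notBefore, notAfter time.Time) time.Time {
    	total := float64(notAfter.Sub(notBefore))
    	return notBefore.Add(time.Duration(total * (0.7 + 0.2*rand.Float64())))
    }

    func main() {
    	notBefore := time.Date(2026, 4, 23, 16, 30, 16, 0, time.UTC) // assumed
    	notAfter := time.Date(2028, 4, 22, 16, 30, 16, 0, time.UTC)  // from the log
    	fmt.Println("rotate at:", nextRotationDeadline(notBefore, notAfter))
    }

Both logged deadlines fit that window: 2028-01-29 falls about 88% of the way through this certificate's lifetime, and 2027-11-13 about 78%.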
event={"ID":"6f9db1de-039e-440a-ad1a-5e30719153d9","Type":"ContainerStarted","Data":"41ddbb10f3be0a2ffee2eb0bae741d0efe53aa8006c69a953e538fd110c55104"} Apr 23 16:35:17.471562 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:17.471456 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-102.ec2.internal" event={"ID":"05adcf3a2d254ff254dca89fd1c3545d","Type":"ContainerStarted","Data":"4c56619dbf716a8ca7d04d0c839f0ff187cb77cec1c800ab624ede23e6b13a2d"} Apr 23 16:35:17.487107 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:17.487063 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-102.ec2.internal" event={"ID":"a0c005bc2e76bc3454363e5204d0f408","Type":"ContainerStarted","Data":"ef8862bff1a7cf116092453f7a47793349cdba2b14beb6505be70c998dae5d29"} Apr 23 16:35:17.492698 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:17.492652 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-dnws5" event={"ID":"3d17f781-cff6-4f98-92ab-d090568476a4","Type":"ContainerStarted","Data":"9a15fcf7bf4f7b779b1e6a3d2c4ab1b135b8493fc6a5f8f620d870cbf806f6fc"} Apr 23 16:35:17.650340 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:17.650312 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 16:35:17.803693 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:17.803586 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1eabe990-f610-4e94-8a89-7cff1c9a6a23-metrics-certs\") pod \"network-metrics-daemon-k7n97\" (UID: \"1eabe990-f610-4e94-8a89-7cff1c9a6a23\") " pod="openshift-multus/network-metrics-daemon-k7n97" Apr 23 16:35:17.803852 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:17.803771 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:17.803852 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:17.803849 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1eabe990-f610-4e94-8a89-7cff1c9a6a23-metrics-certs podName:1eabe990-f610-4e94-8a89-7cff1c9a6a23 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:19.803821546 +0000 UTC m=+5.141084514 (durationBeforeRetry 2s). 
Apr 23 16:35:17.650340 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:17.650312 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 16:35:17.803693 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:17.803586 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1eabe990-f610-4e94-8a89-7cff1c9a6a23-metrics-certs\") pod \"network-metrics-daemon-k7n97\" (UID: \"1eabe990-f610-4e94-8a89-7cff1c9a6a23\") " pod="openshift-multus/network-metrics-daemon-k7n97"
Apr 23 16:35:17.803852 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:17.803771 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:17.803852 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:17.803849 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1eabe990-f610-4e94-8a89-7cff1c9a6a23-metrics-certs podName:1eabe990-f610-4e94-8a89-7cff1c9a6a23 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:19.803821546 +0000 UTC m=+5.141084514 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1eabe990-f610-4e94-8a89-7cff1c9a6a23-metrics-certs") pod "network-metrics-daemon-k7n97" (UID: "1eabe990-f610-4e94-8a89-7cff1c9a6a23") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:17.904829 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:17.904758 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svdfq\" (UniqueName: \"kubernetes.io/projected/1ba421e3-97e2-473e-a145-bf072f2b9393-kube-api-access-svdfq\") pod \"network-check-target-ktsss\" (UID: \"1ba421e3-97e2-473e-a145-bf072f2b9393\") " pod="openshift-network-diagnostics/network-check-target-ktsss"
Apr 23 16:35:17.905015 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:17.904963 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 16:35:17.905015 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:17.904984 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 16:35:17.905015 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:17.904997 2569 projected.go:194] Error preparing data for projected volume kube-api-access-svdfq for pod openshift-network-diagnostics/network-check-target-ktsss: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:17.905167 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:17.905058 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1ba421e3-97e2-473e-a145-bf072f2b9393-kube-api-access-svdfq podName:1ba421e3-97e2-473e-a145-bf072f2b9393 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:19.905038302 +0000 UTC m=+5.242301244 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-svdfq" (UniqueName: "kubernetes.io/projected/1ba421e3-97e2-473e-a145-bf072f2b9393-kube-api-access-svdfq") pod "network-check-target-ktsss" (UID: "1ba421e3-97e2-473e-a145-bf072f2b9393") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:18.217655 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:18.217614 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 16:30:16 +0000 UTC" deadline="2027-11-13 22:30:56.125616694 +0000 UTC"
Apr 23 16:35:18.217655 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:18.217649 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13661h55m37.907971402s"
Apr 23 16:35:18.299427 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:18.299381 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7n97"
Apr 23 16:35:18.299640 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:18.299559 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7n97" podUID="1eabe990-f610-4e94-8a89-7cff1c9a6a23"
Apr 23 16:35:18.300132 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:18.300111 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ktsss"
Apr 23 16:35:18.300229 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:18.300214 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ktsss" podUID="1ba421e3-97e2-473e-a145-bf072f2b9393"
Apr 23 16:35:19.822006 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:19.821968 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1eabe990-f610-4e94-8a89-7cff1c9a6a23-metrics-certs\") pod \"network-metrics-daemon-k7n97\" (UID: \"1eabe990-f610-4e94-8a89-7cff1c9a6a23\") " pod="openshift-multus/network-metrics-daemon-k7n97"
Apr 23 16:35:19.822466 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:19.822111 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:19.822466 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:19.822182 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1eabe990-f610-4e94-8a89-7cff1c9a6a23-metrics-certs podName:1eabe990-f610-4e94-8a89-7cff1c9a6a23 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:23.822163746 +0000 UTC m=+9.159426692 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1eabe990-f610-4e94-8a89-7cff1c9a6a23-metrics-certs") pod "network-metrics-daemon-k7n97" (UID: "1eabe990-f610-4e94-8a89-7cff1c9a6a23") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:19.923394 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:19.923357 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svdfq\" (UniqueName: \"kubernetes.io/projected/1ba421e3-97e2-473e-a145-bf072f2b9393-kube-api-access-svdfq\") pod \"network-check-target-ktsss\" (UID: \"1ba421e3-97e2-473e-a145-bf072f2b9393\") " pod="openshift-network-diagnostics/network-check-target-ktsss"
Apr 23 16:35:19.923581 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:19.923554 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 16:35:19.923647 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:19.923582 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 16:35:19.923647 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:19.923597 2569 projected.go:194] Error preparing data for projected volume kube-api-access-svdfq for pod openshift-network-diagnostics/network-check-target-ktsss: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:19.924085 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:19.923672 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1ba421e3-97e2-473e-a145-bf072f2b9393-kube-api-access-svdfq podName:1ba421e3-97e2-473e-a145-bf072f2b9393 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:23.923639482 +0000 UTC m=+9.260902431 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-svdfq" (UniqueName: "kubernetes.io/projected/1ba421e3-97e2-473e-a145-bf072f2b9393-kube-api-access-svdfq") pod "network-check-target-ktsss" (UID: "1ba421e3-97e2-473e-a145-bf072f2b9393") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:20.299180 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:20.299150 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7n97"
Apr 23 16:35:20.299355 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:20.299292 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7n97" podUID="1eabe990-f610-4e94-8a89-7cff1c9a6a23"
Apr 23 16:35:20.299557 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:20.299534 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ktsss"
Apr 23 16:35:20.299678 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:20.299640 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ktsss" podUID="1ba421e3-97e2-473e-a145-bf072f2b9393"
Apr 23 16:35:22.299302 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:22.299263 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ktsss"
Apr 23 16:35:22.299779 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:22.299321 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7n97"
Apr 23 16:35:22.299779 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:22.299412 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ktsss" podUID="1ba421e3-97e2-473e-a145-bf072f2b9393"
Apr 23 16:35:22.299779 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:22.299541 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7n97" podUID="1eabe990-f610-4e94-8a89-7cff1c9a6a23"
Apr 23 16:35:23.856160 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:23.856111 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1eabe990-f610-4e94-8a89-7cff1c9a6a23-metrics-certs\") pod \"network-metrics-daemon-k7n97\" (UID: \"1eabe990-f610-4e94-8a89-7cff1c9a6a23\") " pod="openshift-multus/network-metrics-daemon-k7n97"
Apr 23 16:35:23.856603 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:23.856283 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:23.856603 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:23.856342 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1eabe990-f610-4e94-8a89-7cff1c9a6a23-metrics-certs podName:1eabe990-f610-4e94-8a89-7cff1c9a6a23 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:31.856323456 +0000 UTC m=+17.193586405 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1eabe990-f610-4e94-8a89-7cff1c9a6a23-metrics-certs") pod "network-metrics-daemon-k7n97" (UID: "1eabe990-f610-4e94-8a89-7cff1c9a6a23") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:23.957307 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:23.956719 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svdfq\" (UniqueName: \"kubernetes.io/projected/1ba421e3-97e2-473e-a145-bf072f2b9393-kube-api-access-svdfq\") pod \"network-check-target-ktsss\" (UID: \"1ba421e3-97e2-473e-a145-bf072f2b9393\") " pod="openshift-network-diagnostics/network-check-target-ktsss"
Apr 23 16:35:23.957307 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:23.956894 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 16:35:23.957307 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:23.956914 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 16:35:23.957307 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:23.956928 2569 projected.go:194] Error preparing data for projected volume kube-api-access-svdfq for pod openshift-network-diagnostics/network-check-target-ktsss: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:23.957307 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:23.956984 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1ba421e3-97e2-473e-a145-bf072f2b9393-kube-api-access-svdfq podName:1ba421e3-97e2-473e-a145-bf072f2b9393 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:31.956965979 +0000 UTC m=+17.294228927 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-svdfq" (UniqueName: "kubernetes.io/projected/1ba421e3-97e2-473e-a145-bf072f2b9393-kube-api-access-svdfq") pod "network-check-target-ktsss" (UID: "1ba421e3-97e2-473e-a145-bf072f2b9393") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:24.300259 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:24.300124 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7n97"
Apr 23 16:35:24.300430 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:24.300271 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7n97" podUID="1eabe990-f610-4e94-8a89-7cff1c9a6a23"
Apr 23 16:35:24.300430 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:24.300327 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ktsss"
Apr 23 16:35:24.300430 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:24.300393 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ktsss" podUID="1ba421e3-97e2-473e-a145-bf072f2b9393"
Apr 23 16:35:26.299543 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:26.299508 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ktsss"
Apr 23 16:35:26.299976 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:26.299629 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ktsss" podUID="1ba421e3-97e2-473e-a145-bf072f2b9393"
Apr 23 16:35:26.299976 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:26.299512 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7n97"
Apr 23 16:35:26.299976 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:26.299805 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7n97" podUID="1eabe990-f610-4e94-8a89-7cff1c9a6a23"
Apr 23 16:35:28.299743 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:28.299704 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7n97"
Apr 23 16:35:28.300294 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:28.299707 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ktsss"
Apr 23 16:35:28.300294 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:28.299854 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7n97" podUID="1eabe990-f610-4e94-8a89-7cff1c9a6a23"
Apr 23 16:35:28.300294 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:28.299899 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ktsss" podUID="1ba421e3-97e2-473e-a145-bf072f2b9393"
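The two messages alternating every couple of seconds here are one underlying condition, not two. Sandbox creation for any pod on the cluster network is blocked until a CNI configuration file appears in /etc/kubernetes/cni/net.d/, and that file is written by ovnkube-node, which is itself still starting; host-network pods are exempt, which is why everything except network-metrics-daemon-k7n97 and network-check-target-ktsss made progress above. Until the file exists, the runtime reports NetworkReady=false and the kubelet requeues both pods. The essence of that readiness condition, sketched directly (the real kubelet goes through the CRI runtime and libcni; the extensions checked here are an assumption):

    package main

    import (
    	"fmt"
    	"path/filepath"
    )

    // networkReady reports whether any CNI network configuration exists in
    // confDir, approximating the NetworkReady=false condition in the log.
    func networkReady(confDir string) bool {
    	for _, pat := range []string{"*.conf", "*.conflist", "*.json"} {
    		if matches, _ := filepath.Glob(filepath.Join(confDir, pat)); len(matches) > 0 {
    			return true
    		}
    	}
    	return false
    }

    func main() {
    	fmt.Println("NetworkReady:", networkReady("/etc/kubernetes/cni/net.d"))
    }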
Apr 23 16:35:30.299411 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:30.299375 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7n97"
Apr 23 16:35:30.299411 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:30.299394 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ktsss"
Apr 23 16:35:30.299931 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:30.299537 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7n97" podUID="1eabe990-f610-4e94-8a89-7cff1c9a6a23"
Apr 23 16:35:30.300052 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:30.300016 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ktsss" podUID="1ba421e3-97e2-473e-a145-bf072f2b9393"
Apr 23 16:35:31.916397 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:31.916348 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1eabe990-f610-4e94-8a89-7cff1c9a6a23-metrics-certs\") pod \"network-metrics-daemon-k7n97\" (UID: \"1eabe990-f610-4e94-8a89-7cff1c9a6a23\") " pod="openshift-multus/network-metrics-daemon-k7n97"
Apr 23 16:35:31.916912 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:31.916520 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:31.916912 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:31.916603 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1eabe990-f610-4e94-8a89-7cff1c9a6a23-metrics-certs podName:1eabe990-f610-4e94-8a89-7cff1c9a6a23 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:47.916581462 +0000 UTC m=+33.253844417 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1eabe990-f610-4e94-8a89-7cff1c9a6a23-metrics-certs") pod "network-metrics-daemon-k7n97" (UID: "1eabe990-f610-4e94-8a89-7cff1c9a6a23") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:32.017758 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:32.017720 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svdfq\" (UniqueName: \"kubernetes.io/projected/1ba421e3-97e2-473e-a145-bf072f2b9393-kube-api-access-svdfq\") pod \"network-check-target-ktsss\" (UID: \"1ba421e3-97e2-473e-a145-bf072f2b9393\") " pod="openshift-network-diagnostics/network-check-target-ktsss"
Apr 23 16:35:32.017929 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:32.017891 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 16:35:32.017929 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:32.017912 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 16:35:32.017929 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:32.017925 2569 projected.go:194] Error preparing data for projected volume kube-api-access-svdfq for pod openshift-network-diagnostics/network-check-target-ktsss: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:32.018076 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:32.017984 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1ba421e3-97e2-473e-a145-bf072f2b9393-kube-api-access-svdfq podName:1ba421e3-97e2-473e-a145-bf072f2b9393 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:48.017964239 +0000 UTC m=+33.355227182 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-svdfq" (UniqueName: "kubernetes.io/projected/1ba421e3-97e2-473e-a145-bf072f2b9393-kube-api-access-svdfq") pod "network-check-target-ktsss" (UID: "1ba421e3-97e2-473e-a145-bf072f2b9393") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:32.299583 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:32.299489 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ktsss"
Apr 23 16:35:32.299846 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:32.299501 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7n97"
Apr 23 16:35:32.299846 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:32.299616 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ktsss" podUID="1ba421e3-97e2-473e-a145-bf072f2b9393"
Apr 23 16:35:32.299846 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:32.299712 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7n97" podUID="1eabe990-f610-4e94-8a89-7cff1c9a6a23"
Apr 23 16:35:34.299608 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:34.299589 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ktsss"
Apr 23 16:35:34.299974 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:34.299634 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7n97"
Apr 23 16:35:34.299974 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:34.299767 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7n97" podUID="1eabe990-f610-4e94-8a89-7cff1c9a6a23"
Apr 23 16:35:34.299974 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:34.299870 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ktsss" podUID="1ba421e3-97e2-473e-a145-bf072f2b9393"
Apr 23 16:35:35.536259 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:35.536092 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-102.ec2.internal" event={"ID":"a0c005bc2e76bc3454363e5204d0f408","Type":"ContainerStarted","Data":"d863bf215a0dcc2e069a2d707530af526db005a0d1ab875e427c7956467b934a"}
Apr 23 16:35:35.538281 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:35.538227 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-dnws5" event={"ID":"3d17f781-cff6-4f98-92ab-d090568476a4","Type":"ContainerStarted","Data":"2710210247bd4b72321ca1980f2e9258c4b5558e62417b74656c10a5e8a3ea13"}
Apr 23 16:35:35.539699 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:35.539635 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z8jnl" event={"ID":"64f2e8d8-0a24-4b00-a66e-91dd67594081","Type":"ContainerStarted","Data":"718ac6603f15f0992109247e953414edcba7a9f536e1c8af7d5fedf93c39da2d"}
Apr 23 16:35:35.541523 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:35.541451 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8slg2" event={"ID":"63340da2-b7c8-4798-a1ed-d8a80bf900b6","Type":"ContainerStarted","Data":"b74fcd70861d06452b1a4ff1f10e8ab5bc834c648dc25459f96cd60efab531be"}
Apr 23 16:35:35.543152 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:35.543129 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fgwnc" event={"ID":"486e65b1-cb27-4533-8ab9-9a91c79c58b1","Type":"ContainerStarted","Data":"76ebe66c2f7898896b7e73f85a23d028d5188806d0cf854ca2dce873ae2afa59"}
Apr 23 16:35:35.544631 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:35.544586 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-s4jdv" event={"ID":"14a8d0f4-f62f-4c14-8b41-bf2f7476215d","Type":"ContainerStarted","Data":"4338aa107dfa43308a3954023757ac69c2ef003f125059a88bc8e36601819120"}
Apr 23 16:35:35.547607 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:35.547591 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddr2h_a063c6b7-80d4-45d7-815d-88d94693a0b1/ovn-acl-logging/0.log"
Apr 23 16:35:35.547965 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:35.547943 2569 generic.go:358] "Generic (PLEG): container finished" podID="a063c6b7-80d4-45d7-815d-88d94693a0b1" containerID="ace9bf6f86962b273ceb614427fb35a8ad08edd563e861a080f76957c92a5020" exitCode=1
Apr 23 16:35:35.548063 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:35.548015 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" event={"ID":"a063c6b7-80d4-45d7-815d-88d94693a0b1","Type":"ContainerStarted","Data":"c6d0ec07375a298ca5ef764076ec39918e194c6c95032b1e07fbc6c1595f4ce0"}
Apr 23 16:35:35.548063 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:35.548044 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" event={"ID":"a063c6b7-80d4-45d7-815d-88d94693a0b1","Type":"ContainerStarted","Data":"2c8138000918706813c7cad141a2bfc879f5a4b02ba00fc94f10190ea757f51a"}
Apr 23 16:35:35.548063 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:35.548057 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" event={"ID":"a063c6b7-80d4-45d7-815d-88d94693a0b1","Type":"ContainerStarted","Data":"2cf56ec6919bc5e4c8d655c0db889bad36f8277e9871ec40b431b52fba13e1b0"}
Apr 23 16:35:35.548234 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:35.548071 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" event={"ID":"a063c6b7-80d4-45d7-815d-88d94693a0b1","Type":"ContainerStarted","Data":"0100170f68eb7546f8c0ce3a56649547d936ad7b1784127631c35dfff273151c"}
Apr 23 16:35:35.548234 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:35.548082 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" event={"ID":"a063c6b7-80d4-45d7-815d-88d94693a0b1","Type":"ContainerDied","Data":"ace9bf6f86962b273ceb614427fb35a8ad08edd563e861a080f76957c92a5020"}
Apr 23 16:35:35.548234 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:35.548097 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" event={"ID":"a063c6b7-80d4-45d7-815d-88d94693a0b1","Type":"ContainerStarted","Data":"0c8176ee2e2ad11ac86936b3c5ebafb4c3ca05916094b8748917d1d51d017ddd"}
Apr 23 16:35:35.549176 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:35.549155 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pmktz" event={"ID":"919d79c7-8b2d-41ad-b0ba-bf48e8815841","Type":"ContainerStarted","Data":"172610cb9645b1427f0003cbe6fc32b1c40f2daa76904d2e7150cca8d5bd84ef"}
Apr 23 16:35:35.550594 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:35.550575 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jlhpn" event={"ID":"6f9db1de-039e-440a-ad1a-5e30719153d9","Type":"ContainerStarted","Data":"732a7f75969932c672ffb28e9ff1b5edbfb242b5c381d4626630e1e49c9902a1"}
Apr 23 16:35:35.552024 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:35.551979 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-102.ec2.internal" podStartSLOduration=19.551965469 podStartE2EDuration="19.551965469s" podCreationTimestamp="2026-04-23 16:35:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:35:35.551278839 +0000 UTC m=+20.888541801" watchObservedRunningTime="2026-04-23 16:35:35.551965469 +0000 UTC m=+20.889228437"
Apr 23 16:35:35.593601 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:35.593562 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-8slg2" podStartSLOduration=2.75962367 podStartE2EDuration="20.593548183s" podCreationTimestamp="2026-04-23 16:35:15 +0000 UTC" firstStartedPulling="2026-04-23 16:35:16.55869076 +0000 UTC m=+1.895953701" lastFinishedPulling="2026-04-23 16:35:34.392615258 +0000 UTC m=+19.729878214" observedRunningTime="2026-04-23 16:35:35.572101292 +0000 UTC m=+20.909364258" watchObservedRunningTime="2026-04-23 16:35:35.593548183 +0000 UTC m=+20.930811145"
Apr 23 16:35:35.615510 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:35.615473 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-s4jdv" podStartSLOduration=2.826977807 podStartE2EDuration="20.615461557s" podCreationTimestamp="2026-04-23 16:35:15 +0000 UTC" firstStartedPulling="2026-04-23 16:35:16.507089796 +0000 UTC m=+1.844352738" lastFinishedPulling="2026-04-23 16:35:34.295573531
+0000 UTC m=+19.632836488" observedRunningTime="2026-04-23 16:35:35.615149259 +0000 UTC m=+20.952412221" watchObservedRunningTime="2026-04-23 16:35:35.615461557 +0000 UTC m=+20.952724519" Apr 23 16:35:35.645542 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:35.645508 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-fgwnc" podStartSLOduration=2.779174404 podStartE2EDuration="20.645497638s" podCreationTimestamp="2026-04-23 16:35:15 +0000 UTC" firstStartedPulling="2026-04-23 16:35:16.528033681 +0000 UTC m=+1.865296626" lastFinishedPulling="2026-04-23 16:35:34.394356902 +0000 UTC m=+19.731619860" observedRunningTime="2026-04-23 16:35:35.64311614 +0000 UTC m=+20.980379102" watchObservedRunningTime="2026-04-23 16:35:35.645497638 +0000 UTC m=+20.982760600" Apr 23 16:35:35.692867 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:35.692825 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-pmktz" podStartSLOduration=2.86746661 podStartE2EDuration="20.692816082s" podCreationTimestamp="2026-04-23 16:35:15 +0000 UTC" firstStartedPulling="2026-04-23 16:35:16.543646127 +0000 UTC m=+1.880909088" lastFinishedPulling="2026-04-23 16:35:34.368995614 +0000 UTC m=+19.706258560" observedRunningTime="2026-04-23 16:35:35.692633014 +0000 UTC m=+21.029895978" watchObservedRunningTime="2026-04-23 16:35:35.692816082 +0000 UTC m=+21.030079046" Apr 23 16:35:35.742416 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:35.742336 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-dnws5" podStartSLOduration=2.726583319 podStartE2EDuration="20.742324537s" podCreationTimestamp="2026-04-23 16:35:15 +0000 UTC" firstStartedPulling="2026-04-23 16:35:16.353187156 +0000 UTC m=+1.690450100" lastFinishedPulling="2026-04-23 16:35:34.368928376 +0000 UTC m=+19.706191318" observedRunningTime="2026-04-23 16:35:35.742007807 +0000 UTC m=+21.079270770" watchObservedRunningTime="2026-04-23 16:35:35.742324537 +0000 UTC m=+21.079587494" Apr 23 16:35:36.299880 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:36.299826 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ktsss" Apr 23 16:35:36.299880 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:36.299873 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7n97" Apr 23 16:35:36.300074 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:36.299967 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ktsss" podUID="1ba421e3-97e2-473e-a145-bf072f2b9393" Apr 23 16:35:36.300129 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:36.300094 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k7n97" podUID="1eabe990-f610-4e94-8a89-7cff1c9a6a23" Apr 23 16:35:36.396564 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:36.396539 2569 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 23 16:35:36.555034 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:36.554955 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jlhpn" event={"ID":"6f9db1de-039e-440a-ad1a-5e30719153d9","Type":"ContainerStarted","Data":"ba46c292a77d74de20f1c2fa3bdd7b4def88cd6e521a3000aa360538eb1bd762"} Apr 23 16:35:36.556243 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:36.556217 2569 generic.go:358] "Generic (PLEG): container finished" podID="05adcf3a2d254ff254dca89fd1c3545d" containerID="bb30aab95b296171d77fa48cb8f623491a40d0984917b344cf8b040acfaebcc7" exitCode=0 Apr 23 16:35:36.556361 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:36.556283 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-102.ec2.internal" event={"ID":"05adcf3a2d254ff254dca89fd1c3545d","Type":"ContainerDied","Data":"bb30aab95b296171d77fa48cb8f623491a40d0984917b344cf8b040acfaebcc7"} Apr 23 16:35:36.557581 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:36.557559 2569 generic.go:358] "Generic (PLEG): container finished" podID="64f2e8d8-0a24-4b00-a66e-91dd67594081" containerID="718ac6603f15f0992109247e953414edcba7a9f536e1c8af7d5fedf93c39da2d" exitCode=0 Apr 23 16:35:36.557693 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:36.557626 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z8jnl" event={"ID":"64f2e8d8-0a24-4b00-a66e-91dd67594081","Type":"ContainerDied","Data":"718ac6603f15f0992109247e953414edcba7a9f536e1c8af7d5fedf93c39da2d"} Apr 23 16:35:36.558958 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:36.558939 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-6xw6f" event={"ID":"2c38e0bd-62a0-43a1-b0e2-05e9ca0f084b","Type":"ContainerStarted","Data":"3633b99444cfc0aab58fbfeaf8bdb9582bb2c10a0991c22a69c5571be08ff2b3"} Apr 23 16:35:36.619964 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:36.619909 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-6xw6f" podStartSLOduration=3.902368201 podStartE2EDuration="21.619890652s" podCreationTimestamp="2026-04-23 16:35:15 +0000 UTC" firstStartedPulling="2026-04-23 16:35:16.578047642 +0000 UTC m=+1.915310585" lastFinishedPulling="2026-04-23 16:35:34.29557008 +0000 UTC m=+19.632833036" observedRunningTime="2026-04-23 16:35:36.589983335 +0000 UTC m=+21.927246299" watchObservedRunningTime="2026-04-23 16:35:36.619890652 +0000 UTC m=+21.957153619" Apr 23 16:35:37.235119 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:37.234795 2569 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T16:35:36.396560594Z","UUID":"d8116a1a-1791-4ebc-9d70-2f983230e69f","Handler":null,"Name":"","Endpoint":""} Apr 23 16:35:37.236724 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:37.236701 2569 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: 
/var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 23 16:35:37.236864 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:37.236735 2569 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 23 16:35:37.564286 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:37.564215 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddr2h_a063c6b7-80d4-45d7-815d-88d94693a0b1/ovn-acl-logging/0.log" Apr 23 16:35:37.564969 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:37.564943 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" event={"ID":"a063c6b7-80d4-45d7-815d-88d94693a0b1","Type":"ContainerStarted","Data":"3825e3ac7a16d94b31301fdd027fcdea41dbbe2da353f1a92bc18555208e0417"} Apr 23 16:35:37.567313 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:37.567281 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jlhpn" event={"ID":"6f9db1de-039e-440a-ad1a-5e30719153d9","Type":"ContainerStarted","Data":"547b17a4e6ee138330e49ce06000b95d6f3a17d12db1d3ece1b04830981e8240"} Apr 23 16:35:37.569270 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:37.569249 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-102.ec2.internal" event={"ID":"05adcf3a2d254ff254dca89fd1c3545d","Type":"ContainerStarted","Data":"5eb64b70aead35e3203d47e683ba579ee02a20ccc99dbbb8f3376f93ad20668b"} Apr 23 16:35:37.591812 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:37.591764 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jlhpn" podStartSLOduration=1.888476045 podStartE2EDuration="22.591750306s" podCreationTimestamp="2026-04-23 16:35:15 +0000 UTC" firstStartedPulling="2026-04-23 16:35:16.491530643 +0000 UTC m=+1.828793585" lastFinishedPulling="2026-04-23 16:35:37.194804889 +0000 UTC m=+22.532067846" observedRunningTime="2026-04-23 16:35:37.590506768 +0000 UTC m=+22.927769729" watchObservedRunningTime="2026-04-23 16:35:37.591750306 +0000 UTC m=+22.929013271" Apr 23 16:35:37.617638 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:37.617595 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-102.ec2.internal" podStartSLOduration=21.617584875 podStartE2EDuration="21.617584875s" podCreationTimestamp="2026-04-23 16:35:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:35:37.612125067 +0000 UTC m=+22.949388031" watchObservedRunningTime="2026-04-23 16:35:37.617584875 +0000 UTC m=+22.954847837" Apr 23 16:35:38.136543 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:38.136506 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-dnws5" Apr 23 16:35:38.137201 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:38.137180 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-dnws5" Apr 23 16:35:38.300039 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:38.300000 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ktsss" Apr 23 16:35:38.300207 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:38.300089 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7n97" Apr 23 16:35:38.300286 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:38.300210 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ktsss" podUID="1ba421e3-97e2-473e-a145-bf072f2b9393" Apr 23 16:35:38.300405 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:38.300380 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7n97" podUID="1eabe990-f610-4e94-8a89-7cff1c9a6a23" Apr 23 16:35:38.571714 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:38.571612 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-dnws5" Apr 23 16:35:38.572310 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:38.572200 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-dnws5" Apr 23 16:35:40.299764 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:40.299730 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7n97" Apr 23 16:35:40.300262 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:40.299850 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7n97" podUID="1eabe990-f610-4e94-8a89-7cff1c9a6a23" Apr 23 16:35:40.300262 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:40.299910 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ktsss" Apr 23 16:35:40.300262 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:40.300028 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ktsss" podUID="1ba421e3-97e2-473e-a145-bf072f2b9393" Apr 23 16:35:40.577295 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:40.577103 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddr2h_a063c6b7-80d4-45d7-815d-88d94693a0b1/ovn-acl-logging/0.log" Apr 23 16:35:40.577542 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:40.577518 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" event={"ID":"a063c6b7-80d4-45d7-815d-88d94693a0b1","Type":"ContainerStarted","Data":"f51895847188fa52bb0d1a342e3b3d4e6e5d4ebec61b556fc4817034c705c529"} Apr 23 16:35:40.577920 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:40.577897 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:40.578019 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:40.577932 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:40.578019 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:40.577945 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:40.578142 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:40.578127 2569 scope.go:117] "RemoveContainer" containerID="ace9bf6f86962b273ceb614427fb35a8ad08edd563e861a080f76957c92a5020" Apr 23 16:35:40.592309 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:40.592281 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:40.593745 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:40.593723 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:35:41.582118 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:41.582089 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddr2h_a063c6b7-80d4-45d7-815d-88d94693a0b1/ovn-acl-logging/0.log" Apr 23 16:35:41.582749 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:41.582442 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" event={"ID":"a063c6b7-80d4-45d7-815d-88d94693a0b1","Type":"ContainerStarted","Data":"a9b3193d0a1ae8155fff006cc26e73a272045a221d71e3976eb8cda8de52f83e"} Apr 23 16:35:41.584101 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:41.584080 2569 generic.go:358] "Generic (PLEG): container finished" podID="64f2e8d8-0a24-4b00-a66e-91dd67594081" containerID="10be1a35c564380e110d2382dc10e0b6e3b26e86e377f9f3607847cf0d8eac2e" exitCode=0 Apr 23 16:35:41.584200 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:41.584115 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z8jnl" event={"ID":"64f2e8d8-0a24-4b00-a66e-91dd67594081","Type":"ContainerDied","Data":"10be1a35c564380e110d2382dc10e0b6e3b26e86e377f9f3607847cf0d8eac2e"} Apr 23 16:35:41.635763 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:41.635642 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" podStartSLOduration=8.417434033 podStartE2EDuration="26.635627602s" podCreationTimestamp="2026-04-23 16:35:15 +0000 UTC" firstStartedPulling="2026-04-23 16:35:16.649287273 +0000 UTC m=+1.986550214" 
lastFinishedPulling="2026-04-23 16:35:34.867480839 +0000 UTC m=+20.204743783" observedRunningTime="2026-04-23 16:35:41.634108417 +0000 UTC m=+26.971371418" watchObservedRunningTime="2026-04-23 16:35:41.635627602 +0000 UTC m=+26.972890565" Apr 23 16:35:42.299763 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:42.299737 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ktsss" Apr 23 16:35:42.299905 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:42.299888 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7n97" Apr 23 16:35:42.300000 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:42.299976 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7n97" podUID="1eabe990-f610-4e94-8a89-7cff1c9a6a23" Apr 23 16:35:42.300053 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:42.299881 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ktsss" podUID="1ba421e3-97e2-473e-a145-bf072f2b9393" Apr 23 16:35:42.390940 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:42.390899 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-k7n97"] Apr 23 16:35:42.391630 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:42.391607 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ktsss"] Apr 23 16:35:42.587888 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:42.587675 2569 generic.go:358] "Generic (PLEG): container finished" podID="64f2e8d8-0a24-4b00-a66e-91dd67594081" containerID="6b2287847b9e2604d0f39b13df8fce67386c56f4c521c5b4f28e7c9f6def3623" exitCode=0 Apr 23 16:35:42.588275 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:42.587965 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ktsss" Apr 23 16:35:42.588275 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:42.587753 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z8jnl" event={"ID":"64f2e8d8-0a24-4b00-a66e-91dd67594081","Type":"ContainerDied","Data":"6b2287847b9e2604d0f39b13df8fce67386c56f4c521c5b4f28e7c9f6def3623"} Apr 23 16:35:42.588275 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:42.588047 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ktsss" podUID="1ba421e3-97e2-473e-a145-bf072f2b9393" Apr 23 16:35:42.588275 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:42.588132 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7n97" Apr 23 16:35:42.588978 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:42.588487 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7n97" podUID="1eabe990-f610-4e94-8a89-7cff1c9a6a23" Apr 23 16:35:43.591683 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:43.591637 2569 generic.go:358] "Generic (PLEG): container finished" podID="64f2e8d8-0a24-4b00-a66e-91dd67594081" containerID="250206d27b48f5fdc0049d410cd75ba60b1f1a910069e5f59c4f7a582193cf3e" exitCode=0 Apr 23 16:35:43.592020 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:43.591704 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z8jnl" event={"ID":"64f2e8d8-0a24-4b00-a66e-91dd67594081","Type":"ContainerDied","Data":"250206d27b48f5fdc0049d410cd75ba60b1f1a910069e5f59c4f7a582193cf3e"} Apr 23 16:35:44.299766 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:44.299687 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ktsss" Apr 23 16:35:44.299956 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:44.299692 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7n97" Apr 23 16:35:44.299956 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:44.299805 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ktsss" podUID="1ba421e3-97e2-473e-a145-bf072f2b9393" Apr 23 16:35:44.299956 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:44.299925 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7n97" podUID="1eabe990-f610-4e94-8a89-7cff1c9a6a23" Apr 23 16:35:46.299484 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:46.299448 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ktsss" Apr 23 16:35:46.299484 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:46.299485 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7n97" Apr 23 16:35:46.300486 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:46.299563 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ktsss" podUID="1ba421e3-97e2-473e-a145-bf072f2b9393" Apr 23 16:35:46.300486 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:46.299710 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7n97" podUID="1eabe990-f610-4e94-8a89-7cff1c9a6a23" Apr 23 16:35:47.003952 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:47.003924 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-102.ec2.internal" event="NodeReady" Apr 23 16:35:47.004086 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:47.004039 2569 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 23 16:35:47.074990 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:47.074956 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-jfzk8"] Apr 23 16:35:47.100597 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:47.100477 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-bsgxs"] Apr 23 16:35:47.101308 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:47.100698 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jfzk8" Apr 23 16:35:47.105422 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:47.105402 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-jmk6t\"" Apr 23 16:35:47.105564 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:47.105537 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 23 16:35:47.106184 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:47.106162 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 23 16:35:47.122896 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:47.122873 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jfzk8"] Apr 23 16:35:47.122896 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:47.122898 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bsgxs"] Apr 23 16:35:47.123031 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:47.122987 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bsgxs" Apr 23 16:35:47.126046 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:47.126028 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 23 16:35:47.126193 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:47.126171 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 23 16:35:47.126377 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:47.126360 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 23 16:35:47.126927 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:47.126910 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-s982g\"" Apr 23 16:35:47.226837 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:47.226806 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2c88812-4055-43aa-8e5a-25b432f9041d-config-volume\") pod \"dns-default-jfzk8\" (UID: \"e2c88812-4055-43aa-8e5a-25b432f9041d\") " pod="openshift-dns/dns-default-jfzk8" Apr 23 16:35:47.226837 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:47.226841 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e2c88812-4055-43aa-8e5a-25b432f9041d-metrics-tls\") pod \"dns-default-jfzk8\" (UID: \"e2c88812-4055-43aa-8e5a-25b432f9041d\") " pod="openshift-dns/dns-default-jfzk8" Apr 23 16:35:47.227079 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:47.226859 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e2c88812-4055-43aa-8e5a-25b432f9041d-tmp-dir\") pod \"dns-default-jfzk8\" (UID: \"e2c88812-4055-43aa-8e5a-25b432f9041d\") " pod="openshift-dns/dns-default-jfzk8" Apr 23 16:35:47.227079 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:47.226877 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2nsp\" (UniqueName: \"kubernetes.io/projected/e2c88812-4055-43aa-8e5a-25b432f9041d-kube-api-access-w2nsp\") pod \"dns-default-jfzk8\" (UID: \"e2c88812-4055-43aa-8e5a-25b432f9041d\") " pod="openshift-dns/dns-default-jfzk8" Apr 23 16:35:47.227079 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:47.226965 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jqs7\" (UniqueName: \"kubernetes.io/projected/3bfd1dfe-900e-4260-b0fc-9dc05d2c604c-kube-api-access-4jqs7\") pod \"ingress-canary-bsgxs\" (UID: \"3bfd1dfe-900e-4260-b0fc-9dc05d2c604c\") " pod="openshift-ingress-canary/ingress-canary-bsgxs" Apr 23 16:35:47.227079 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:47.227010 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bfd1dfe-900e-4260-b0fc-9dc05d2c604c-cert\") pod \"ingress-canary-bsgxs\" (UID: \"3bfd1dfe-900e-4260-b0fc-9dc05d2c604c\") " pod="openshift-ingress-canary/ingress-canary-bsgxs" Apr 23 16:35:47.328045 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:47.327977 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2c88812-4055-43aa-8e5a-25b432f9041d-config-volume\") pod \"dns-default-jfzk8\" (UID: \"e2c88812-4055-43aa-8e5a-25b432f9041d\") " pod="openshift-dns/dns-default-jfzk8" Apr 23 16:35:47.328045 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:47.328009 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e2c88812-4055-43aa-8e5a-25b432f9041d-metrics-tls\") pod \"dns-default-jfzk8\" (UID: \"e2c88812-4055-43aa-8e5a-25b432f9041d\") " pod="openshift-dns/dns-default-jfzk8" Apr 23 16:35:47.328045 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:47.328027 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e2c88812-4055-43aa-8e5a-25b432f9041d-tmp-dir\") pod \"dns-default-jfzk8\" (UID: \"e2c88812-4055-43aa-8e5a-25b432f9041d\") " pod="openshift-dns/dns-default-jfzk8" Apr 23 16:35:47.328646 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:47.328054 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w2nsp\" (UniqueName: \"kubernetes.io/projected/e2c88812-4055-43aa-8e5a-25b432f9041d-kube-api-access-w2nsp\") pod \"dns-default-jfzk8\" (UID: \"e2c88812-4055-43aa-8e5a-25b432f9041d\") " pod="openshift-dns/dns-default-jfzk8" Apr 23 16:35:47.328646 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:47.328094 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4jqs7\" (UniqueName: \"kubernetes.io/projected/3bfd1dfe-900e-4260-b0fc-9dc05d2c604c-kube-api-access-4jqs7\") pod \"ingress-canary-bsgxs\" (UID: \"3bfd1dfe-900e-4260-b0fc-9dc05d2c604c\") " pod="openshift-ingress-canary/ingress-canary-bsgxs" Apr 23 16:35:47.328646 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:47.328123 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bfd1dfe-900e-4260-b0fc-9dc05d2c604c-cert\") pod \"ingress-canary-bsgxs\" (UID: \"3bfd1dfe-900e-4260-b0fc-9dc05d2c604c\") " pod="openshift-ingress-canary/ingress-canary-bsgxs" Apr 23 16:35:47.328646 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:47.328184 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 16:35:47.328646 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:47.328230 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 16:35:47.328646 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:47.328240 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2c88812-4055-43aa-8e5a-25b432f9041d-metrics-tls podName:e2c88812-4055-43aa-8e5a-25b432f9041d nodeName:}" failed. No retries permitted until 2026-04-23 16:35:47.828225827 +0000 UTC m=+33.165488770 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e2c88812-4055-43aa-8e5a-25b432f9041d-metrics-tls") pod "dns-default-jfzk8" (UID: "e2c88812-4055-43aa-8e5a-25b432f9041d") : secret "dns-default-metrics-tls" not found Apr 23 16:35:47.328646 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:47.328278 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bfd1dfe-900e-4260-b0fc-9dc05d2c604c-cert podName:3bfd1dfe-900e-4260-b0fc-9dc05d2c604c nodeName:}" failed. No retries permitted until 2026-04-23 16:35:47.828262412 +0000 UTC m=+33.165525356 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3bfd1dfe-900e-4260-b0fc-9dc05d2c604c-cert") pod "ingress-canary-bsgxs" (UID: "3bfd1dfe-900e-4260-b0fc-9dc05d2c604c") : secret "canary-serving-cert" not found Apr 23 16:35:47.328646 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:47.328345 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e2c88812-4055-43aa-8e5a-25b432f9041d-tmp-dir\") pod \"dns-default-jfzk8\" (UID: \"e2c88812-4055-43aa-8e5a-25b432f9041d\") " pod="openshift-dns/dns-default-jfzk8" Apr 23 16:35:47.328927 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:47.328656 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2c88812-4055-43aa-8e5a-25b432f9041d-config-volume\") pod \"dns-default-jfzk8\" (UID: \"e2c88812-4055-43aa-8e5a-25b432f9041d\") " pod="openshift-dns/dns-default-jfzk8" Apr 23 16:35:47.344531 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:47.344509 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2nsp\" (UniqueName: \"kubernetes.io/projected/e2c88812-4055-43aa-8e5a-25b432f9041d-kube-api-access-w2nsp\") pod \"dns-default-jfzk8\" (UID: \"e2c88812-4055-43aa-8e5a-25b432f9041d\") " pod="openshift-dns/dns-default-jfzk8" Apr 23 16:35:47.344690 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:47.344591 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jqs7\" (UniqueName: \"kubernetes.io/projected/3bfd1dfe-900e-4260-b0fc-9dc05d2c604c-kube-api-access-4jqs7\") pod \"ingress-canary-bsgxs\" (UID: \"3bfd1dfe-900e-4260-b0fc-9dc05d2c604c\") " pod="openshift-ingress-canary/ingress-canary-bsgxs" Apr 23 16:35:47.832697 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:47.832645 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e2c88812-4055-43aa-8e5a-25b432f9041d-metrics-tls\") pod \"dns-default-jfzk8\" (UID: \"e2c88812-4055-43aa-8e5a-25b432f9041d\") " pod="openshift-dns/dns-default-jfzk8" Apr 23 16:35:47.832902 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:47.832745 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bfd1dfe-900e-4260-b0fc-9dc05d2c604c-cert\") pod \"ingress-canary-bsgxs\" (UID: \"3bfd1dfe-900e-4260-b0fc-9dc05d2c604c\") " pod="openshift-ingress-canary/ingress-canary-bsgxs" Apr 23 16:35:47.832902 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:47.832810 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 16:35:47.832902 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:47.832882 2569 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/e2c88812-4055-43aa-8e5a-25b432f9041d-metrics-tls podName:e2c88812-4055-43aa-8e5a-25b432f9041d nodeName:}" failed. No retries permitted until 2026-04-23 16:35:48.832862381 +0000 UTC m=+34.170125331 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e2c88812-4055-43aa-8e5a-25b432f9041d-metrics-tls") pod "dns-default-jfzk8" (UID: "e2c88812-4055-43aa-8e5a-25b432f9041d") : secret "dns-default-metrics-tls" not found Apr 23 16:35:47.833090 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:47.832900 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 16:35:47.833090 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:47.832964 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bfd1dfe-900e-4260-b0fc-9dc05d2c604c-cert podName:3bfd1dfe-900e-4260-b0fc-9dc05d2c604c nodeName:}" failed. No retries permitted until 2026-04-23 16:35:48.832948142 +0000 UTC m=+34.170211085 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3bfd1dfe-900e-4260-b0fc-9dc05d2c604c-cert") pod "ingress-canary-bsgxs" (UID: "3bfd1dfe-900e-4260-b0fc-9dc05d2c604c") : secret "canary-serving-cert" not found Apr 23 16:35:47.933130 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:47.933084 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1eabe990-f610-4e94-8a89-7cff1c9a6a23-metrics-certs\") pod \"network-metrics-daemon-k7n97\" (UID: \"1eabe990-f610-4e94-8a89-7cff1c9a6a23\") " pod="openshift-multus/network-metrics-daemon-k7n97" Apr 23 16:35:47.933317 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:47.933244 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:47.933317 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:47.933304 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1eabe990-f610-4e94-8a89-7cff1c9a6a23-metrics-certs podName:1eabe990-f610-4e94-8a89-7cff1c9a6a23 nodeName:}" failed. No retries permitted until 2026-04-23 16:36:19.933286181 +0000 UTC m=+65.270549123 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1eabe990-f610-4e94-8a89-7cff1c9a6a23-metrics-certs") pod "network-metrics-daemon-k7n97" (UID: "1eabe990-f610-4e94-8a89-7cff1c9a6a23") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:48.034576 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:48.034538 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svdfq\" (UniqueName: \"kubernetes.io/projected/1ba421e3-97e2-473e-a145-bf072f2b9393-kube-api-access-svdfq\") pod \"network-check-target-ktsss\" (UID: \"1ba421e3-97e2-473e-a145-bf072f2b9393\") " pod="openshift-network-diagnostics/network-check-target-ktsss" Apr 23 16:35:48.034758 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:48.034742 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:35:48.034806 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:48.034765 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:35:48.034806 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:48.034776 2569 projected.go:194] Error preparing data for projected volume kube-api-access-svdfq for pod openshift-network-diagnostics/network-check-target-ktsss: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:48.034877 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:48.034829 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1ba421e3-97e2-473e-a145-bf072f2b9393-kube-api-access-svdfq podName:1ba421e3-97e2-473e-a145-bf072f2b9393 nodeName:}" failed. No retries permitted until 2026-04-23 16:36:20.034813832 +0000 UTC m=+65.372076774 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-svdfq" (UniqueName: "kubernetes.io/projected/1ba421e3-97e2-473e-a145-bf072f2b9393-kube-api-access-svdfq") pod "network-check-target-ktsss" (UID: "1ba421e3-97e2-473e-a145-bf072f2b9393") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:48.300016 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:48.299976 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ktsss" Apr 23 16:35:48.300284 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:48.300173 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7n97" Apr 23 16:35:48.302289 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:48.302265 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 16:35:48.302435 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:48.302321 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 16:35:48.302898 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:48.302689 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-f89h4\"" Apr 23 16:35:48.302898 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:48.302711 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 16:35:48.302898 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:48.302811 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-6xqf4\"" Apr 23 16:35:48.840135 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:48.840099 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bfd1dfe-900e-4260-b0fc-9dc05d2c604c-cert\") pod \"ingress-canary-bsgxs\" (UID: \"3bfd1dfe-900e-4260-b0fc-9dc05d2c604c\") " pod="openshift-ingress-canary/ingress-canary-bsgxs" Apr 23 16:35:48.840813 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:48.840279 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 16:35:48.840813 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:48.840295 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e2c88812-4055-43aa-8e5a-25b432f9041d-metrics-tls\") pod \"dns-default-jfzk8\" (UID: \"e2c88812-4055-43aa-8e5a-25b432f9041d\") " pod="openshift-dns/dns-default-jfzk8" Apr 23 16:35:48.840813 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:48.840356 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bfd1dfe-900e-4260-b0fc-9dc05d2c604c-cert podName:3bfd1dfe-900e-4260-b0fc-9dc05d2c604c nodeName:}" failed. No retries permitted until 2026-04-23 16:35:50.840333767 +0000 UTC m=+36.177596713 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3bfd1dfe-900e-4260-b0fc-9dc05d2c604c-cert") pod "ingress-canary-bsgxs" (UID: "3bfd1dfe-900e-4260-b0fc-9dc05d2c604c") : secret "canary-serving-cert" not found Apr 23 16:35:48.840813 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:48.840405 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 16:35:48.840813 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:48.840456 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2c88812-4055-43aa-8e5a-25b432f9041d-metrics-tls podName:e2c88812-4055-43aa-8e5a-25b432f9041d nodeName:}" failed. No retries permitted until 2026-04-23 16:35:50.84043972 +0000 UTC m=+36.177702679 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e2c88812-4055-43aa-8e5a-25b432f9041d-metrics-tls") pod "dns-default-jfzk8" (UID: "e2c88812-4055-43aa-8e5a-25b432f9041d") : secret "dns-default-metrics-tls" not found Apr 23 16:35:50.608164 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:50.607993 2569 generic.go:358] "Generic (PLEG): container finished" podID="64f2e8d8-0a24-4b00-a66e-91dd67594081" containerID="2744dc2ca3910266d4891c85c17323e3b763ddd9454b8ceca00968f703ef68f9" exitCode=0 Apr 23 16:35:50.608164 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:50.608076 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z8jnl" event={"ID":"64f2e8d8-0a24-4b00-a66e-91dd67594081","Type":"ContainerDied","Data":"2744dc2ca3910266d4891c85c17323e3b763ddd9454b8ceca00968f703ef68f9"} Apr 23 16:35:50.855973 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:50.855946 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e2c88812-4055-43aa-8e5a-25b432f9041d-metrics-tls\") pod \"dns-default-jfzk8\" (UID: \"e2c88812-4055-43aa-8e5a-25b432f9041d\") " pod="openshift-dns/dns-default-jfzk8" Apr 23 16:35:50.856116 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:50.856009 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bfd1dfe-900e-4260-b0fc-9dc05d2c604c-cert\") pod \"ingress-canary-bsgxs\" (UID: \"3bfd1dfe-900e-4260-b0fc-9dc05d2c604c\") " pod="openshift-ingress-canary/ingress-canary-bsgxs" Apr 23 16:35:50.856116 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:50.856097 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 16:35:50.856212 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:50.856127 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 16:35:50.856212 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:50.856160 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2c88812-4055-43aa-8e5a-25b432f9041d-metrics-tls podName:e2c88812-4055-43aa-8e5a-25b432f9041d nodeName:}" failed. No retries permitted until 2026-04-23 16:35:54.856144093 +0000 UTC m=+40.193407034 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e2c88812-4055-43aa-8e5a-25b432f9041d-metrics-tls") pod "dns-default-jfzk8" (UID: "e2c88812-4055-43aa-8e5a-25b432f9041d") : secret "dns-default-metrics-tls" not found Apr 23 16:35:50.856212 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:50.856174 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bfd1dfe-900e-4260-b0fc-9dc05d2c604c-cert podName:3bfd1dfe-900e-4260-b0fc-9dc05d2c604c nodeName:}" failed. No retries permitted until 2026-04-23 16:35:54.856167832 +0000 UTC m=+40.193430774 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3bfd1dfe-900e-4260-b0fc-9dc05d2c604c-cert") pod "ingress-canary-bsgxs" (UID: "3bfd1dfe-900e-4260-b0fc-9dc05d2c604c") : secret "canary-serving-cert" not found Apr 23 16:35:51.612694 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:51.612644 2569 generic.go:358] "Generic (PLEG): container finished" podID="64f2e8d8-0a24-4b00-a66e-91dd67594081" containerID="fe0346e0bb27d060c19ebe213879070f5701e0ff867f4a9bc5837ef085bc0961" exitCode=0 Apr 23 16:35:51.613045 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:51.612708 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z8jnl" event={"ID":"64f2e8d8-0a24-4b00-a66e-91dd67594081","Type":"ContainerDied","Data":"fe0346e0bb27d060c19ebe213879070f5701e0ff867f4a9bc5837ef085bc0961"} Apr 23 16:35:52.619704 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:52.619653 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z8jnl" event={"ID":"64f2e8d8-0a24-4b00-a66e-91dd67594081","Type":"ContainerStarted","Data":"837189a16be0c670083b02a40cdd12af4bfc76305656c579a9619c56e3a44fde"} Apr 23 16:35:52.644789 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:52.644716 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-z8jnl" podStartSLOduration=4.656944542 podStartE2EDuration="37.644705544s" podCreationTimestamp="2026-04-23 16:35:15 +0000 UTC" firstStartedPulling="2026-04-23 16:35:16.593240302 +0000 UTC m=+1.930503244" lastFinishedPulling="2026-04-23 16:35:49.5810013 +0000 UTC m=+34.918264246" observedRunningTime="2026-04-23 16:35:52.643858451 +0000 UTC m=+37.981121413" watchObservedRunningTime="2026-04-23 16:35:52.644705544 +0000 UTC m=+37.981968508" Apr 23 16:35:54.890583 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:54.890537 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e2c88812-4055-43aa-8e5a-25b432f9041d-metrics-tls\") pod \"dns-default-jfzk8\" (UID: \"e2c88812-4055-43aa-8e5a-25b432f9041d\") " pod="openshift-dns/dns-default-jfzk8" Apr 23 16:35:54.890583 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:35:54.890599 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bfd1dfe-900e-4260-b0fc-9dc05d2c604c-cert\") pod \"ingress-canary-bsgxs\" (UID: \"3bfd1dfe-900e-4260-b0fc-9dc05d2c604c\") " pod="openshift-ingress-canary/ingress-canary-bsgxs" Apr 23 16:35:54.891079 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:54.890701 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 16:35:54.891079 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:54.890763 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2c88812-4055-43aa-8e5a-25b432f9041d-metrics-tls podName:e2c88812-4055-43aa-8e5a-25b432f9041d nodeName:}" failed. No retries permitted until 2026-04-23 16:36:02.890747871 +0000 UTC m=+48.228010813 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e2c88812-4055-43aa-8e5a-25b432f9041d-metrics-tls") pod "dns-default-jfzk8" (UID: "e2c88812-4055-43aa-8e5a-25b432f9041d") : secret "dns-default-metrics-tls" not found Apr 23 16:35:54.891079 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:54.890708 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 16:35:54.891079 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:35:54.890835 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bfd1dfe-900e-4260-b0fc-9dc05d2c604c-cert podName:3bfd1dfe-900e-4260-b0fc-9dc05d2c604c nodeName:}" failed. No retries permitted until 2026-04-23 16:36:02.890824293 +0000 UTC m=+48.228087239 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3bfd1dfe-900e-4260-b0fc-9dc05d2c604c-cert") pod "ingress-canary-bsgxs" (UID: "3bfd1dfe-900e-4260-b0fc-9dc05d2c604c") : secret "canary-serving-cert" not found Apr 23 16:36:02.946847 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:36:02.946803 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bfd1dfe-900e-4260-b0fc-9dc05d2c604c-cert\") pod \"ingress-canary-bsgxs\" (UID: \"3bfd1dfe-900e-4260-b0fc-9dc05d2c604c\") " pod="openshift-ingress-canary/ingress-canary-bsgxs" Apr 23 16:36:02.947280 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:36:02.946871 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e2c88812-4055-43aa-8e5a-25b432f9041d-metrics-tls\") pod \"dns-default-jfzk8\" (UID: \"e2c88812-4055-43aa-8e5a-25b432f9041d\") " pod="openshift-dns/dns-default-jfzk8" Apr 23 16:36:02.947280 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:36:02.946956 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 16:36:02.947280 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:36:02.947030 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bfd1dfe-900e-4260-b0fc-9dc05d2c604c-cert podName:3bfd1dfe-900e-4260-b0fc-9dc05d2c604c nodeName:}" failed. No retries permitted until 2026-04-23 16:36:18.947013705 +0000 UTC m=+64.284276646 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3bfd1dfe-900e-4260-b0fc-9dc05d2c604c-cert") pod "ingress-canary-bsgxs" (UID: "3bfd1dfe-900e-4260-b0fc-9dc05d2c604c") : secret "canary-serving-cert" not found Apr 23 16:36:02.947280 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:36:02.946963 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 16:36:02.947280 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:36:02.947107 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2c88812-4055-43aa-8e5a-25b432f9041d-metrics-tls podName:e2c88812-4055-43aa-8e5a-25b432f9041d nodeName:}" failed. No retries permitted until 2026-04-23 16:36:18.947093455 +0000 UTC m=+64.284356397 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e2c88812-4055-43aa-8e5a-25b432f9041d-metrics-tls") pod "dns-default-jfzk8" (UID: "e2c88812-4055-43aa-8e5a-25b432f9041d") : secret "dns-default-metrics-tls" not found Apr 23 16:36:12.603623 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:36:12.603593 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ddr2h" Apr 23 16:36:18.949827 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:36:18.949774 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bfd1dfe-900e-4260-b0fc-9dc05d2c604c-cert\") pod \"ingress-canary-bsgxs\" (UID: \"3bfd1dfe-900e-4260-b0fc-9dc05d2c604c\") " pod="openshift-ingress-canary/ingress-canary-bsgxs" Apr 23 16:36:18.950375 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:36:18.949868 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e2c88812-4055-43aa-8e5a-25b432f9041d-metrics-tls\") pod \"dns-default-jfzk8\" (UID: \"e2c88812-4055-43aa-8e5a-25b432f9041d\") " pod="openshift-dns/dns-default-jfzk8" Apr 23 16:36:18.950375 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:36:18.949936 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 16:36:18.950375 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:36:18.949952 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 16:36:18.950375 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:36:18.950013 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bfd1dfe-900e-4260-b0fc-9dc05d2c604c-cert podName:3bfd1dfe-900e-4260-b0fc-9dc05d2c604c nodeName:}" failed. No retries permitted until 2026-04-23 16:36:50.949999337 +0000 UTC m=+96.287262283 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3bfd1dfe-900e-4260-b0fc-9dc05d2c604c-cert") pod "ingress-canary-bsgxs" (UID: "3bfd1dfe-900e-4260-b0fc-9dc05d2c604c") : secret "canary-serving-cert" not found Apr 23 16:36:18.950375 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:36:18.950029 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2c88812-4055-43aa-8e5a-25b432f9041d-metrics-tls podName:e2c88812-4055-43aa-8e5a-25b432f9041d nodeName:}" failed. No retries permitted until 2026-04-23 16:36:50.95002234 +0000 UTC m=+96.287285282 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e2c88812-4055-43aa-8e5a-25b432f9041d-metrics-tls") pod "dns-default-jfzk8" (UID: "e2c88812-4055-43aa-8e5a-25b432f9041d") : secret "dns-default-metrics-tls" not found Apr 23 16:36:19.956010 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:36:19.955974 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1eabe990-f610-4e94-8a89-7cff1c9a6a23-metrics-certs\") pod \"network-metrics-daemon-k7n97\" (UID: \"1eabe990-f610-4e94-8a89-7cff1c9a6a23\") " pod="openshift-multus/network-metrics-daemon-k7n97" Apr 23 16:36:19.958263 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:36:19.958243 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 16:36:19.966710 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:36:19.966696 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 16:36:19.966767 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:36:19.966745 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1eabe990-f610-4e94-8a89-7cff1c9a6a23-metrics-certs podName:1eabe990-f610-4e94-8a89-7cff1c9a6a23 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:23.966730429 +0000 UTC m=+129.303993375 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1eabe990-f610-4e94-8a89-7cff1c9a6a23-metrics-certs") pod "network-metrics-daemon-k7n97" (UID: "1eabe990-f610-4e94-8a89-7cff1c9a6a23") : secret "metrics-daemon-secret" not found Apr 23 16:36:20.056953 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:36:20.056915 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svdfq\" (UniqueName: \"kubernetes.io/projected/1ba421e3-97e2-473e-a145-bf072f2b9393-kube-api-access-svdfq\") pod \"network-check-target-ktsss\" (UID: \"1ba421e3-97e2-473e-a145-bf072f2b9393\") " pod="openshift-network-diagnostics/network-check-target-ktsss" Apr 23 16:36:20.059114 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:36:20.059093 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 16:36:20.071589 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:36:20.071565 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 16:36:20.081445 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:36:20.081424 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-svdfq\" (UniqueName: \"kubernetes.io/projected/1ba421e3-97e2-473e-a145-bf072f2b9393-kube-api-access-svdfq\") pod \"network-check-target-ktsss\" (UID: \"1ba421e3-97e2-473e-a145-bf072f2b9393\") " pod="openshift-network-diagnostics/network-check-target-ktsss" Apr 23 16:36:20.115339 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:36:20.115314 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-6xqf4\"" Apr 23 16:36:20.124088 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:36:20.124070 2569 util.go:30] "No sandbox for pod can be found. 
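Every "Couldn't get secret ... not found" entry above is the kubelet re-running a plain Secret GET against the API server until the owning operator creates the object. A minimal client-go sketch of the same check, assuming a kubeconfig at the default path; the namespace and name are copied from the metrics-daemon-secret entry above:

```go
package main

import (
	"context"
	"fmt"

	apierrors "k8s.io/apimachinery/pkg/api/errors"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumes a kubeconfig at the default location (~/.kube/config).
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	// The same lookup the kubelet keeps failing in the entries above.
	_, err = client.CoreV1().Secrets("openshift-multus").
		Get(context.Background(), "metrics-daemon-secret", metav1.GetOptions{})
	switch {
	case apierrors.IsNotFound(err):
		fmt.Println("secret not created yet; mount retries will keep failing")
	case err != nil:
		panic(err)
	default:
		fmt.Println("secret exists; the next mount retry should succeed")
	}
}
```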
Apr 23 16:36:20.288859 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:36:20.288829 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ktsss"]
Apr 23 16:36:20.292196 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:36:20.292165 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ba421e3_97e2_473e_a145_bf072f2b9393.slice/crio-cf960f7646ac5453d07f4dcc8fc8080a595328cbeb8aa4afddfee00f70d7050b WatchSource:0}: Error finding container cf960f7646ac5453d07f4dcc8fc8080a595328cbeb8aa4afddfee00f70d7050b: Status 404 returned error can't find the container with id cf960f7646ac5453d07f4dcc8fc8080a595328cbeb8aa4afddfee00f70d7050b
Apr 23 16:36:20.671682 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:36:20.671631 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ktsss" event={"ID":"1ba421e3-97e2-473e-a145-bf072f2b9393","Type":"ContainerStarted","Data":"cf960f7646ac5453d07f4dcc8fc8080a595328cbeb8aa4afddfee00f70d7050b"}
Apr 23 16:36:23.679071 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:36:23.679031 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ktsss" event={"ID":"1ba421e3-97e2-473e-a145-bf072f2b9393","Type":"ContainerStarted","Data":"9c19c136e4689fefc14a7787d272a6e97ff040ee5c5f65d6dd2b40bb4d2a5f61"}
Apr 23 16:36:23.679540 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:36:23.679153 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-ktsss"
Apr 23 16:36:23.693833 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:36:23.693791 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-ktsss" podStartSLOduration=65.932678156 podStartE2EDuration="1m8.693779231s" podCreationTimestamp="2026-04-23 16:35:15 +0000 UTC" firstStartedPulling="2026-04-23 16:36:20.294468967 +0000 UTC m=+65.631731914" lastFinishedPulling="2026-04-23 16:36:23.055570033 +0000 UTC m=+68.392832989" observedRunningTime="2026-04-23 16:36:23.693205637 +0000 UTC m=+69.030468600" watchObservedRunningTime="2026-04-23 16:36:23.693779231 +0000 UTC m=+69.031042195"
Apr 23 16:36:50.962757 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:36:50.962631 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e2c88812-4055-43aa-8e5a-25b432f9041d-metrics-tls\") pod \"dns-default-jfzk8\" (UID: \"e2c88812-4055-43aa-8e5a-25b432f9041d\") " pod="openshift-dns/dns-default-jfzk8"
Apr 23 16:36:50.962757 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:36:50.962713 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bfd1dfe-900e-4260-b0fc-9dc05d2c604c-cert\") pod \"ingress-canary-bsgxs\" (UID: \"3bfd1dfe-900e-4260-b0fc-9dc05d2c604c\") " pod="openshift-ingress-canary/ingress-canary-bsgxs"
Apr 23 16:36:50.963266 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:36:50.962807 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 16:36:50.963266 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:36:50.962861 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bfd1dfe-900e-4260-b0fc-9dc05d2c604c-cert podName:3bfd1dfe-900e-4260-b0fc-9dc05d2c604c nodeName:}" failed. No retries permitted until 2026-04-23 16:37:54.962847833 +0000 UTC m=+160.300110775 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3bfd1dfe-900e-4260-b0fc-9dc05d2c604c-cert") pod "ingress-canary-bsgxs" (UID: "3bfd1dfe-900e-4260-b0fc-9dc05d2c604c") : secret "canary-serving-cert" not found
Apr 23 16:36:50.963266 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:36:50.962809 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 16:36:50.963266 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:36:50.962958 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2c88812-4055-43aa-8e5a-25b432f9041d-metrics-tls podName:e2c88812-4055-43aa-8e5a-25b432f9041d nodeName:}" failed. No retries permitted until 2026-04-23 16:37:54.962942472 +0000 UTC m=+160.300205430 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e2c88812-4055-43aa-8e5a-25b432f9041d-metrics-tls") pod "dns-default-jfzk8" (UID: "e2c88812-4055-43aa-8e5a-25b432f9041d") : secret "dns-default-metrics-tls" not found
Apr 23 16:36:54.683277 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:36:54.683244 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-ktsss"
Apr 23 16:37:12.735715 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.735683 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-hbwp6"]
Apr 23 16:37:12.738502 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.738486 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-hbwp6"
Apr 23 16:37:12.740546 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.740520 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 23 16:37:12.740887 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.740857 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 23 16:37:12.740982 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.740896 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 23 16:37:12.740982 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.740923 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 23 16:37:12.740982 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.740926 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-6srrb\""
Apr 23 16:37:12.745963 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.745948 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 23 16:37:12.748215 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.748194 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-hbwp6"]
Apr 23 16:37:12.801407 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.801380 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzcff\" (UniqueName: \"kubernetes.io/projected/5e7b64cf-d8ab-48a3-86f5-9ea5db912782-kube-api-access-nzcff\") pod \"insights-operator-585dfdc468-hbwp6\" (UID: \"5e7b64cf-d8ab-48a3-86f5-9ea5db912782\") " pod="openshift-insights/insights-operator-585dfdc468-hbwp6"
Apr 23 16:37:12.801536 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.801415 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/5e7b64cf-d8ab-48a3-86f5-9ea5db912782-snapshots\") pod \"insights-operator-585dfdc468-hbwp6\" (UID: \"5e7b64cf-d8ab-48a3-86f5-9ea5db912782\") " pod="openshift-insights/insights-operator-585dfdc468-hbwp6"
Apr 23 16:37:12.801536 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.801489 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e7b64cf-d8ab-48a3-86f5-9ea5db912782-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-hbwp6\" (UID: \"5e7b64cf-d8ab-48a3-86f5-9ea5db912782\") " pod="openshift-insights/insights-operator-585dfdc468-hbwp6"
Apr 23 16:37:12.801650 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.801536 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5e7b64cf-d8ab-48a3-86f5-9ea5db912782-tmp\") pod \"insights-operator-585dfdc468-hbwp6\" (UID: \"5e7b64cf-d8ab-48a3-86f5-9ea5db912782\") " pod="openshift-insights/insights-operator-585dfdc468-hbwp6"
Apr 23 16:37:12.801650 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.801567 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e7b64cf-d8ab-48a3-86f5-9ea5db912782-service-ca-bundle\") pod \"insights-operator-585dfdc468-hbwp6\" (UID: \"5e7b64cf-d8ab-48a3-86f5-9ea5db912782\") " pod="openshift-insights/insights-operator-585dfdc468-hbwp6"
Apr 23 16:37:12.801650 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.801595 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e7b64cf-d8ab-48a3-86f5-9ea5db912782-serving-cert\") pod \"insights-operator-585dfdc468-hbwp6\" (UID: \"5e7b64cf-d8ab-48a3-86f5-9ea5db912782\") " pod="openshift-insights/insights-operator-585dfdc468-hbwp6"
Apr 23 16:37:12.837137 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.837106 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g5f4s"]
Apr 23 16:37:12.839785 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.839770 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g5f4s"
Apr 23 16:37:12.841769 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.841737 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 23 16:37:12.841769 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.841743 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 23 16:37:12.842179 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.842163 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-fmvtm\""
Apr 23 16:37:12.848520 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.848502 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-gfsdv"]
Apr 23 16:37:12.851273 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.851256 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-gfsdv"
Apr 23 16:37:12.852990 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.852946 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g5f4s"]
Apr 23 16:37:12.853247 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.853230 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 23 16:37:12.853572 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.853554 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-sq9b2\""
Apr 23 16:37:12.853809 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.853791 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 23 16:37:12.854831 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.854813 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 23 16:37:12.855106 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.855086 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 23 16:37:12.860071 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.860054 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 23 16:37:12.880720 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.880702 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-gfsdv"]
Apr 23 16:37:12.901942 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.901917 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nzcff\" (UniqueName: \"kubernetes.io/projected/5e7b64cf-d8ab-48a3-86f5-9ea5db912782-kube-api-access-nzcff\") pod \"insights-operator-585dfdc468-hbwp6\" (UID: \"5e7b64cf-d8ab-48a3-86f5-9ea5db912782\") " pod="openshift-insights/insights-operator-585dfdc468-hbwp6"
Apr 23 16:37:12.902030 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.901954 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/5e7b64cf-d8ab-48a3-86f5-9ea5db912782-snapshots\") pod \"insights-operator-585dfdc468-hbwp6\" (UID: \"5e7b64cf-d8ab-48a3-86f5-9ea5db912782\") " pod="openshift-insights/insights-operator-585dfdc468-hbwp6"
Apr 23 16:37:12.902030 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.901989 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e7b64cf-d8ab-48a3-86f5-9ea5db912782-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-hbwp6\" (UID: \"5e7b64cf-d8ab-48a3-86f5-9ea5db912782\") " pod="openshift-insights/insights-operator-585dfdc468-hbwp6"
Apr 23 16:37:12.902112 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.902039 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl252\" (UniqueName: \"kubernetes.io/projected/7ad7b86f-e66c-4d5e-9d95-ce20650adeba-kube-api-access-sl252\") pod \"volume-data-source-validator-7c6cbb6c87-g5f4s\" (UID: \"7ad7b86f-e66c-4d5e-9d95-ce20650adeba\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g5f4s"
Apr 23 16:37:12.902112 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.902074 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5e7b64cf-d8ab-48a3-86f5-9ea5db912782-tmp\") pod \"insights-operator-585dfdc468-hbwp6\" (UID: \"5e7b64cf-d8ab-48a3-86f5-9ea5db912782\") " pod="openshift-insights/insights-operator-585dfdc468-hbwp6"
Apr 23 16:37:12.902112 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.902097 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e7b64cf-d8ab-48a3-86f5-9ea5db912782-service-ca-bundle\") pod \"insights-operator-585dfdc468-hbwp6\" (UID: \"5e7b64cf-d8ab-48a3-86f5-9ea5db912782\") " pod="openshift-insights/insights-operator-585dfdc468-hbwp6"
Apr 23 16:37:12.902253 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.902120 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e7b64cf-d8ab-48a3-86f5-9ea5db912782-serving-cert\") pod \"insights-operator-585dfdc468-hbwp6\" (UID: \"5e7b64cf-d8ab-48a3-86f5-9ea5db912782\") " pod="openshift-insights/insights-operator-585dfdc468-hbwp6"
Apr 23 16:37:12.902630 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.902606 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5e7b64cf-d8ab-48a3-86f5-9ea5db912782-tmp\") pod \"insights-operator-585dfdc468-hbwp6\" (UID: \"5e7b64cf-d8ab-48a3-86f5-9ea5db912782\") " pod="openshift-insights/insights-operator-585dfdc468-hbwp6"
Apr 23 16:37:12.902722 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.902639 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/5e7b64cf-d8ab-48a3-86f5-9ea5db912782-snapshots\") pod \"insights-operator-585dfdc468-hbwp6\" (UID: \"5e7b64cf-d8ab-48a3-86f5-9ea5db912782\") " pod="openshift-insights/insights-operator-585dfdc468-hbwp6"
Apr 23 16:37:12.902722 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.902704 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e7b64cf-d8ab-48a3-86f5-9ea5db912782-service-ca-bundle\") pod \"insights-operator-585dfdc468-hbwp6\" (UID: \"5e7b64cf-d8ab-48a3-86f5-9ea5db912782\") " pod="openshift-insights/insights-operator-585dfdc468-hbwp6"
Apr 23 16:37:12.902791 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.902769 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e7b64cf-d8ab-48a3-86f5-9ea5db912782-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-hbwp6\" (UID: \"5e7b64cf-d8ab-48a3-86f5-9ea5db912782\") " pod="openshift-insights/insights-operator-585dfdc468-hbwp6"
Apr 23 16:37:12.904372 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.904356 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e7b64cf-d8ab-48a3-86f5-9ea5db912782-serving-cert\") pod \"insights-operator-585dfdc468-hbwp6\" (UID: \"5e7b64cf-d8ab-48a3-86f5-9ea5db912782\") " pod="openshift-insights/insights-operator-585dfdc468-hbwp6"
Apr 23 16:37:12.924365 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.924338 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzcff\" (UniqueName: \"kubernetes.io/projected/5e7b64cf-d8ab-48a3-86f5-9ea5db912782-kube-api-access-nzcff\") pod \"insights-operator-585dfdc468-hbwp6\" (UID: \"5e7b64cf-d8ab-48a3-86f5-9ea5db912782\") " pod="openshift-insights/insights-operator-585dfdc468-hbwp6"
Apr 23 16:37:12.977478 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.977449 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-b88c6764-pc8q7"]
Apr 23 16:37:12.980445 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.980431 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-b88c6764-pc8q7"
Apr 23 16:37:12.982796 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.982770 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 23 16:37:12.982895 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.982869 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 23 16:37:12.982989 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.982974 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 23 16:37:12.983168 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.983153 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 23 16:37:12.983439 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.983418 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 23 16:37:12.984065 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.984048 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 23 16:37:12.984751 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:12.984732 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-lw29q\""
Apr 23 16:37:13.001910 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.001889 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-b88c6764-pc8q7"]
Apr 23 16:37:13.003147 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.003130 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3481652d-e5fb-498e-84c3-e2c629340367-trusted-ca\") pod \"console-operator-9d4b6777b-gfsdv\" (UID: \"3481652d-e5fb-498e-84c3-e2c629340367\") " pod="openshift-console-operator/console-operator-9d4b6777b-gfsdv"
Apr 23 16:37:13.003199 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.003157 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3481652d-e5fb-498e-84c3-e2c629340367-config\") pod \"console-operator-9d4b6777b-gfsdv\" (UID: \"3481652d-e5fb-498e-84c3-e2c629340367\") " pod="openshift-console-operator/console-operator-9d4b6777b-gfsdv"
Apr 23 16:37:13.003199 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.003184 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnkjn\" (UniqueName: \"kubernetes.io/projected/3481652d-e5fb-498e-84c3-e2c629340367-kube-api-access-mnkjn\") pod \"console-operator-9d4b6777b-gfsdv\" (UID: \"3481652d-e5fb-498e-84c3-e2c629340367\") " pod="openshift-console-operator/console-operator-9d4b6777b-gfsdv"
Apr 23 16:37:13.003269 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.003254 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sl252\" (UniqueName: \"kubernetes.io/projected/7ad7b86f-e66c-4d5e-9d95-ce20650adeba-kube-api-access-sl252\") pod \"volume-data-source-validator-7c6cbb6c87-g5f4s\" (UID: \"7ad7b86f-e66c-4d5e-9d95-ce20650adeba\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g5f4s"
Apr 23 16:37:13.003303 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.003282 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3481652d-e5fb-498e-84c3-e2c629340367-serving-cert\") pod \"console-operator-9d4b6777b-gfsdv\" (UID: \"3481652d-e5fb-498e-84c3-e2c629340367\") " pod="openshift-console-operator/console-operator-9d4b6777b-gfsdv"
Apr 23 16:37:13.011445 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.011425 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl252\" (UniqueName: \"kubernetes.io/projected/7ad7b86f-e66c-4d5e-9d95-ce20650adeba-kube-api-access-sl252\") pod \"volume-data-source-validator-7c6cbb6c87-g5f4s\" (UID: \"7ad7b86f-e66c-4d5e-9d95-ce20650adeba\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g5f4s"
Apr 23 16:37:13.048642 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.048612 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-hbwp6"
Apr 23 16:37:13.106683 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.105008 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3481652d-e5fb-498e-84c3-e2c629340367-trusted-ca\") pod \"console-operator-9d4b6777b-gfsdv\" (UID: \"3481652d-e5fb-498e-84c3-e2c629340367\") " pod="openshift-console-operator/console-operator-9d4b6777b-gfsdv"
Apr 23 16:37:13.106683 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.105056 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3481652d-e5fb-498e-84c3-e2c629340367-config\") pod \"console-operator-9d4b6777b-gfsdv\" (UID: \"3481652d-e5fb-498e-84c3-e2c629340367\") " pod="openshift-console-operator/console-operator-9d4b6777b-gfsdv"
Apr 23 16:37:13.106683 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.105097 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4983d124-dedd-4eec-8bdd-7d87844e7eaf-service-ca-bundle\") pod \"router-default-b88c6764-pc8q7\" (UID: \"4983d124-dedd-4eec-8bdd-7d87844e7eaf\") " pod="openshift-ingress/router-default-b88c6764-pc8q7"
Apr 23 16:37:13.106683 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.105130 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4983d124-dedd-4eec-8bdd-7d87844e7eaf-default-certificate\") pod \"router-default-b88c6764-pc8q7\" (UID: \"4983d124-dedd-4eec-8bdd-7d87844e7eaf\") " pod="openshift-ingress/router-default-b88c6764-pc8q7"
Apr 23 16:37:13.106683 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.105158 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlcxf\" (UniqueName: \"kubernetes.io/projected/4983d124-dedd-4eec-8bdd-7d87844e7eaf-kube-api-access-vlcxf\") pod \"router-default-b88c6764-pc8q7\" (UID: \"4983d124-dedd-4eec-8bdd-7d87844e7eaf\") " pod="openshift-ingress/router-default-b88c6764-pc8q7"
Apr 23 16:37:13.106683 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.105195 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mnkjn\" (UniqueName: \"kubernetes.io/projected/3481652d-e5fb-498e-84c3-e2c629340367-kube-api-access-mnkjn\") pod \"console-operator-9d4b6777b-gfsdv\" (UID: \"3481652d-e5fb-498e-84c3-e2c629340367\") " pod="openshift-console-operator/console-operator-9d4b6777b-gfsdv"
Apr 23 16:37:13.106683 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.105222 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4983d124-dedd-4eec-8bdd-7d87844e7eaf-stats-auth\") pod \"router-default-b88c6764-pc8q7\" (UID: \"4983d124-dedd-4eec-8bdd-7d87844e7eaf\") " pod="openshift-ingress/router-default-b88c6764-pc8q7"
Apr 23 16:37:13.106683 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.105246 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4983d124-dedd-4eec-8bdd-7d87844e7eaf-metrics-certs\") pod \"router-default-b88c6764-pc8q7\" (UID: \"4983d124-dedd-4eec-8bdd-7d87844e7eaf\") " pod="openshift-ingress/router-default-b88c6764-pc8q7"
Apr 23 16:37:13.106683 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.105304 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3481652d-e5fb-498e-84c3-e2c629340367-serving-cert\") pod \"console-operator-9d4b6777b-gfsdv\" (UID: \"3481652d-e5fb-498e-84c3-e2c629340367\") " pod="openshift-console-operator/console-operator-9d4b6777b-gfsdv"
Apr 23 16:37:13.107233 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.106814 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3481652d-e5fb-498e-84c3-e2c629340367-trusted-ca\") pod \"console-operator-9d4b6777b-gfsdv\" (UID: \"3481652d-e5fb-498e-84c3-e2c629340367\") " pod="openshift-console-operator/console-operator-9d4b6777b-gfsdv"
Apr 23 16:37:13.108712 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.107294 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3481652d-e5fb-498e-84c3-e2c629340367-config\") pod \"console-operator-9d4b6777b-gfsdv\" (UID: \"3481652d-e5fb-498e-84c3-e2c629340367\") " pod="openshift-console-operator/console-operator-9d4b6777b-gfsdv"
Apr 23 16:37:13.109780 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.109731 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3481652d-e5fb-498e-84c3-e2c629340367-serving-cert\") pod \"console-operator-9d4b6777b-gfsdv\" (UID: \"3481652d-e5fb-498e-84c3-e2c629340367\") " pod="openshift-console-operator/console-operator-9d4b6777b-gfsdv"
Apr 23 16:37:13.119235 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.119180 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnkjn\" (UniqueName: \"kubernetes.io/projected/3481652d-e5fb-498e-84c3-e2c629340367-kube-api-access-mnkjn\") pod \"console-operator-9d4b6777b-gfsdv\" (UID: \"3481652d-e5fb-498e-84c3-e2c629340367\") " pod="openshift-console-operator/console-operator-9d4b6777b-gfsdv"
Apr 23 16:37:13.148296 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.148220 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g5f4s"
Apr 23 16:37:13.160349 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.160327 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-gfsdv"
Apr 23 16:37:13.187327 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.187293 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-hbwp6"]
Apr 23 16:37:13.192143 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:37:13.192107 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e7b64cf_d8ab_48a3_86f5_9ea5db912782.slice/crio-e2c6197914fd78ec6a56cc28ff232b759cd8e215d9bb998abd423105c5fbd433 WatchSource:0}: Error finding container e2c6197914fd78ec6a56cc28ff232b759cd8e215d9bb998abd423105c5fbd433: Status 404 returned error can't find the container with id e2c6197914fd78ec6a56cc28ff232b759cd8e215d9bb998abd423105c5fbd433
Apr 23 16:37:13.206359 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.206324 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4983d124-dedd-4eec-8bdd-7d87844e7eaf-default-certificate\") pod \"router-default-b88c6764-pc8q7\" (UID: \"4983d124-dedd-4eec-8bdd-7d87844e7eaf\") " pod="openshift-ingress/router-default-b88c6764-pc8q7"
Apr 23 16:37:13.206444 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.206375 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vlcxf\" (UniqueName: \"kubernetes.io/projected/4983d124-dedd-4eec-8bdd-7d87844e7eaf-kube-api-access-vlcxf\") pod \"router-default-b88c6764-pc8q7\" (UID: \"4983d124-dedd-4eec-8bdd-7d87844e7eaf\") " pod="openshift-ingress/router-default-b88c6764-pc8q7"
Apr 23 16:37:13.206444 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.206419 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4983d124-dedd-4eec-8bdd-7d87844e7eaf-stats-auth\") pod \"router-default-b88c6764-pc8q7\" (UID: \"4983d124-dedd-4eec-8bdd-7d87844e7eaf\") " pod="openshift-ingress/router-default-b88c6764-pc8q7"
Apr 23 16:37:13.206524 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.206448 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4983d124-dedd-4eec-8bdd-7d87844e7eaf-metrics-certs\") pod \"router-default-b88c6764-pc8q7\" (UID: \"4983d124-dedd-4eec-8bdd-7d87844e7eaf\") " pod="openshift-ingress/router-default-b88c6764-pc8q7"
Apr 23 16:37:13.206992 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.206557 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4983d124-dedd-4eec-8bdd-7d87844e7eaf-service-ca-bundle\") pod \"router-default-b88c6764-pc8q7\" (UID: \"4983d124-dedd-4eec-8bdd-7d87844e7eaf\") " pod="openshift-ingress/router-default-b88c6764-pc8q7"
Apr 23 16:37:13.207106 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:37:13.206989 2569 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 16:37:13.207106 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:37:13.207063 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4983d124-dedd-4eec-8bdd-7d87844e7eaf-metrics-certs podName:4983d124-dedd-4eec-8bdd-7d87844e7eaf nodeName:}" failed. No retries permitted until 2026-04-23 16:37:13.707037645 +0000 UTC m=+119.044300608 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4983d124-dedd-4eec-8bdd-7d87844e7eaf-metrics-certs") pod "router-default-b88c6764-pc8q7" (UID: "4983d124-dedd-4eec-8bdd-7d87844e7eaf") : secret "router-metrics-certs-default" not found
Apr 23 16:37:13.207106 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:37:13.207086 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4983d124-dedd-4eec-8bdd-7d87844e7eaf-service-ca-bundle podName:4983d124-dedd-4eec-8bdd-7d87844e7eaf nodeName:}" failed. No retries permitted until 2026-04-23 16:37:13.70707342 +0000 UTC m=+119.044336365 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4983d124-dedd-4eec-8bdd-7d87844e7eaf-service-ca-bundle") pod "router-default-b88c6764-pc8q7" (UID: "4983d124-dedd-4eec-8bdd-7d87844e7eaf") : configmap references non-existent config key: service-ca.crt
Apr 23 16:37:13.212893 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.212869 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4983d124-dedd-4eec-8bdd-7d87844e7eaf-default-certificate\") pod \"router-default-b88c6764-pc8q7\" (UID: \"4983d124-dedd-4eec-8bdd-7d87844e7eaf\") " pod="openshift-ingress/router-default-b88c6764-pc8q7"
Apr 23 16:37:13.213043 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.212996 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4983d124-dedd-4eec-8bdd-7d87844e7eaf-stats-auth\") pod \"router-default-b88c6764-pc8q7\" (UID: \"4983d124-dedd-4eec-8bdd-7d87844e7eaf\") " pod="openshift-ingress/router-default-b88c6764-pc8q7"
Apr 23 16:37:13.218301 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.218069 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlcxf\" (UniqueName: \"kubernetes.io/projected/4983d124-dedd-4eec-8bdd-7d87844e7eaf-kube-api-access-vlcxf\") pod \"router-default-b88c6764-pc8q7\" (UID: \"4983d124-dedd-4eec-8bdd-7d87844e7eaf\") " pod="openshift-ingress/router-default-b88c6764-pc8q7"
Apr 23 16:37:13.276125 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.276087 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g5f4s"]
Apr 23 16:37:13.279145 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:37:13.279120 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ad7b86f_e66c_4d5e_9d95_ce20650adeba.slice/crio-4f2c18c624291c25a859c9e3817e9fc8b3531c398641cabfbe3113d428e2ad85 WatchSource:0}: Error finding container 4f2c18c624291c25a859c9e3817e9fc8b3531c398641cabfbe3113d428e2ad85: Status 404 returned error can't find the container with id 4f2c18c624291c25a859c9e3817e9fc8b3531c398641cabfbe3113d428e2ad85
Apr 23 16:37:13.295314 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.295290 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-gfsdv"]
Apr 23 16:37:13.298345 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:37:13.298320 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3481652d_e5fb_498e_84c3_e2c629340367.slice/crio-6aa27dc96e9fc1e265fcad48304fda10739d5d6078adc2ef1fb64798a1367abe WatchSource:0}: Error finding container 6aa27dc96e9fc1e265fcad48304fda10739d5d6078adc2ef1fb64798a1367abe: Status 404 returned error can't find the container with id 6aa27dc96e9fc1e265fcad48304fda10739d5d6078adc2ef1fb64798a1367abe
Apr 23 16:37:13.712887 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.712850 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4983d124-dedd-4eec-8bdd-7d87844e7eaf-service-ca-bundle\") pod \"router-default-b88c6764-pc8q7\" (UID: \"4983d124-dedd-4eec-8bdd-7d87844e7eaf\") " pod="openshift-ingress/router-default-b88c6764-pc8q7"
Apr 23 16:37:13.713088 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.712920 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4983d124-dedd-4eec-8bdd-7d87844e7eaf-metrics-certs\") pod \"router-default-b88c6764-pc8q7\" (UID: \"4983d124-dedd-4eec-8bdd-7d87844e7eaf\") " pod="openshift-ingress/router-default-b88c6764-pc8q7"
Apr 23 16:37:13.713088 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:37:13.713037 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4983d124-dedd-4eec-8bdd-7d87844e7eaf-service-ca-bundle podName:4983d124-dedd-4eec-8bdd-7d87844e7eaf nodeName:}" failed. No retries permitted until 2026-04-23 16:37:14.713018092 +0000 UTC m=+120.050281033 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4983d124-dedd-4eec-8bdd-7d87844e7eaf-service-ca-bundle") pod "router-default-b88c6764-pc8q7" (UID: "4983d124-dedd-4eec-8bdd-7d87844e7eaf") : configmap references non-existent config key: service-ca.crt
Apr 23 16:37:13.713088 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:37:13.713064 2569 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 16:37:13.713257 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:37:13.713124 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4983d124-dedd-4eec-8bdd-7d87844e7eaf-metrics-certs podName:4983d124-dedd-4eec-8bdd-7d87844e7eaf nodeName:}" failed. No retries permitted until 2026-04-23 16:37:14.713108677 +0000 UTC m=+120.050371622 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4983d124-dedd-4eec-8bdd-7d87844e7eaf-metrics-certs") pod "router-default-b88c6764-pc8q7" (UID: "4983d124-dedd-4eec-8bdd-7d87844e7eaf") : secret "router-metrics-certs-default" not found
Apr 23 16:37:13.772150 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.772108 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g5f4s" event={"ID":"7ad7b86f-e66c-4d5e-9d95-ce20650adeba","Type":"ContainerStarted","Data":"4f2c18c624291c25a859c9e3817e9fc8b3531c398641cabfbe3113d428e2ad85"}
Apr 23 16:37:13.773171 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.773142 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-hbwp6" event={"ID":"5e7b64cf-d8ab-48a3-86f5-9ea5db912782","Type":"ContainerStarted","Data":"e2c6197914fd78ec6a56cc28ff232b759cd8e215d9bb998abd423105c5fbd433"}
Apr 23 16:37:13.774213 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:13.774185 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-gfsdv" event={"ID":"3481652d-e5fb-498e-84c3-e2c629340367","Type":"ContainerStarted","Data":"6aa27dc96e9fc1e265fcad48304fda10739d5d6078adc2ef1fb64798a1367abe"}
Apr 23 16:37:14.721407 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:14.721373 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4983d124-dedd-4eec-8bdd-7d87844e7eaf-service-ca-bundle\") pod \"router-default-b88c6764-pc8q7\" (UID: \"4983d124-dedd-4eec-8bdd-7d87844e7eaf\") " pod="openshift-ingress/router-default-b88c6764-pc8q7"
Apr 23 16:37:14.721564 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:14.721436 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4983d124-dedd-4eec-8bdd-7d87844e7eaf-metrics-certs\") pod \"router-default-b88c6764-pc8q7\" (UID: \"4983d124-dedd-4eec-8bdd-7d87844e7eaf\") " pod="openshift-ingress/router-default-b88c6764-pc8q7"
Apr 23 16:37:14.721564 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:37:14.721533 2569 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 16:37:14.721678 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:37:14.721568 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4983d124-dedd-4eec-8bdd-7d87844e7eaf-service-ca-bundle podName:4983d124-dedd-4eec-8bdd-7d87844e7eaf nodeName:}" failed. No retries permitted until 2026-04-23 16:37:16.72154331 +0000 UTC m=+122.058806253 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4983d124-dedd-4eec-8bdd-7d87844e7eaf-service-ca-bundle") pod "router-default-b88c6764-pc8q7" (UID: "4983d124-dedd-4eec-8bdd-7d87844e7eaf") : configmap references non-existent config key: service-ca.crt
Apr 23 16:37:14.721678 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:37:14.721600 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4983d124-dedd-4eec-8bdd-7d87844e7eaf-metrics-certs podName:4983d124-dedd-4eec-8bdd-7d87844e7eaf nodeName:}" failed. No retries permitted until 2026-04-23 16:37:16.721588442 +0000 UTC m=+122.058851392 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4983d124-dedd-4eec-8bdd-7d87844e7eaf-metrics-certs") pod "router-default-b88c6764-pc8q7" (UID: "4983d124-dedd-4eec-8bdd-7d87844e7eaf") : secret "router-metrics-certs-default" not found
Apr 23 16:37:14.777700 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:14.777647 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g5f4s" event={"ID":"7ad7b86f-e66c-4d5e-9d95-ce20650adeba","Type":"ContainerStarted","Data":"cb235af3c1c7d6277896ace648208bd0f6d13ed47e5c78a46f796743fa2b185c"}
Apr 23 16:37:14.796691 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:14.796631 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g5f4s" podStartSLOduration=1.485237944 podStartE2EDuration="2.79661651s" podCreationTimestamp="2026-04-23 16:37:12 +0000 UTC" firstStartedPulling="2026-04-23 16:37:13.280942653 +0000 UTC m=+118.618205595" lastFinishedPulling="2026-04-23 16:37:14.592321215 +0000 UTC m=+119.929584161" observedRunningTime="2026-04-23 16:37:14.796185231 +0000 UTC m=+120.133448196" watchObservedRunningTime="2026-04-23 16:37:14.79661651 +0000 UTC m=+120.133879452"
Apr 23 16:37:15.781037 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:15.780989 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-hbwp6" event={"ID":"5e7b64cf-d8ab-48a3-86f5-9ea5db912782","Type":"ContainerStarted","Data":"1decf4b96c31c1bbe9fa25e2b9762d81352d673cb1fd5d5f32e30c43b835856b"}
Apr 23 16:37:15.782884 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:15.782853 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-gfsdv" event={"ID":"3481652d-e5fb-498e-84c3-e2c629340367","Type":"ContainerStarted","Data":"1f64c7aaa02da2b6723907951362df53956d3a1b99aeaf755f5bc6cc2a029f6c"}
Apr 23 16:37:15.783082 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:15.783063 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-gfsdv"
Apr 23 16:37:15.784769 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:15.784466 2569 patch_prober.go:28] interesting pod/console-operator-9d4b6777b-gfsdv container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.134.0.9:8443/readyz\": dial tcp 10.134.0.9:8443: connect: connection refused" start-of-body=
Apr 23 16:37:15.784769 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:15.784517 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-9d4b6777b-gfsdv" podUID="3481652d-e5fb-498e-84c3-e2c629340367" containerName="console-operator" probeResult="failure" output="Get \"https://10.134.0.9:8443/readyz\": dial tcp 10.134.0.9:8443: connect: connection refused"
Apr 23 16:37:15.798187 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:15.798138 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-hbwp6" podStartSLOduration=1.301850439 podStartE2EDuration="3.798125981s" podCreationTimestamp="2026-04-23 16:37:12 +0000 UTC" firstStartedPulling="2026-04-23 16:37:13.194214763 +0000 UTC m=+118.531477706" lastFinishedPulling="2026-04-23 16:37:15.690490303 +0000 UTC m=+121.027753248" observedRunningTime="2026-04-23 16:37:15.796899059 +0000 UTC m=+121.134162025" watchObservedRunningTime="2026-04-23 16:37:15.798125981 +0000 UTC m=+121.135388945"
Apr 23 16:37:15.819978 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:15.819843 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-gfsdv" podStartSLOduration=1.426983667 podStartE2EDuration="3.819826737s" podCreationTimestamp="2026-04-23 16:37:12 +0000 UTC" firstStartedPulling="2026-04-23 16:37:13.300124437 +0000 UTC m=+118.637387382" lastFinishedPulling="2026-04-23 16:37:15.692967506 +0000 UTC m=+121.030230452" observedRunningTime="2026-04-23 16:37:15.819818112 +0000 UTC m=+121.157081097" watchObservedRunningTime="2026-04-23 16:37:15.819826737 +0000 UTC m=+121.157089703"
Apr 23 16:37:16.739591 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:37:16.739571 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4983d124-dedd-4eec-8bdd-7d87844e7eaf-service-ca-bundle podName:4983d124-dedd-4eec-8bdd-7d87844e7eaf nodeName:}" failed. No retries permitted until 2026-04-23 16:37:20.739555234 +0000 UTC m=+126.076818183 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4983d124-dedd-4eec-8bdd-7d87844e7eaf-service-ca-bundle") pod "router-default-b88c6764-pc8q7" (UID: "4983d124-dedd-4eec-8bdd-7d87844e7eaf") : configmap references non-existent config key: service-ca.crt
Apr 23 16:37:16.785934 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:16.785914 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gfsdv_3481652d-e5fb-498e-84c3-e2c629340367/console-operator/0.log"
Apr 23 16:37:16.786264 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:16.785953 2569 generic.go:358] "Generic (PLEG): container finished" podID="3481652d-e5fb-498e-84c3-e2c629340367" containerID="1f64c7aaa02da2b6723907951362df53956d3a1b99aeaf755f5bc6cc2a029f6c" exitCode=255
Apr 23 16:37:16.786264 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:16.786047 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-gfsdv" event={"ID":"3481652d-e5fb-498e-84c3-e2c629340367","Type":"ContainerDied","Data":"1f64c7aaa02da2b6723907951362df53956d3a1b99aeaf755f5bc6cc2a029f6c"}
Apr 23 16:37:16.786335 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:16.786288 2569 scope.go:117] "RemoveContainer" containerID="1f64c7aaa02da2b6723907951362df53956d3a1b99aeaf755f5bc6cc2a029f6c"
Apr 23 16:37:17.302867 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:17.302838 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-rwgzf"]
Apr 23 16:37:17.305554 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:17.305537 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-rwgzf"
Apr 23 16:37:17.307554 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:17.307533 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-2rkr2\""
Apr 23 16:37:17.307902 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:17.307885 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 23 16:37:17.308237 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:17.308217 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 23 16:37:17.324866 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:17.324836 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-rwgzf"]
Apr 23 16:37:17.444969 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:17.444930 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpcl9\" (UniqueName: \"kubernetes.io/projected/e972d7fb-a515-4ee1-9487-5af95477b522-kube-api-access-gpcl9\") pod \"migrator-74bb7799d9-rwgzf\" (UID: \"e972d7fb-a515-4ee1-9487-5af95477b522\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-rwgzf"
Apr 23 16:37:17.546143 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:17.546110 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gpcl9\" (UniqueName: \"kubernetes.io/projected/e972d7fb-a515-4ee1-9487-5af95477b522-kube-api-access-gpcl9\") pod \"migrator-74bb7799d9-rwgzf\" (UID: \"e972d7fb-a515-4ee1-9487-5af95477b522\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-rwgzf"
Apr 23 16:37:17.554205 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:17.554136 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpcl9\" (UniqueName: \"kubernetes.io/projected/e972d7fb-a515-4ee1-9487-5af95477b522-kube-api-access-gpcl9\") pod \"migrator-74bb7799d9-rwgzf\" (UID: \"e972d7fb-a515-4ee1-9487-5af95477b522\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-rwgzf"
Apr 23 16:37:17.614352 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:17.614320 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-rwgzf"
Apr 23 16:37:17.723254 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:17.723225 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-rwgzf"]
Apr 23 16:37:17.726017 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:37:17.725985 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode972d7fb_a515_4ee1_9487_5af95477b522.slice/crio-22284cf29faf7810518e719814b9dd38c37e9665eccc5ad8c8280c6d71bf703f WatchSource:0}: Error finding container 22284cf29faf7810518e719814b9dd38c37e9665eccc5ad8c8280c6d71bf703f: Status 404 returned error can't find the container with id 22284cf29faf7810518e719814b9dd38c37e9665eccc5ad8c8280c6d71bf703f
Apr 23 16:37:17.789515 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:17.789489 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gfsdv_3481652d-e5fb-498e-84c3-e2c629340367/console-operator/1.log"
Apr 23 16:37:17.789940 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:17.789881 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gfsdv_3481652d-e5fb-498e-84c3-e2c629340367/console-operator/0.log"
Apr 23 16:37:17.789940 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:17.789921 2569 generic.go:358] "Generic (PLEG): container finished" podID="3481652d-e5fb-498e-84c3-e2c629340367" containerID="ed2b9907aaf13862711e17012349747aec4bce0922f3bf18794fa7846413c652" exitCode=255
Apr 23 16:37:17.790042 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:17.790009 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-gfsdv" event={"ID":"3481652d-e5fb-498e-84c3-e2c629340367","Type":"ContainerDied","Data":"ed2b9907aaf13862711e17012349747aec4bce0922f3bf18794fa7846413c652"}
Apr 23 16:37:17.790090 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:17.790057 2569 scope.go:117] "RemoveContainer" containerID="1f64c7aaa02da2b6723907951362df53956d3a1b99aeaf755f5bc6cc2a029f6c"
Apr 23 16:37:17.790257 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:17.790238 2569 scope.go:117] "RemoveContainer" containerID="ed2b9907aaf13862711e17012349747aec4bce0922f3bf18794fa7846413c652"
Apr 23 16:37:17.790464 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:37:17.790441 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-gfsdv_openshift-console-operator(3481652d-e5fb-498e-84c3-e2c629340367)\"" pod="openshift-console-operator/console-operator-9d4b6777b-gfsdv" podUID="3481652d-e5fb-498e-84c3-e2c629340367"
Apr 23 16:37:17.791306 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:17.791015 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-rwgzf" event={"ID":"e972d7fb-a515-4ee1-9487-5af95477b522","Type":"ContainerStarted","Data":"22284cf29faf7810518e719814b9dd38c37e9665eccc5ad8c8280c6d71bf703f"}
Apr 23 16:37:18.795062 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:18.795030 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gfsdv_3481652d-e5fb-498e-84c3-e2c629340367/console-operator/1.log"
Apr 23 16:37:18.795474 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:18.795370 2569 scope.go:117] "RemoveContainer" containerID="ed2b9907aaf13862711e17012349747aec4bce0922f3bf18794fa7846413c652"
Apr 23 16:37:18.795545 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:37:18.795528 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-gfsdv_openshift-console-operator(3481652d-e5fb-498e-84c3-e2c629340367)\"" pod="openshift-console-operator/console-operator-9d4b6777b-gfsdv" podUID="3481652d-e5fb-498e-84c3-e2c629340367"
Apr 23 16:37:19.798803 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:19.798768 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-rwgzf" event={"ID":"e972d7fb-a515-4ee1-9487-5af95477b522","Type":"ContainerStarted","Data":"e1e2a6d3bb8df3cc3cc4a8a41093ee907678836340c659be690b9afdb96828b1"}
Apr 23 16:37:19.798803 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:19.798804 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-rwgzf" event={"ID":"e972d7fb-a515-4ee1-9487-5af95477b522","Type":"ContainerStarted","Data":"4a6aa76d1b9bcfb7bf309243673238637ff0a8396b203c732d4e40897d268552"}
Apr 23 16:37:19.819787 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:19.819748 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-rwgzf" podStartSLOduration=1.509595056 podStartE2EDuration="2.819736545s" podCreationTimestamp="2026-04-23 16:37:17 +0000 UTC" firstStartedPulling="2026-04-23 16:37:17.727799604 +0000 UTC m=+123.065062547" lastFinishedPulling="2026-04-23 16:37:19.03794109 +0000 UTC m=+124.375204036" observedRunningTime="2026-04-23 16:37:19.816765489 +0000 UTC m=+125.154028466" watchObservedRunningTime="2026-04-23 16:37:19.819736545 +0000 UTC m=+125.156999509"
Apr 23 16:37:20.193077 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:20.193053 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-fgwnc_486e65b1-cb27-4533-8ab9-9a91c79c58b1/dns-node-resolver/0.log"
Apr 23 16:37:20.507840 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:20.507752 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-txswz"]
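Both ContainerDied events above carry exitCode=255, and the kubelet immediately moves console-operator into CrashLoopBackOff with "back-off 10s". The usual upstream kubelet behavior (an assumption here, not something these log lines state) is a per-container restart delay that starts at 10s, doubles on each crash, is capped at 5m, and resets after the container runs cleanly for a while. A sketch of that schedule:

package main

import (
	"fmt"
	"time"
)

// Prints the CrashLoopBackOff delay schedule implied by the
// "back-off 10s" message above, assuming the usual kubelet defaults
// (initial 10s, doubling per restart, capped at 5m). Illustrative only.
func main() {
	delay, maxDelay := 10*time.Second, 5*time.Minute
	for restart := 1; restart <= 7; restart++ {
		fmt.Printf("restart %d: back-off %v\n", restart, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}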
Apr 23 16:37:20.510755 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:20.510733 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-txswz"
Apr 23 16:37:20.512905 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:20.512886 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 23 16:37:20.513294 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:20.513279 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 23 16:37:20.513379 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:20.513344 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 23 16:37:20.513379 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:20.513361 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 23 16:37:20.513485 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:20.513385 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-bd76b\""
Apr 23 16:37:20.520981 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:20.520957 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-txswz"]
Apr 23 16:37:20.670346 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:20.670313 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7af67d0f-8d44-419f-b4a9-d9ac7d69ed85-signing-key\") pod \"service-ca-865cb79987-txswz\" (UID: \"7af67d0f-8d44-419f-b4a9-d9ac7d69ed85\") " pod="openshift-service-ca/service-ca-865cb79987-txswz"
Apr 23 16:37:20.670523 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:20.670374 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gffz\" (UniqueName: \"kubernetes.io/projected/7af67d0f-8d44-419f-b4a9-d9ac7d69ed85-kube-api-access-5gffz\") pod \"service-ca-865cb79987-txswz\" (UID: \"7af67d0f-8d44-419f-b4a9-d9ac7d69ed85\") " pod="openshift-service-ca/service-ca-865cb79987-txswz"
Apr 23 16:37:20.670523 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:20.670446 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7af67d0f-8d44-419f-b4a9-d9ac7d69ed85-signing-cabundle\") pod \"service-ca-865cb79987-txswz\" (UID: \"7af67d0f-8d44-419f-b4a9-d9ac7d69ed85\") " pod="openshift-service-ca/service-ca-865cb79987-txswz"
Apr 23 16:37:20.771054 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:20.770964 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7af67d0f-8d44-419f-b4a9-d9ac7d69ed85-signing-cabundle\") pod \"service-ca-865cb79987-txswz\" (UID: \"7af67d0f-8d44-419f-b4a9-d9ac7d69ed85\") " pod="openshift-service-ca/service-ca-865cb79987-txswz"
Apr 23 16:37:20.771054 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:20.771010 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4983d124-dedd-4eec-8bdd-7d87844e7eaf-metrics-certs\") pod \"router-default-b88c6764-pc8q7\" (UID: \"4983d124-dedd-4eec-8bdd-7d87844e7eaf\") " pod="openshift-ingress/router-default-b88c6764-pc8q7"
Apr 23 16:37:20.771279 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:20.771114 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7af67d0f-8d44-419f-b4a9-d9ac7d69ed85-signing-key\") pod \"service-ca-865cb79987-txswz\" (UID: \"7af67d0f-8d44-419f-b4a9-d9ac7d69ed85\") " pod="openshift-service-ca/service-ca-865cb79987-txswz"
Apr 23 16:37:20.771279 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:20.771192 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4983d124-dedd-4eec-8bdd-7d87844e7eaf-service-ca-bundle\") pod \"router-default-b88c6764-pc8q7\" (UID: \"4983d124-dedd-4eec-8bdd-7d87844e7eaf\") " pod="openshift-ingress/router-default-b88c6764-pc8q7"
Apr 23 16:37:20.771279 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:20.771229 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5gffz\" (UniqueName: \"kubernetes.io/projected/7af67d0f-8d44-419f-b4a9-d9ac7d69ed85-kube-api-access-5gffz\") pod \"service-ca-865cb79987-txswz\" (UID: \"7af67d0f-8d44-419f-b4a9-d9ac7d69ed85\") " pod="openshift-service-ca/service-ca-865cb79987-txswz"
Apr 23 16:37:20.771424 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:37:20.771332 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4983d124-dedd-4eec-8bdd-7d87844e7eaf-service-ca-bundle podName:4983d124-dedd-4eec-8bdd-7d87844e7eaf nodeName:}" failed. No retries permitted until 2026-04-23 16:37:28.771316013 +0000 UTC m=+134.108578956 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4983d124-dedd-4eec-8bdd-7d87844e7eaf-service-ca-bundle") pod "router-default-b88c6764-pc8q7" (UID: "4983d124-dedd-4eec-8bdd-7d87844e7eaf") : configmap references non-existent config key: service-ca.crt
Apr 23 16:37:20.771505 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:37:20.771481 2569 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 16:37:20.771568 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:37:20.771556 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4983d124-dedd-4eec-8bdd-7d87844e7eaf-metrics-certs podName:4983d124-dedd-4eec-8bdd-7d87844e7eaf nodeName:}" failed. No retries permitted until 2026-04-23 16:37:28.771536841 +0000 UTC m=+134.108799794 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4983d124-dedd-4eec-8bdd-7d87844e7eaf-metrics-certs") pod "router-default-b88c6764-pc8q7" (UID: "4983d124-dedd-4eec-8bdd-7d87844e7eaf") : secret "router-metrics-certs-default" not found
Apr 23 16:37:20.771779 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:20.771763 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7af67d0f-8d44-419f-b4a9-d9ac7d69ed85-signing-cabundle\") pod \"service-ca-865cb79987-txswz\" (UID: \"7af67d0f-8d44-419f-b4a9-d9ac7d69ed85\") " pod="openshift-service-ca/service-ca-865cb79987-txswz"
Apr 23 16:37:20.773694 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:20.773652 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7af67d0f-8d44-419f-b4a9-d9ac7d69ed85-signing-key\") pod \"service-ca-865cb79987-txswz\" (UID: \"7af67d0f-8d44-419f-b4a9-d9ac7d69ed85\") " pod="openshift-service-ca/service-ca-865cb79987-txswz"
Apr 23 16:37:20.778969 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:20.778949 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gffz\" (UniqueName: \"kubernetes.io/projected/7af67d0f-8d44-419f-b4a9-d9ac7d69ed85-kube-api-access-5gffz\") pod \"service-ca-865cb79987-txswz\" (UID: \"7af67d0f-8d44-419f-b4a9-d9ac7d69ed85\") " pod="openshift-service-ca/service-ca-865cb79987-txswz"
Apr 23 16:37:20.819901 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:20.819875 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-txswz"
Apr 23 16:37:20.929319 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:20.929288 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-txswz"]
Apr 23 16:37:20.932325 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:37:20.932302 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7af67d0f_8d44_419f_b4a9_d9ac7d69ed85.slice/crio-245c991208555ca75ce5645e9f23d9daf943e1c50b522180c0588cc06e2a6852 WatchSource:0}: Error finding container 245c991208555ca75ce5645e9f23d9daf943e1c50b522180c0588cc06e2a6852: Status 404 returned error can't find the container with id 245c991208555ca75ce5645e9f23d9daf943e1c50b522180c0588cc06e2a6852
Apr 23 16:37:21.193720 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:21.193698 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-pmktz_919d79c7-8b2d-41ad-b0ba-bf48e8815841/node-ca/0.log"
Apr 23 16:37:21.804864 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:21.804830 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-txswz" event={"ID":"7af67d0f-8d44-419f-b4a9-d9ac7d69ed85","Type":"ContainerStarted","Data":"245c991208555ca75ce5645e9f23d9daf943e1c50b522180c0588cc06e2a6852"}
Apr 23 16:37:22.393470 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:22.393441 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-rwgzf_e972d7fb-a515-4ee1-9487-5af95477b522/migrator/0.log"
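Every one of the router mount retries above fails for the same reason: the secret router-metrics-certs-default does not exist yet in openshift-ingress, and the service-ca-bundle configmap is still missing its service-ca.crt key. The service-ca pod has only just started at this point, and the successful mounts at 16:37:44 below are consistent with it publishing those objects in the meantime. A minimal client-go sketch for watching whether the secret the kubelet is waiting on has appeared; the kubeconfig path is a placeholder, everything else is copied from the log.

package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

// Checks whether openshift-ingress/router-metrics-certs-default exists yet,
// i.e. whether the kubelet's secret.go:189 error above should still fire.
// Assumes a reachable kubeconfig at the placeholder path.
func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig")
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	s, err := cs.CoreV1().Secrets("openshift-ingress").Get(
		context.TODO(), "router-metrics-certs-default", metav1.GetOptions{})
	if err != nil {
		fmt.Println("still missing:", err) // the state the kubelet keeps retrying against
		return
	}
	fmt.Println("present, data keys:", len(s.Data))
}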
Apr 23 16:37:22.594609 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:22.594575 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-rwgzf_e972d7fb-a515-4ee1-9487-5af95477b522/graceful-termination/0.log"
Apr 23 16:37:23.161379 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:23.161337 2569 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-gfsdv"
Apr 23 16:37:23.161756 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:23.161743 2569 scope.go:117] "RemoveContainer" containerID="ed2b9907aaf13862711e17012349747aec4bce0922f3bf18794fa7846413c652"
Apr 23 16:37:23.161920 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:37:23.161903 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-gfsdv_openshift-console-operator(3481652d-e5fb-498e-84c3-e2c629340367)\"" pod="openshift-console-operator/console-operator-9d4b6777b-gfsdv" podUID="3481652d-e5fb-498e-84c3-e2c629340367"
Apr 23 16:37:23.812957 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:23.812915 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-txswz" event={"ID":"7af67d0f-8d44-419f-b4a9-d9ac7d69ed85","Type":"ContainerStarted","Data":"d89e1e3765728cdcae2d3a659152bcc2cf74fd39c2fa62765661cd99d11d819c"}
Apr 23 16:37:23.998997 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:23.998952 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1eabe990-f610-4e94-8a89-7cff1c9a6a23-metrics-certs\") pod \"network-metrics-daemon-k7n97\" (UID: \"1eabe990-f610-4e94-8a89-7cff1c9a6a23\") " pod="openshift-multus/network-metrics-daemon-k7n97"
Apr 23 16:37:23.999205 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:37:23.999124 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 23 16:37:23.999266 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:37:23.999209 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1eabe990-f610-4e94-8a89-7cff1c9a6a23-metrics-certs podName:1eabe990-f610-4e94-8a89-7cff1c9a6a23 nodeName:}" failed. No retries permitted until 2026-04-23 16:39:25.999185563 +0000 UTC m=+251.336448509 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1eabe990-f610-4e94-8a89-7cff1c9a6a23-metrics-certs") pod "network-metrics-daemon-k7n97" (UID: "1eabe990-f610-4e94-8a89-7cff1c9a6a23") : secret "metrics-daemon-secret" not found
Apr 23 16:37:25.783780 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:25.783731 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-gfsdv"
Apr 23 16:37:25.784237 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:25.784136 2569 scope.go:117] "RemoveContainer" containerID="ed2b9907aaf13862711e17012349747aec4bce0922f3bf18794fa7846413c652"
Apr 23 16:37:25.784309 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:37:25.784292 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-gfsdv_openshift-console-operator(3481652d-e5fb-498e-84c3-e2c629340367)\"" pod="openshift-console-operator/console-operator-9d4b6777b-gfsdv" podUID="3481652d-e5fb-498e-84c3-e2c629340367"
Apr 23 16:37:28.840807 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:28.840769 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4983d124-dedd-4eec-8bdd-7d87844e7eaf-service-ca-bundle\") pod \"router-default-b88c6764-pc8q7\" (UID: \"4983d124-dedd-4eec-8bdd-7d87844e7eaf\") " pod="openshift-ingress/router-default-b88c6764-pc8q7"
Apr 23 16:37:28.841160 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:28.840838 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4983d124-dedd-4eec-8bdd-7d87844e7eaf-metrics-certs\") pod \"router-default-b88c6764-pc8q7\" (UID: \"4983d124-dedd-4eec-8bdd-7d87844e7eaf\") " pod="openshift-ingress/router-default-b88c6764-pc8q7"
Apr 23 16:37:28.841160 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:37:28.840940 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4983d124-dedd-4eec-8bdd-7d87844e7eaf-service-ca-bundle podName:4983d124-dedd-4eec-8bdd-7d87844e7eaf nodeName:}" failed. No retries permitted until 2026-04-23 16:37:44.840917089 +0000 UTC m=+150.178180047 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4983d124-dedd-4eec-8bdd-7d87844e7eaf-service-ca-bundle") pod "router-default-b88c6764-pc8q7" (UID: "4983d124-dedd-4eec-8bdd-7d87844e7eaf") : configmap references non-existent config key: service-ca.crt
Apr 23 16:37:28.841160 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:37:28.840975 2569 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 16:37:28.841160 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:37:28.841023 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4983d124-dedd-4eec-8bdd-7d87844e7eaf-metrics-certs podName:4983d124-dedd-4eec-8bdd-7d87844e7eaf nodeName:}" failed. No retries permitted until 2026-04-23 16:37:44.841008803 +0000 UTC m=+150.178271760 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4983d124-dedd-4eec-8bdd-7d87844e7eaf-metrics-certs") pod "router-default-b88c6764-pc8q7" (UID: "4983d124-dedd-4eec-8bdd-7d87844e7eaf") : secret "router-metrics-certs-default" not found
Apr 23 16:37:39.300157 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:39.300122 2569 scope.go:117] "RemoveContainer" containerID="ed2b9907aaf13862711e17012349747aec4bce0922f3bf18794fa7846413c652"
Apr 23 16:37:39.860278 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:39.860249 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gfsdv_3481652d-e5fb-498e-84c3-e2c629340367/console-operator/1.log"
Apr 23 16:37:39.860460 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:39.860319 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-gfsdv" event={"ID":"3481652d-e5fb-498e-84c3-e2c629340367","Type":"ContainerStarted","Data":"924dd981db89334accaa18d0dfe15f2120eff90b8d1b1e07c4fd781e2cce3606"}
Apr 23 16:37:39.860629 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:39.860609 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-gfsdv"
Apr 23 16:37:39.878514 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:39.878459 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-txswz" podStartSLOduration=17.874208412 podStartE2EDuration="19.878444402s" podCreationTimestamp="2026-04-23 16:37:20 +0000 UTC" firstStartedPulling="2026-04-23 16:37:20.934038162 +0000 UTC m=+126.271301104" lastFinishedPulling="2026-04-23 16:37:22.938274152 +0000 UTC m=+128.275537094" observedRunningTime="2026-04-23 16:37:23.841181329 +0000 UTC m=+129.178444318" watchObservedRunningTime="2026-04-23 16:37:39.878444402 +0000 UTC m=+145.215707366"
Apr 23 16:37:40.448619 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:40.448588 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-gfsdv"
Apr 23 16:37:41.099138 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:41.099108 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-b5vnc"]
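Taken together, the durationBeforeRetry values for the router volumes trace a doubling series: 4s at 16:37:16, 8s at 16:37:20, 16s at 16:37:28; and the network-metrics-daemon entry above shows where the series tops out, at 2m2s. That is consistent with an exponential backoff with factor 2 and a 2m2s ceiling. The sketch below reproduces the observed values; the 500ms seed is an assumption chosen so the series lands on them, not something the log states.

package main

import (
	"fmt"
	"time"
)

// Generates the retry schedule implied by the durationBeforeRetry values
// above (4s, 8s, 16s, ... 2m2s): doubling backoff with a 2m2s ceiling.
func main() {
	delay := 500 * time.Millisecond // assumed seed; log only shows 4s onward
	maxDelay := 2*time.Minute + 2*time.Second
	for attempt := 1; attempt <= 10; attempt++ {
		fmt.Printf("attempt %d: retry in %v\n", attempt, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}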
Apr 23 16:37:41.102154 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:41.102138 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-b5vnc"
Apr 23 16:37:41.129463 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:41.129436 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/88e1f618-5f3b-4306-a6ad-52dec47aee87-data-volume\") pod \"insights-runtime-extractor-b5vnc\" (UID: \"88e1f618-5f3b-4306-a6ad-52dec47aee87\") " pod="openshift-insights/insights-runtime-extractor-b5vnc"
Apr 23 16:37:41.129560 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:41.129478 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/88e1f618-5f3b-4306-a6ad-52dec47aee87-crio-socket\") pod \"insights-runtime-extractor-b5vnc\" (UID: \"88e1f618-5f3b-4306-a6ad-52dec47aee87\") " pod="openshift-insights/insights-runtime-extractor-b5vnc"
Apr 23 16:37:41.129560 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:41.129516 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/88e1f618-5f3b-4306-a6ad-52dec47aee87-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-b5vnc\" (UID: \"88e1f618-5f3b-4306-a6ad-52dec47aee87\") " pod="openshift-insights/insights-runtime-extractor-b5vnc"
Apr 23 16:37:41.129560 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:41.129542 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/88e1f618-5f3b-4306-a6ad-52dec47aee87-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-b5vnc\" (UID: \"88e1f618-5f3b-4306-a6ad-52dec47aee87\") " pod="openshift-insights/insights-runtime-extractor-b5vnc"
Apr 23 16:37:41.129560 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:41.129559 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqp9p\" (UniqueName: \"kubernetes.io/projected/88e1f618-5f3b-4306-a6ad-52dec47aee87-kube-api-access-mqp9p\") pod \"insights-runtime-extractor-b5vnc\" (UID: \"88e1f618-5f3b-4306-a6ad-52dec47aee87\") " pod="openshift-insights/insights-runtime-extractor-b5vnc"
Apr 23 16:37:41.151298 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:41.151271 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 23 16:37:41.151410 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:41.151271 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-nnwmz\""
Apr 23 16:37:41.151410 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:41.151316 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 23 16:37:41.189355 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:41.189331 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-b5vnc"]
Apr 23 16:37:41.229936 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:41.229909 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/88e1f618-5f3b-4306-a6ad-52dec47aee87-data-volume\") pod \"insights-runtime-extractor-b5vnc\" (UID: \"88e1f618-5f3b-4306-a6ad-52dec47aee87\") " pod="openshift-insights/insights-runtime-extractor-b5vnc"
Apr 23 16:37:41.230062 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:41.229949 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/88e1f618-5f3b-4306-a6ad-52dec47aee87-crio-socket\") pod \"insights-runtime-extractor-b5vnc\" (UID: \"88e1f618-5f3b-4306-a6ad-52dec47aee87\") " pod="openshift-insights/insights-runtime-extractor-b5vnc"
Apr 23 16:37:41.230062 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:41.229987 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/88e1f618-5f3b-4306-a6ad-52dec47aee87-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-b5vnc\" (UID: \"88e1f618-5f3b-4306-a6ad-52dec47aee87\") " pod="openshift-insights/insights-runtime-extractor-b5vnc"
Apr 23 16:37:41.230062 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:41.230025 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/88e1f618-5f3b-4306-a6ad-52dec47aee87-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-b5vnc\" (UID: \"88e1f618-5f3b-4306-a6ad-52dec47aee87\") " pod="openshift-insights/insights-runtime-extractor-b5vnc"
Apr 23 16:37:41.230062 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:41.230053 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mqp9p\" (UniqueName: \"kubernetes.io/projected/88e1f618-5f3b-4306-a6ad-52dec47aee87-kube-api-access-mqp9p\") pod \"insights-runtime-extractor-b5vnc\" (UID: \"88e1f618-5f3b-4306-a6ad-52dec47aee87\") " pod="openshift-insights/insights-runtime-extractor-b5vnc"
Apr 23 16:37:41.230226 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:41.230088 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/88e1f618-5f3b-4306-a6ad-52dec47aee87-crio-socket\") pod \"insights-runtime-extractor-b5vnc\" (UID: \"88e1f618-5f3b-4306-a6ad-52dec47aee87\") " pod="openshift-insights/insights-runtime-extractor-b5vnc"
Apr 23 16:37:41.230345 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:41.230270 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/88e1f618-5f3b-4306-a6ad-52dec47aee87-data-volume\") pod \"insights-runtime-extractor-b5vnc\" (UID: \"88e1f618-5f3b-4306-a6ad-52dec47aee87\") " pod="openshift-insights/insights-runtime-extractor-b5vnc"
Apr 23 16:37:41.230599 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:41.230582 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/88e1f618-5f3b-4306-a6ad-52dec47aee87-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-b5vnc\" (UID: \"88e1f618-5f3b-4306-a6ad-52dec47aee87\") " pod="openshift-insights/insights-runtime-extractor-b5vnc"
Apr 23 16:37:41.232243 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:41.232224 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/88e1f618-5f3b-4306-a6ad-52dec47aee87-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-b5vnc\" (UID: \"88e1f618-5f3b-4306-a6ad-52dec47aee87\") " pod="openshift-insights/insights-runtime-extractor-b5vnc"
Apr 23 16:37:41.306412 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:41.306378 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqp9p\" (UniqueName: \"kubernetes.io/projected/88e1f618-5f3b-4306-a6ad-52dec47aee87-kube-api-access-mqp9p\") pod \"insights-runtime-extractor-b5vnc\" (UID: \"88e1f618-5f3b-4306-a6ad-52dec47aee87\") " pod="openshift-insights/insights-runtime-extractor-b5vnc"
Apr 23 16:37:41.410914 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:41.410830 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-b5vnc"
Apr 23 16:37:41.544381 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:41.544210 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-b5vnc"]
Apr 23 16:37:41.546891 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:37:41.546859 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88e1f618_5f3b_4306_a6ad_52dec47aee87.slice/crio-aa092b6dcde9360df7ab000075acdea203c748d36b4e0acb0b8b46645a826106 WatchSource:0}: Error finding container aa092b6dcde9360df7ab000075acdea203c748d36b4e0acb0b8b46645a826106: Status 404 returned error can't find the container with id aa092b6dcde9360df7ab000075acdea203c748d36b4e0acb0b8b46645a826106
Apr 23 16:37:41.866108 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:41.866072 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-b5vnc" event={"ID":"88e1f618-5f3b-4306-a6ad-52dec47aee87","Type":"ContainerStarted","Data":"d4fb0dc951949a24c3184dc8ddcb419e2e92db1543a8406a278363d0bedf7aeb"}
Apr 23 16:37:41.866285 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:41.866120 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-b5vnc" event={"ID":"88e1f618-5f3b-4306-a6ad-52dec47aee87","Type":"ContainerStarted","Data":"aa092b6dcde9360df7ab000075acdea203c748d36b4e0acb0b8b46645a826106"}
Apr 23 16:37:42.870533 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:42.870498 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-b5vnc" event={"ID":"88e1f618-5f3b-4306-a6ad-52dec47aee87","Type":"ContainerStarted","Data":"15c304a59c7af6f3807fdf45680e1fa616113ab02e3464d237c9285d68f295de"}
Apr 23 16:37:43.874898 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:43.874865 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-b5vnc" event={"ID":"88e1f618-5f3b-4306-a6ad-52dec47aee87","Type":"ContainerStarted","Data":"712f43837431c29601cf2edd8fbdd8b77391075f83db5e5105c0e50d0db352f1"}
Apr 23 16:37:43.899772 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:43.897456 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-b5vnc" podStartSLOduration=0.80580851 podStartE2EDuration="2.89743782s" podCreationTimestamp="2026-04-23 16:37:41 +0000 UTC" firstStartedPulling="2026-04-23 16:37:41.605011476 +0000 UTC m=+146.942274418" lastFinishedPulling="2026-04-23 16:37:43.696640783 +0000 UTC m=+149.033903728" observedRunningTime="2026-04-23 16:37:43.895233351 +0000 UTC m=+149.232496314" watchObservedRunningTime="2026-04-23 16:37:43.89743782 +0000 UTC m=+149.234700784"
Apr 23 16:37:44.857501 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:44.857467 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4983d124-dedd-4eec-8bdd-7d87844e7eaf-service-ca-bundle\") pod \"router-default-b88c6764-pc8q7\" (UID: \"4983d124-dedd-4eec-8bdd-7d87844e7eaf\") " pod="openshift-ingress/router-default-b88c6764-pc8q7"
Apr 23 16:37:44.857690 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:44.857511 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4983d124-dedd-4eec-8bdd-7d87844e7eaf-metrics-certs\") pod \"router-default-b88c6764-pc8q7\" (UID: \"4983d124-dedd-4eec-8bdd-7d87844e7eaf\") " pod="openshift-ingress/router-default-b88c6764-pc8q7"
Apr 23 16:37:44.858089 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:44.858063 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4983d124-dedd-4eec-8bdd-7d87844e7eaf-service-ca-bundle\") pod \"router-default-b88c6764-pc8q7\" (UID: \"4983d124-dedd-4eec-8bdd-7d87844e7eaf\") " pod="openshift-ingress/router-default-b88c6764-pc8q7"
Apr 23 16:37:44.859691 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:44.859653 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4983d124-dedd-4eec-8bdd-7d87844e7eaf-metrics-certs\") pod \"router-default-b88c6764-pc8q7\" (UID: \"4983d124-dedd-4eec-8bdd-7d87844e7eaf\") " pod="openshift-ingress/router-default-b88c6764-pc8q7"
Apr 23 16:37:45.091031 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:45.090999 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-lw29q\""
Apr 23 16:37:45.098807 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:45.098782 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-b88c6764-pc8q7"
Apr 23 16:37:45.219314 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:45.219283 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-b88c6764-pc8q7"]
Apr 23 16:37:45.222230 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:37:45.222203 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4983d124_dedd_4eec_8bdd_7d87844e7eaf.slice/crio-6bb22f0f528bc9808fb95ad8042135ba7d1fb79b24b6d30204f5722c6335ef35 WatchSource:0}: Error finding container 6bb22f0f528bc9808fb95ad8042135ba7d1fb79b24b6d30204f5722c6335ef35: Status 404 returned error can't find the container with id 6bb22f0f528bc9808fb95ad8042135ba7d1fb79b24b6d30204f5722c6335ef35
Apr 23 16:37:45.880905 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:45.880871 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-b88c6764-pc8q7" event={"ID":"4983d124-dedd-4eec-8bdd-7d87844e7eaf","Type":"ContainerStarted","Data":"b288683816377ccbe209bb56c6f9da8529fd667d667ac4b65c65c672196b0d2e"}
Apr 23 16:37:45.880905 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:45.880908 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-b88c6764-pc8q7" event={"ID":"4983d124-dedd-4eec-8bdd-7d87844e7eaf","Type":"ContainerStarted","Data":"6bb22f0f528bc9808fb95ad8042135ba7d1fb79b24b6d30204f5722c6335ef35"}
Apr 23 16:37:45.916677 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:45.916612 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-b88c6764-pc8q7" podStartSLOduration=33.916596007 podStartE2EDuration="33.916596007s" podCreationTimestamp="2026-04-23 16:37:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:37:45.916065634 +0000 UTC m=+151.253328623" watchObservedRunningTime="2026-04-23 16:37:45.916596007 +0000 UTC m=+151.253858970"
Apr 23 16:37:46.100016 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:46.099980 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-b88c6764-pc8q7"
Apr 23 16:37:46.102460 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:46.102429 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-b88c6764-pc8q7"
Apr 23 16:37:46.883934 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:46.883896 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-b88c6764-pc8q7"
Apr 23 16:37:46.885094 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:46.885069 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-b88c6764-pc8q7"
Apr 23 16:37:47.539892 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:47.539852 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9cght"]
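One detail worth noting in the router-default latency entry above: firstStartedPulling and lastFinishedPulling are "0001-01-01 00:00:00 +0000 UTC", which is Go's zero time.Time. No image pull happened for this pod (the image was already on the node), so podStartSLOduration equals the full E2E duration of 33.916596007s. A quick demonstration of the zero-value rendering and the resulting arithmetic:

package main

import (
	"fmt"
	"time"
)

// The router-default entry above reports pull timestamps equal to Go's
// zero time.Time, i.e. no image pull occurred, so the SLO duration is
// the whole E2E duration.
func main() {
	var never time.Time
	fmt.Println(never)          // 0001-01-01 00:00:00 +0000 UTC
	fmt.Println(never.IsZero()) // true

	e2e := 33916596007 * time.Nanosecond
	pull := never.Sub(never)    // zero-width pull window
	fmt.Println(e2e - pull)     // 33.916596007s, the logged podStartSLOduration
}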
Apr 23 16:37:47.543211 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:47.543192 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9cght"
Apr 23 16:37:47.545229 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:47.545206 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-nsxbj\""
Apr 23 16:37:47.545323 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:47.545245 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 23 16:37:47.550508 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:47.550486 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9cght"]
Apr 23 16:37:47.581495 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:47.581466 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/2259fa71-9594-41a1-afbb-cb25cb955aba-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-9cght\" (UID: \"2259fa71-9594-41a1-afbb-cb25cb955aba\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9cght"
Apr 23 16:37:47.682107 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:47.682066 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/2259fa71-9594-41a1-afbb-cb25cb955aba-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-9cght\" (UID: \"2259fa71-9594-41a1-afbb-cb25cb955aba\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9cght"
Apr 23 16:37:47.684493 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:47.684471 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/2259fa71-9594-41a1-afbb-cb25cb955aba-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-9cght\" (UID: \"2259fa71-9594-41a1-afbb-cb25cb955aba\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9cght"
Apr 23 16:37:47.852790 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:47.852762 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9cght"
Apr 23 16:37:47.967257 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:47.967225 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9cght"]
Apr 23 16:37:47.970356 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:37:47.970328 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2259fa71_9594_41a1_afbb_cb25cb955aba.slice/crio-4d9f58f7b1675c183fccdafb17ffd8a9dfc0847ba2f44c199c1abab2ecddab56 WatchSource:0}: Error finding container 4d9f58f7b1675c183fccdafb17ffd8a9dfc0847ba2f44c199c1abab2ecddab56: Status 404 returned error can't find the container with id 4d9f58f7b1675c183fccdafb17ffd8a9dfc0847ba2f44c199c1abab2ecddab56
Apr 23 16:37:48.890274 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:48.890229 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9cght" event={"ID":"2259fa71-9594-41a1-afbb-cb25cb955aba","Type":"ContainerStarted","Data":"4d9f58f7b1675c183fccdafb17ffd8a9dfc0847ba2f44c199c1abab2ecddab56"}
Apr 23 16:37:49.894031 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:49.893999 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9cght" event={"ID":"2259fa71-9594-41a1-afbb-cb25cb955aba","Type":"ContainerStarted","Data":"ade53948ec17ced5a74f3a81bdb54d9d144969e1a0fddccbddf7bdfd97973cdd"}
Apr 23 16:37:49.894442 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:49.894109 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9cght"
Apr 23 16:37:49.898682 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:49.898644 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9cght"
Apr 23 16:37:49.910950 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:49.910914 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9cght" podStartSLOduration=1.920704281 podStartE2EDuration="2.91090402s" podCreationTimestamp="2026-04-23 16:37:47 +0000 UTC" firstStartedPulling="2026-04-23 16:37:47.972211704 +0000 UTC m=+153.309474647" lastFinishedPulling="2026-04-23 16:37:48.962411444 +0000 UTC m=+154.299674386" observedRunningTime="2026-04-23 16:37:49.910207788 +0000 UTC m=+155.247470751" watchObservedRunningTime="2026-04-23 16:37:49.91090402 +0000 UTC m=+155.248166984"
Apr 23 16:37:50.112255 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:37:50.112209 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-jfzk8" podUID="e2c88812-4055-43aa-8e5a-25b432f9041d"
Apr 23 16:37:50.132500 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:37:50.132467 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-bsgxs" podUID="3bfd1dfe-900e-4260-b0fc-9dc05d2c604c"
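The two pod_workers errors just above (dns-default-jfzk8 and ingress-canary-bsgxs) are the other end of the mount-retry story: the pod worker waits a bounded time for all volumes to mount and gives up with context deadline exceeded when the deadline passes, then re-syncs later. The sketch below models that bounded wait. The roughly 2m3s budget mentioned in the comment matches the kubelet's attach/mount timeout as I understand upstream defaults, but treat it as an assumption; volumeMounted is a hypothetical stand-in for the real check, and the demo deliberately uses a short deadline so it returns quickly.

package main

import (
	"context"
	"errors"
	"fmt"
	"time"
)

// Models the bounded wait behind the "unmounted volumes=[...]: context
// deadline exceeded" errors above: poll until the volume shows up or the
// context expires.
func waitForMount(ctx context.Context, volumeMounted func() bool) error {
	tick := time.NewTicker(100 * time.Millisecond)
	defer tick.Stop()
	for {
		select {
		case <-ctx.Done():
			return fmt.Errorf("unmounted volumes=[metrics-tls]: %w", ctx.Err())
		case <-tick.C:
			if volumeMounted() {
				return nil
			}
		}
	}
}

func main() {
	// Short deadline for the demo; the kubelet's real budget is assumed
	// to be on the order of two minutes (~2m3s).
	ctx, cancel := context.WithTimeout(context.Background(), 2*time.Second)
	defer cancel()
	err := waitForMount(ctx, func() bool { return false }) // secret never appears
	fmt.Println(err)
	fmt.Println(errors.Is(err, context.DeadlineExceeded)) // true
}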
2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-72g6n"]
Apr 23 16:37:50.655143 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:50.655113 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-72g6n"]
Apr 23 16:37:50.655305 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:50.655253 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-72g6n"
Apr 23 16:37:50.659178 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:50.659150 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 23 16:37:50.659178 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:50.659161 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 23 16:37:50.659178 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:50.659166 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 23 16:37:50.659448 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:50.659434 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 23 16:37:50.659954 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:50.659935 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 23 16:37:50.660079 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:50.659959 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-c8lxz\""
Apr 23 16:37:50.706952 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:50.706917 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/a848667a-8bb1-4689-8537-bfdcf691c441-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-72g6n\" (UID: \"a848667a-8bb1-4689-8537-bfdcf691c441\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-72g6n"
Apr 23 16:37:50.707125 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:50.706986 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a848667a-8bb1-4689-8537-bfdcf691c441-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-72g6n\" (UID: \"a848667a-8bb1-4689-8537-bfdcf691c441\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-72g6n"
Apr 23 16:37:50.707125 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:50.707032 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a848667a-8bb1-4689-8537-bfdcf691c441-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-72g6n\" (UID: \"a848667a-8bb1-4689-8537-bfdcf691c441\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-72g6n"
Apr 23 16:37:50.707125 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:50.707097 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g8l6\" (UniqueName: \"kubernetes.io/projected/a848667a-8bb1-4689-8537-bfdcf691c441-kube-api-access-6g8l6\") pod \"prometheus-operator-5676c8c784-72g6n\" (UID: \"a848667a-8bb1-4689-8537-bfdcf691c441\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-72g6n"
Apr 23 16:37:50.808032 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:50.807988 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6g8l6\" (UniqueName: \"kubernetes.io/projected/a848667a-8bb1-4689-8537-bfdcf691c441-kube-api-access-6g8l6\") pod \"prometheus-operator-5676c8c784-72g6n\" (UID: \"a848667a-8bb1-4689-8537-bfdcf691c441\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-72g6n"
Apr 23 16:37:50.808227 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:50.808149 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/a848667a-8bb1-4689-8537-bfdcf691c441-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-72g6n\" (UID: \"a848667a-8bb1-4689-8537-bfdcf691c441\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-72g6n"
Apr 23 16:37:50.808227 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:50.808214 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a848667a-8bb1-4689-8537-bfdcf691c441-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-72g6n\" (UID: \"a848667a-8bb1-4689-8537-bfdcf691c441\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-72g6n"
Apr 23 16:37:50.808346 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:50.808263 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a848667a-8bb1-4689-8537-bfdcf691c441-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-72g6n\" (UID: \"a848667a-8bb1-4689-8537-bfdcf691c441\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-72g6n"
Apr 23 16:37:50.809127 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:50.809102 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a848667a-8bb1-4689-8537-bfdcf691c441-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-72g6n\" (UID: \"a848667a-8bb1-4689-8537-bfdcf691c441\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-72g6n"
Apr 23 16:37:50.810550 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:50.810529 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/a848667a-8bb1-4689-8537-bfdcf691c441-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-72g6n\" (UID: \"a848667a-8bb1-4689-8537-bfdcf691c441\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-72g6n"
Apr 23 16:37:50.810647 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:50.810618 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a848667a-8bb1-4689-8537-bfdcf691c441-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-72g6n\" (UID: \"a848667a-8bb1-4689-8537-bfdcf691c441\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-72g6n"
Apr 23 16:37:50.816961 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:50.816933 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g8l6\" (UniqueName: \"kubernetes.io/projected/a848667a-8bb1-4689-8537-bfdcf691c441-kube-api-access-6g8l6\") pod \"prometheus-operator-5676c8c784-72g6n\" (UID: \"a848667a-8bb1-4689-8537-bfdcf691c441\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-72g6n"
Apr 23 16:37:50.896872 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:50.896789 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jfzk8"
Apr 23 16:37:50.964109 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:50.964078 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-72g6n"
Apr 23 16:37:51.086429 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:51.086395 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-72g6n"]
Apr 23 16:37:51.090040 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:37:51.090016 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda848667a_8bb1_4689_8537_bfdcf691c441.slice/crio-c3292ae876dafdb9a414c13acbff568bd4642f389be4b55493362f48d960fc3e WatchSource:0}: Error finding container c3292ae876dafdb9a414c13acbff568bd4642f389be4b55493362f48d960fc3e: Status 404 returned error can't find the container with id c3292ae876dafdb9a414c13acbff568bd4642f389be4b55493362f48d960fc3e
Apr 23 16:37:51.319770 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:37:51.319688 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-k7n97" podUID="1eabe990-f610-4e94-8a89-7cff1c9a6a23"
Apr 23 16:37:51.901225 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:51.901183 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-72g6n" event={"ID":"a848667a-8bb1-4689-8537-bfdcf691c441","Type":"ContainerStarted","Data":"c3292ae876dafdb9a414c13acbff568bd4642f389be4b55493362f48d960fc3e"}
Apr 23 16:37:52.905085 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:52.905052 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-72g6n" event={"ID":"a848667a-8bb1-4689-8537-bfdcf691c441","Type":"ContainerStarted","Data":"04ca653d8d71c55cb2ed6a22192966fca3148ea38ee519794d6c2a06049c8489"}
Apr 23 16:37:52.905457 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:52.905093 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-72g6n" event={"ID":"a848667a-8bb1-4689-8537-bfdcf691c441","Type":"ContainerStarted","Data":"bab9bcc4ebd9b95df16460e74d23698bdeeaf5963e86a7b05ad3e1600c402c21"}
Apr 23 16:37:52.925374 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:52.925325 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-72g6n" podStartSLOduration=1.5214516059999998 podStartE2EDuration="2.925312904s" podCreationTimestamp="2026-04-23 16:37:50 +0000 UTC" firstStartedPulling="2026-04-23 16:37:51.091917265 +0000 UTC m=+156.429180208" lastFinishedPulling="2026-04-23 16:37:52.495778555 +0000 UTC m=+157.833041506" observedRunningTime="2026-04-23 16:37:52.923958592 +0000 UTC m=+158.261221557" watchObservedRunningTime="2026-04-23 16:37:52.925312904 +0000 UTC m=+158.262575867"
Apr 23 16:37:55.041518 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.041479 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e2c88812-4055-43aa-8e5a-25b432f9041d-metrics-tls\") pod \"dns-default-jfzk8\" (UID: \"e2c88812-4055-43aa-8e5a-25b432f9041d\") " pod="openshift-dns/dns-default-jfzk8"
Apr 23 16:37:55.041998 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.041616 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bfd1dfe-900e-4260-b0fc-9dc05d2c604c-cert\") pod \"ingress-canary-bsgxs\" (UID: \"3bfd1dfe-900e-4260-b0fc-9dc05d2c604c\") " pod="openshift-ingress-canary/ingress-canary-bsgxs"
Apr 23 16:37:55.044528 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.044500 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e2c88812-4055-43aa-8e5a-25b432f9041d-metrics-tls\") pod \"dns-default-jfzk8\" (UID: \"e2c88812-4055-43aa-8e5a-25b432f9041d\") " pod="openshift-dns/dns-default-jfzk8"
Apr 23 16:37:55.044682 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.044560 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bfd1dfe-900e-4260-b0fc-9dc05d2c604c-cert\") pod \"ingress-canary-bsgxs\" (UID: \"3bfd1dfe-900e-4260-b0fc-9dc05d2c604c\") " pod="openshift-ingress-canary/ingress-canary-bsgxs"
Apr 23 16:37:55.115528 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.115499 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-jmk6t\""
Apr 23 16:37:55.118010 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.117984 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jfzk8"
Apr 23 16:37:55.131440 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.131414 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-qhxfp"]
Apr 23 16:37:55.136908 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.136817 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-b7ln8"]
Apr 23 16:37:55.137043 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.137021 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-qhxfp"
Apr 23 16:37:55.140236 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.140214 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-b7ln8"
Apr 23 16:37:55.141021 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.140896 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 23 16:37:55.141021 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.140903 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-ggcgx\""
Apr 23 16:37:55.142001 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.141805 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 23 16:37:55.142001 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.141876 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 23 16:37:55.142406 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.142386 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 23 16:37:55.142471 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.142407 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 23 16:37:55.145335 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.144646 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-vplks\""
Apr 23 16:37:55.145335 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.144880 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 23 16:37:55.227753 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.227702 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-qhxfp"]
Apr 23 16:37:55.243186 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.243150 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7b73fcba-01db-4fdf-b70e-16248a785061-sys\") pod \"node-exporter-b7ln8\" (UID: \"7b73fcba-01db-4fdf-b70e-16248a785061\") " pod="openshift-monitoring/node-exporter-b7ln8"
Apr 23 16:37:55.243415 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.243201 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7b73fcba-01db-4fdf-b70e-16248a785061-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-b7ln8\" (UID: \"7b73fcba-01db-4fdf-b70e-16248a785061\") " pod="openshift-monitoring/node-exporter-b7ln8"
Apr 23 16:37:55.243415 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.243240 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7b73fcba-01db-4fdf-b70e-16248a785061-node-exporter-textfile\") pod \"node-exporter-b7ln8\" (UID: \"7b73fcba-01db-4fdf-b70e-16248a785061\") " pod="openshift-monitoring/node-exporter-b7ln8"
Apr 23 16:37:55.243415 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.243318 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/711d34ad-5d72-4eec-b9be-535d9896a2f6-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-qhxfp\" (UID: \"711d34ad-5d72-4eec-b9be-535d9896a2f6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qhxfp"
Apr 23 16:37:55.243415 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.243369 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/711d34ad-5d72-4eec-b9be-535d9896a2f6-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-qhxfp\" (UID: \"711d34ad-5d72-4eec-b9be-535d9896a2f6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qhxfp"
Apr 23 16:37:55.243415 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.243404 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7b73fcba-01db-4fdf-b70e-16248a785061-node-exporter-tls\") pod \"node-exporter-b7ln8\" (UID: \"7b73fcba-01db-4fdf-b70e-16248a785061\") " pod="openshift-monitoring/node-exporter-b7ln8"
Apr 23 16:37:55.243760 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.243435 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f7nn\" (UniqueName: \"kubernetes.io/projected/711d34ad-5d72-4eec-b9be-535d9896a2f6-kube-api-access-6f7nn\") pod \"kube-state-metrics-69db897b98-qhxfp\" (UID: \"711d34ad-5d72-4eec-b9be-535d9896a2f6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qhxfp"
Apr 23 16:37:55.243760 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.243488 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c58n4\" (UniqueName: \"kubernetes.io/projected/7b73fcba-01db-4fdf-b70e-16248a785061-kube-api-access-c58n4\") pod \"node-exporter-b7ln8\" (UID: \"7b73fcba-01db-4fdf-b70e-16248a785061\") " pod="openshift-monitoring/node-exporter-b7ln8"
Apr 23 16:37:55.243760 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.243525 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7b73fcba-01db-4fdf-b70e-16248a785061-node-exporter-wtmp\") pod \"node-exporter-b7ln8\" (UID: \"7b73fcba-01db-4fdf-b70e-16248a785061\") " pod="openshift-monitoring/node-exporter-b7ln8"
Apr 23 16:37:55.243760 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.243567 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/711d34ad-5d72-4eec-b9be-535d9896a2f6-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-qhxfp\" (UID: \"711d34ad-5d72-4eec-b9be-535d9896a2f6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qhxfp"
Apr 23 16:37:55.243760 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.243598 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/711d34ad-5d72-4eec-b9be-535d9896a2f6-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-qhxfp\" (UID: \"711d34ad-5d72-4eec-b9be-535d9896a2f6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qhxfp"
Apr 23 16:37:55.243760 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.243692 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7b73fcba-01db-4fdf-b70e-16248a785061-metrics-client-ca\") pod \"node-exporter-b7ln8\" (UID: \"7b73fcba-01db-4fdf-b70e-16248a785061\") " pod="openshift-monitoring/node-exporter-b7ln8"
Apr 23 16:37:55.244072 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.243759 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7b73fcba-01db-4fdf-b70e-16248a785061-node-exporter-accelerators-collector-config\") pod \"node-exporter-b7ln8\" (UID: \"7b73fcba-01db-4fdf-b70e-16248a785061\") " pod="openshift-monitoring/node-exporter-b7ln8"
Apr 23 16:37:55.244072 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.243812 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7b73fcba-01db-4fdf-b70e-16248a785061-root\") pod \"node-exporter-b7ln8\" (UID: \"7b73fcba-01db-4fdf-b70e-16248a785061\") " pod="openshift-monitoring/node-exporter-b7ln8"
Apr 23 16:37:55.244072 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.243868 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/711d34ad-5d72-4eec-b9be-535d9896a2f6-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-qhxfp\" (UID: \"711d34ad-5d72-4eec-b9be-535d9896a2f6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qhxfp"
Apr 23 16:37:55.260194 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.260171 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6674d685f7-kkh7x"]
Apr 23 16:37:55.263391 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.263372 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6674d685f7-kkh7x"
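The reflector.go:430 "Caches populated" entries throughout this window (dns-dockercfg-jmk6t, the kube-state-metrics and node-exporter secrets above, the console objects just below) mark the kubelet's list-watch caches syncing for each Secret and ConfigMap a pending pod references; volume setup waits on them. The kubelet runs one narrowly-scoped reflector per object, but the same list-watch/cache-sync pattern can be sketched with client-go's namespace-wide informers. A rough illustration only, under that substitution; all names here are ours, not the kubelet's mechanism:

```go
package main

import (
	"context"
	"fmt"
	"time"

	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/cache"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Build a client from the local kubeconfig (adjust path as needed).
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	// One informer per resource type, scoped to a namespace; the kubelet
	// instead scopes one reflector per individual Secret or ConfigMap.
	factory := informers.NewSharedInformerFactoryWithOptions(
		cs, 10*time.Minute, informers.WithNamespace("openshift-monitoring"))
	secrets := factory.Core().V1().Secrets().Informer()

	ctx, cancel := context.WithTimeout(context.Background(), 30*time.Second)
	defer cancel()
	factory.Start(ctx.Done())
	if !cache.WaitForCacheSync(ctx.Done(), secrets.HasSynced) {
		panic("cache never synced")
	}
	// The moral equivalent of a "Caches populated" log line.
	fmt.Println("caches populated:", len(secrets.GetStore().List()), "secrets")
}
```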
Need to start a new one" pod="openshift-console/console-6674d685f7-kkh7x" Apr 23 16:37:55.271428 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.266437 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 23 16:37:55.271428 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.266852 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 23 16:37:55.271428 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.267544 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 23 16:37:55.271428 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.267650 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 23 16:37:55.271428 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.268166 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 23 16:37:55.271428 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.268566 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 23 16:37:55.271428 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.269148 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-q82nr\"" Apr 23 16:37:55.274353 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.274287 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 23 16:37:55.278988 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.278967 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 23 16:37:55.280217 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.279420 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6674d685f7-kkh7x"] Apr 23 16:37:55.281445 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.281353 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jfzk8"] Apr 23 16:37:55.285087 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:37:55.285054 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2c88812_4055_43aa_8e5a_25b432f9041d.slice/crio-b60e37a4b7acdb523e6c42d3e17853047ffdba44a59eb6b973233c2e2745de47 WatchSource:0}: Error finding container b60e37a4b7acdb523e6c42d3e17853047ffdba44a59eb6b973233c2e2745de47: Status 404 returned error can't find the container with id b60e37a4b7acdb523e6c42d3e17853047ffdba44a59eb6b973233c2e2745de47 Apr 23 16:37:55.344408 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.344387 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7b73fcba-01db-4fdf-b70e-16248a785061-sys\") pod \"node-exporter-b7ln8\" (UID: \"7b73fcba-01db-4fdf-b70e-16248a785061\") " pod="openshift-monitoring/node-exporter-b7ln8" Apr 23 16:37:55.344549 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.344419 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c5ea87ff-a22d-4952-8409-ae584643f659-service-ca\") pod 
\"console-6674d685f7-kkh7x\" (UID: \"c5ea87ff-a22d-4952-8409-ae584643f659\") " pod="openshift-console/console-6674d685f7-kkh7x" Apr 23 16:37:55.344549 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.344442 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7b73fcba-01db-4fdf-b70e-16248a785061-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-b7ln8\" (UID: \"7b73fcba-01db-4fdf-b70e-16248a785061\") " pod="openshift-monitoring/node-exporter-b7ln8" Apr 23 16:37:55.344549 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.344460 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5ea87ff-a22d-4952-8409-ae584643f659-console-serving-cert\") pod \"console-6674d685f7-kkh7x\" (UID: \"c5ea87ff-a22d-4952-8409-ae584643f659\") " pod="openshift-console/console-6674d685f7-kkh7x" Apr 23 16:37:55.344549 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.344490 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7b73fcba-01db-4fdf-b70e-16248a785061-sys\") pod \"node-exporter-b7ln8\" (UID: \"7b73fcba-01db-4fdf-b70e-16248a785061\") " pod="openshift-monitoring/node-exporter-b7ln8" Apr 23 16:37:55.344549 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.344492 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7b73fcba-01db-4fdf-b70e-16248a785061-node-exporter-textfile\") pod \"node-exporter-b7ln8\" (UID: \"7b73fcba-01db-4fdf-b70e-16248a785061\") " pod="openshift-monitoring/node-exporter-b7ln8" Apr 23 16:37:55.344549 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.344538 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/711d34ad-5d72-4eec-b9be-535d9896a2f6-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-qhxfp\" (UID: \"711d34ad-5d72-4eec-b9be-535d9896a2f6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qhxfp" Apr 23 16:37:55.344851 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.344555 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/711d34ad-5d72-4eec-b9be-535d9896a2f6-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-qhxfp\" (UID: \"711d34ad-5d72-4eec-b9be-535d9896a2f6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qhxfp" Apr 23 16:37:55.344851 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.344573 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7b73fcba-01db-4fdf-b70e-16248a785061-node-exporter-tls\") pod \"node-exporter-b7ln8\" (UID: \"7b73fcba-01db-4fdf-b70e-16248a785061\") " pod="openshift-monitoring/node-exporter-b7ln8" Apr 23 16:37:55.344851 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.344594 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6f7nn\" (UniqueName: \"kubernetes.io/projected/711d34ad-5d72-4eec-b9be-535d9896a2f6-kube-api-access-6f7nn\") pod \"kube-state-metrics-69db897b98-qhxfp\" (UID: \"711d34ad-5d72-4eec-b9be-535d9896a2f6\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-qhxfp" Apr 23 16:37:55.344851 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.344614 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c58n4\" (UniqueName: \"kubernetes.io/projected/7b73fcba-01db-4fdf-b70e-16248a785061-kube-api-access-c58n4\") pod \"node-exporter-b7ln8\" (UID: \"7b73fcba-01db-4fdf-b70e-16248a785061\") " pod="openshift-monitoring/node-exporter-b7ln8" Apr 23 16:37:55.344851 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.344749 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5ea87ff-a22d-4952-8409-ae584643f659-trusted-ca-bundle\") pod \"console-6674d685f7-kkh7x\" (UID: \"c5ea87ff-a22d-4952-8409-ae584643f659\") " pod="openshift-console/console-6674d685f7-kkh7x" Apr 23 16:37:55.344851 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.344802 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7b73fcba-01db-4fdf-b70e-16248a785061-node-exporter-wtmp\") pod \"node-exporter-b7ln8\" (UID: \"7b73fcba-01db-4fdf-b70e-16248a785061\") " pod="openshift-monitoring/node-exporter-b7ln8" Apr 23 16:37:55.344851 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.344838 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/711d34ad-5d72-4eec-b9be-535d9896a2f6-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-qhxfp\" (UID: \"711d34ad-5d72-4eec-b9be-535d9896a2f6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qhxfp" Apr 23 16:37:55.345224 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.344875 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/711d34ad-5d72-4eec-b9be-535d9896a2f6-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-qhxfp\" (UID: \"711d34ad-5d72-4eec-b9be-535d9896a2f6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qhxfp" Apr 23 16:37:55.345224 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.344941 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlgg2\" (UniqueName: \"kubernetes.io/projected/c5ea87ff-a22d-4952-8409-ae584643f659-kube-api-access-mlgg2\") pod \"console-6674d685f7-kkh7x\" (UID: \"c5ea87ff-a22d-4952-8409-ae584643f659\") " pod="openshift-console/console-6674d685f7-kkh7x" Apr 23 16:37:55.345224 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.344978 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c5ea87ff-a22d-4952-8409-ae584643f659-oauth-serving-cert\") pod \"console-6674d685f7-kkh7x\" (UID: \"c5ea87ff-a22d-4952-8409-ae584643f659\") " pod="openshift-console/console-6674d685f7-kkh7x" Apr 23 16:37:55.345224 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.345006 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7b73fcba-01db-4fdf-b70e-16248a785061-node-exporter-wtmp\") pod \"node-exporter-b7ln8\" (UID: \"7b73fcba-01db-4fdf-b70e-16248a785061\") " pod="openshift-monitoring/node-exporter-b7ln8" 
Apr 23 16:37:55.345224 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.345014 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7b73fcba-01db-4fdf-b70e-16248a785061-metrics-client-ca\") pod \"node-exporter-b7ln8\" (UID: \"7b73fcba-01db-4fdf-b70e-16248a785061\") " pod="openshift-monitoring/node-exporter-b7ln8"
Apr 23 16:37:55.345224 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.345040 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c5ea87ff-a22d-4952-8409-ae584643f659-console-oauth-config\") pod \"console-6674d685f7-kkh7x\" (UID: \"c5ea87ff-a22d-4952-8409-ae584643f659\") " pod="openshift-console/console-6674d685f7-kkh7x"
Apr 23 16:37:55.345224 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.345077 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7b73fcba-01db-4fdf-b70e-16248a785061-node-exporter-accelerators-collector-config\") pod \"node-exporter-b7ln8\" (UID: \"7b73fcba-01db-4fdf-b70e-16248a785061\") " pod="openshift-monitoring/node-exporter-b7ln8"
Apr 23 16:37:55.345224 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.345112 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7b73fcba-01db-4fdf-b70e-16248a785061-root\") pod \"node-exporter-b7ln8\" (UID: \"7b73fcba-01db-4fdf-b70e-16248a785061\") " pod="openshift-monitoring/node-exporter-b7ln8"
Apr 23 16:37:55.345224 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.345139 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c5ea87ff-a22d-4952-8409-ae584643f659-console-config\") pod \"console-6674d685f7-kkh7x\" (UID: \"c5ea87ff-a22d-4952-8409-ae584643f659\") " pod="openshift-console/console-6674d685f7-kkh7x"
Apr 23 16:37:55.345224 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.345170 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/711d34ad-5d72-4eec-b9be-535d9896a2f6-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-qhxfp\" (UID: \"711d34ad-5d72-4eec-b9be-535d9896a2f6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qhxfp"
Apr 23 16:37:55.345770 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.345306 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/711d34ad-5d72-4eec-b9be-535d9896a2f6-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-qhxfp\" (UID: \"711d34ad-5d72-4eec-b9be-535d9896a2f6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qhxfp"
Apr 23 16:37:55.345770 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.344841 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7b73fcba-01db-4fdf-b70e-16248a785061-node-exporter-textfile\") pod \"node-exporter-b7ln8\" (UID: \"7b73fcba-01db-4fdf-b70e-16248a785061\") " pod="openshift-monitoring/node-exporter-b7ln8"
Apr 23 16:37:55.345770 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.345645 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/711d34ad-5d72-4eec-b9be-535d9896a2f6-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-qhxfp\" (UID: \"711d34ad-5d72-4eec-b9be-535d9896a2f6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qhxfp"
Apr 23 16:37:55.345770 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.345735 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7b73fcba-01db-4fdf-b70e-16248a785061-root\") pod \"node-exporter-b7ln8\" (UID: \"7b73fcba-01db-4fdf-b70e-16248a785061\") " pod="openshift-monitoring/node-exporter-b7ln8"
Apr 23 16:37:55.345973 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.345885 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/711d34ad-5d72-4eec-b9be-535d9896a2f6-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-qhxfp\" (UID: \"711d34ad-5d72-4eec-b9be-535d9896a2f6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qhxfp"
Apr 23 16:37:55.346045 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.346023 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7b73fcba-01db-4fdf-b70e-16248a785061-node-exporter-accelerators-collector-config\") pod \"node-exporter-b7ln8\" (UID: \"7b73fcba-01db-4fdf-b70e-16248a785061\") " pod="openshift-monitoring/node-exporter-b7ln8"
Apr 23 16:37:55.346409 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.346387 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7b73fcba-01db-4fdf-b70e-16248a785061-metrics-client-ca\") pod \"node-exporter-b7ln8\" (UID: \"7b73fcba-01db-4fdf-b70e-16248a785061\") " pod="openshift-monitoring/node-exporter-b7ln8"
Apr 23 16:37:55.347139 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.347112 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7b73fcba-01db-4fdf-b70e-16248a785061-node-exporter-tls\") pod \"node-exporter-b7ln8\" (UID: \"7b73fcba-01db-4fdf-b70e-16248a785061\") " pod="openshift-monitoring/node-exporter-b7ln8"
Apr 23 16:37:55.347229 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.347119 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/711d34ad-5d72-4eec-b9be-535d9896a2f6-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-qhxfp\" (UID: \"711d34ad-5d72-4eec-b9be-535d9896a2f6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qhxfp"
Apr 23 16:37:55.347229 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.347172 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7b73fcba-01db-4fdf-b70e-16248a785061-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-b7ln8\" (UID: \"7b73fcba-01db-4fdf-b70e-16248a785061\") " pod="openshift-monitoring/node-exporter-b7ln8"
Apr 23 16:37:55.347329 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.347312 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/711d34ad-5d72-4eec-b9be-535d9896a2f6-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-qhxfp\" (UID: \"711d34ad-5d72-4eec-b9be-535d9896a2f6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qhxfp"
Apr 23 16:37:55.352727 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.352653 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c58n4\" (UniqueName: \"kubernetes.io/projected/7b73fcba-01db-4fdf-b70e-16248a785061-kube-api-access-c58n4\") pod \"node-exporter-b7ln8\" (UID: \"7b73fcba-01db-4fdf-b70e-16248a785061\") " pod="openshift-monitoring/node-exporter-b7ln8"
Apr 23 16:37:55.352932 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.352913 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f7nn\" (UniqueName: \"kubernetes.io/projected/711d34ad-5d72-4eec-b9be-535d9896a2f6-kube-api-access-6f7nn\") pod \"kube-state-metrics-69db897b98-qhxfp\" (UID: \"711d34ad-5d72-4eec-b9be-535d9896a2f6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qhxfp"
Apr 23 16:37:55.446519 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.446474 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mlgg2\" (UniqueName: \"kubernetes.io/projected/c5ea87ff-a22d-4952-8409-ae584643f659-kube-api-access-mlgg2\") pod \"console-6674d685f7-kkh7x\" (UID: \"c5ea87ff-a22d-4952-8409-ae584643f659\") " pod="openshift-console/console-6674d685f7-kkh7x"
Apr 23 16:37:55.446519 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.446523 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c5ea87ff-a22d-4952-8409-ae584643f659-oauth-serving-cert\") pod \"console-6674d685f7-kkh7x\" (UID: \"c5ea87ff-a22d-4952-8409-ae584643f659\") " pod="openshift-console/console-6674d685f7-kkh7x"
Apr 23 16:37:55.446750 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.446628 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c5ea87ff-a22d-4952-8409-ae584643f659-console-oauth-config\") pod \"console-6674d685f7-kkh7x\" (UID: \"c5ea87ff-a22d-4952-8409-ae584643f659\") " pod="openshift-console/console-6674d685f7-kkh7x"
Apr 23 16:37:55.446750 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.446687 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c5ea87ff-a22d-4952-8409-ae584643f659-console-config\") pod \"console-6674d685f7-kkh7x\" (UID: \"c5ea87ff-a22d-4952-8409-ae584643f659\") " pod="openshift-console/console-6674d685f7-kkh7x"
Apr 23 16:37:55.446750 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.446727 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c5ea87ff-a22d-4952-8409-ae584643f659-service-ca\") pod \"console-6674d685f7-kkh7x\" (UID: \"c5ea87ff-a22d-4952-8409-ae584643f659\") " pod="openshift-console/console-6674d685f7-kkh7x"
Apr 23 16:37:55.446905 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.446757 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5ea87ff-a22d-4952-8409-ae584643f659-console-serving-cert\") pod \"console-6674d685f7-kkh7x\" (UID: \"c5ea87ff-a22d-4952-8409-ae584643f659\") " pod="openshift-console/console-6674d685f7-kkh7x"
Apr 23 16:37:55.446905 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.446803 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5ea87ff-a22d-4952-8409-ae584643f659-trusted-ca-bundle\") pod \"console-6674d685f7-kkh7x\" (UID: \"c5ea87ff-a22d-4952-8409-ae584643f659\") " pod="openshift-console/console-6674d685f7-kkh7x"
Apr 23 16:37:55.447392 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.447360 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c5ea87ff-a22d-4952-8409-ae584643f659-oauth-serving-cert\") pod \"console-6674d685f7-kkh7x\" (UID: \"c5ea87ff-a22d-4952-8409-ae584643f659\") " pod="openshift-console/console-6674d685f7-kkh7x"
Apr 23 16:37:55.447743 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.447720 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c5ea87ff-a22d-4952-8409-ae584643f659-service-ca\") pod \"console-6674d685f7-kkh7x\" (UID: \"c5ea87ff-a22d-4952-8409-ae584643f659\") " pod="openshift-console/console-6674d685f7-kkh7x"
Apr 23 16:37:55.447858 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.447779 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c5ea87ff-a22d-4952-8409-ae584643f659-console-config\") pod \"console-6674d685f7-kkh7x\" (UID: \"c5ea87ff-a22d-4952-8409-ae584643f659\") " pod="openshift-console/console-6674d685f7-kkh7x"
Apr 23 16:37:55.448218 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.448194 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5ea87ff-a22d-4952-8409-ae584643f659-trusted-ca-bundle\") pod \"console-6674d685f7-kkh7x\" (UID: \"c5ea87ff-a22d-4952-8409-ae584643f659\") " pod="openshift-console/console-6674d685f7-kkh7x"
Apr 23 16:37:55.449190 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.449161 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c5ea87ff-a22d-4952-8409-ae584643f659-console-oauth-config\") pod \"console-6674d685f7-kkh7x\" (UID: \"c5ea87ff-a22d-4952-8409-ae584643f659\") " pod="openshift-console/console-6674d685f7-kkh7x"
Apr 23 16:37:55.449397 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.449377 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5ea87ff-a22d-4952-8409-ae584643f659-console-serving-cert\") pod \"console-6674d685f7-kkh7x\" (UID: \"c5ea87ff-a22d-4952-8409-ae584643f659\") " pod="openshift-console/console-6674d685f7-kkh7x"
Apr 23 16:37:55.450115 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.450101 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-qhxfp"
Apr 23 16:37:55.455304 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.455285 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlgg2\" (UniqueName: \"kubernetes.io/projected/c5ea87ff-a22d-4952-8409-ae584643f659-kube-api-access-mlgg2\") pod \"console-6674d685f7-kkh7x\" (UID: \"c5ea87ff-a22d-4952-8409-ae584643f659\") " pod="openshift-console/console-6674d685f7-kkh7x"
Apr 23 16:37:55.456621 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.456602 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-b7ln8"
Apr 23 16:37:55.463758 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:37:55.463711 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b73fcba_01db_4fdf_b70e_16248a785061.slice/crio-3b1a878c97aeb0520bd7f4749657dfdda9d8859127c83d6bd31beb9fc103f6bb WatchSource:0}: Error finding container 3b1a878c97aeb0520bd7f4749657dfdda9d8859127c83d6bd31beb9fc103f6bb: Status 404 returned error can't find the container with id 3b1a878c97aeb0520bd7f4749657dfdda9d8859127c83d6bd31beb9fc103f6bb
Apr 23 16:37:55.574787 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.574758 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-qhxfp"]
Apr 23 16:37:55.576859 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:37:55.576833 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod711d34ad_5d72_4eec_b9be_535d9896a2f6.slice/crio-fe4c16c17aa022366685e7258a8ad77edf32c25f6464a7dad3697271caf5444a WatchSource:0}: Error finding container fe4c16c17aa022366685e7258a8ad77edf32c25f6464a7dad3697271caf5444a: Status 404 returned error can't find the container with id fe4c16c17aa022366685e7258a8ad77edf32c25f6464a7dad3697271caf5444a
Apr 23 16:37:55.581107 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.581088 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6674d685f7-kkh7x"
Apr 23 16:37:55.705296 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.705262 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6674d685f7-kkh7x"]
Apr 23 16:37:55.707979 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:37:55.707955 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5ea87ff_a22d_4952_8409_ae584643f659.slice/crio-d8ea10106081da33f9b63522cf345c76c7513d8e41ddaddac559d9befb093951 WatchSource:0}: Error finding container d8ea10106081da33f9b63522cf345c76c7513d8e41ddaddac559d9befb093951: Status 404 returned error can't find the container with id d8ea10106081da33f9b63522cf345c76c7513d8e41ddaddac559d9befb093951
Apr 23 16:37:55.914184 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.914095 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-qhxfp" event={"ID":"711d34ad-5d72-4eec-b9be-535d9896a2f6","Type":"ContainerStarted","Data":"fe4c16c17aa022366685e7258a8ad77edf32c25f6464a7dad3697271caf5444a"}
Apr 23 16:37:55.915265 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.915232 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-b7ln8" event={"ID":"7b73fcba-01db-4fdf-b70e-16248a785061","Type":"ContainerStarted","Data":"3b1a878c97aeb0520bd7f4749657dfdda9d8859127c83d6bd31beb9fc103f6bb"}
Apr 23 16:37:55.916490 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.916462 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6674d685f7-kkh7x" event={"ID":"c5ea87ff-a22d-4952-8409-ae584643f659","Type":"ContainerStarted","Data":"d8ea10106081da33f9b63522cf345c76c7513d8e41ddaddac559d9befb093951"}
Apr 23 16:37:55.917513 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:55.917492 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jfzk8" event={"ID":"e2c88812-4055-43aa-8e5a-25b432f9041d","Type":"ContainerStarted","Data":"b60e37a4b7acdb523e6c42d3e17853047ffdba44a59eb6b973233c2e2745de47"}
Apr 23 16:37:57.087101 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.087067 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-84956f9864-64zbb"]
Apr 23 16:37:57.091618 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.091558 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-84956f9864-64zbb"
Need to start a new one" pod="openshift-monitoring/thanos-querier-84956f9864-64zbb" Apr 23 16:37:57.093476 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.093453 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 23 16:37:57.093895 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.093870 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-nr7s5\"" Apr 23 16:37:57.094106 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.094073 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 23 16:37:57.094188 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.094149 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 23 16:37:57.094753 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.094708 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 23 16:37:57.094753 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.094735 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-7vhicti20ekee\"" Apr 23 16:37:57.095000 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.094983 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 23 16:37:57.104611 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.104587 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-84956f9864-64zbb"] Apr 23 16:37:57.168338 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.168301 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/086aa7f2-f4e0-44d1-8d28-be8fba79787b-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-84956f9864-64zbb\" (UID: \"086aa7f2-f4e0-44d1-8d28-be8fba79787b\") " pod="openshift-monitoring/thanos-querier-84956f9864-64zbb" Apr 23 16:37:57.168530 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.168350 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/086aa7f2-f4e0-44d1-8d28-be8fba79787b-metrics-client-ca\") pod \"thanos-querier-84956f9864-64zbb\" (UID: \"086aa7f2-f4e0-44d1-8d28-be8fba79787b\") " pod="openshift-monitoring/thanos-querier-84956f9864-64zbb" Apr 23 16:37:57.168530 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.168481 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mw2w\" (UniqueName: \"kubernetes.io/projected/086aa7f2-f4e0-44d1-8d28-be8fba79787b-kube-api-access-8mw2w\") pod \"thanos-querier-84956f9864-64zbb\" (UID: \"086aa7f2-f4e0-44d1-8d28-be8fba79787b\") " pod="openshift-monitoring/thanos-querier-84956f9864-64zbb" Apr 23 16:37:57.168647 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.168526 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: 
\"kubernetes.io/secret/086aa7f2-f4e0-44d1-8d28-be8fba79787b-secret-thanos-querier-tls\") pod \"thanos-querier-84956f9864-64zbb\" (UID: \"086aa7f2-f4e0-44d1-8d28-be8fba79787b\") " pod="openshift-monitoring/thanos-querier-84956f9864-64zbb" Apr 23 16:37:57.168647 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.168557 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/086aa7f2-f4e0-44d1-8d28-be8fba79787b-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-84956f9864-64zbb\" (UID: \"086aa7f2-f4e0-44d1-8d28-be8fba79787b\") " pod="openshift-monitoring/thanos-querier-84956f9864-64zbb" Apr 23 16:37:57.168647 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.168586 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/086aa7f2-f4e0-44d1-8d28-be8fba79787b-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-84956f9864-64zbb\" (UID: \"086aa7f2-f4e0-44d1-8d28-be8fba79787b\") " pod="openshift-monitoring/thanos-querier-84956f9864-64zbb" Apr 23 16:37:57.168647 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.168631 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/086aa7f2-f4e0-44d1-8d28-be8fba79787b-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-84956f9864-64zbb\" (UID: \"086aa7f2-f4e0-44d1-8d28-be8fba79787b\") " pod="openshift-monitoring/thanos-querier-84956f9864-64zbb" Apr 23 16:37:57.168890 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.168705 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/086aa7f2-f4e0-44d1-8d28-be8fba79787b-secret-grpc-tls\") pod \"thanos-querier-84956f9864-64zbb\" (UID: \"086aa7f2-f4e0-44d1-8d28-be8fba79787b\") " pod="openshift-monitoring/thanos-querier-84956f9864-64zbb" Apr 23 16:37:57.269772 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.269733 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/086aa7f2-f4e0-44d1-8d28-be8fba79787b-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-84956f9864-64zbb\" (UID: \"086aa7f2-f4e0-44d1-8d28-be8fba79787b\") " pod="openshift-monitoring/thanos-querier-84956f9864-64zbb" Apr 23 16:37:57.269917 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.269791 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/086aa7f2-f4e0-44d1-8d28-be8fba79787b-metrics-client-ca\") pod \"thanos-querier-84956f9864-64zbb\" (UID: \"086aa7f2-f4e0-44d1-8d28-be8fba79787b\") " pod="openshift-monitoring/thanos-querier-84956f9864-64zbb" Apr 23 16:37:57.269917 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.269867 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8mw2w\" (UniqueName: \"kubernetes.io/projected/086aa7f2-f4e0-44d1-8d28-be8fba79787b-kube-api-access-8mw2w\") pod \"thanos-querier-84956f9864-64zbb\" (UID: \"086aa7f2-f4e0-44d1-8d28-be8fba79787b\") " pod="openshift-monitoring/thanos-querier-84956f9864-64zbb" Apr 23 16:37:57.270033 ip-10-0-128-102 
kubenswrapper[2569]: I0423 16:37:57.269903 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/086aa7f2-f4e0-44d1-8d28-be8fba79787b-secret-thanos-querier-tls\") pod \"thanos-querier-84956f9864-64zbb\" (UID: \"086aa7f2-f4e0-44d1-8d28-be8fba79787b\") " pod="openshift-monitoring/thanos-querier-84956f9864-64zbb" Apr 23 16:37:57.270033 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.269948 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/086aa7f2-f4e0-44d1-8d28-be8fba79787b-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-84956f9864-64zbb\" (UID: \"086aa7f2-f4e0-44d1-8d28-be8fba79787b\") " pod="openshift-monitoring/thanos-querier-84956f9864-64zbb" Apr 23 16:37:57.270033 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.269973 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/086aa7f2-f4e0-44d1-8d28-be8fba79787b-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-84956f9864-64zbb\" (UID: \"086aa7f2-f4e0-44d1-8d28-be8fba79787b\") " pod="openshift-monitoring/thanos-querier-84956f9864-64zbb" Apr 23 16:37:57.270178 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.270035 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/086aa7f2-f4e0-44d1-8d28-be8fba79787b-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-84956f9864-64zbb\" (UID: \"086aa7f2-f4e0-44d1-8d28-be8fba79787b\") " pod="openshift-monitoring/thanos-querier-84956f9864-64zbb" Apr 23 16:37:57.270178 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.270065 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/086aa7f2-f4e0-44d1-8d28-be8fba79787b-secret-grpc-tls\") pod \"thanos-querier-84956f9864-64zbb\" (UID: \"086aa7f2-f4e0-44d1-8d28-be8fba79787b\") " pod="openshift-monitoring/thanos-querier-84956f9864-64zbb" Apr 23 16:37:57.270705 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.270523 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/086aa7f2-f4e0-44d1-8d28-be8fba79787b-metrics-client-ca\") pod \"thanos-querier-84956f9864-64zbb\" (UID: \"086aa7f2-f4e0-44d1-8d28-be8fba79787b\") " pod="openshift-monitoring/thanos-querier-84956f9864-64zbb" Apr 23 16:37:57.273008 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.272944 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/086aa7f2-f4e0-44d1-8d28-be8fba79787b-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-84956f9864-64zbb\" (UID: \"086aa7f2-f4e0-44d1-8d28-be8fba79787b\") " pod="openshift-monitoring/thanos-querier-84956f9864-64zbb" Apr 23 16:37:57.273008 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.273000 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/086aa7f2-f4e0-44d1-8d28-be8fba79787b-secret-grpc-tls\") pod \"thanos-querier-84956f9864-64zbb\" (UID: \"086aa7f2-f4e0-44d1-8d28-be8fba79787b\") " 
pod="openshift-monitoring/thanos-querier-84956f9864-64zbb" Apr 23 16:37:57.273142 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.273058 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/086aa7f2-f4e0-44d1-8d28-be8fba79787b-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-84956f9864-64zbb\" (UID: \"086aa7f2-f4e0-44d1-8d28-be8fba79787b\") " pod="openshift-monitoring/thanos-querier-84956f9864-64zbb" Apr 23 16:37:57.273409 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.273384 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/086aa7f2-f4e0-44d1-8d28-be8fba79787b-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-84956f9864-64zbb\" (UID: \"086aa7f2-f4e0-44d1-8d28-be8fba79787b\") " pod="openshift-monitoring/thanos-querier-84956f9864-64zbb" Apr 23 16:37:57.273500 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.273391 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/086aa7f2-f4e0-44d1-8d28-be8fba79787b-secret-thanos-querier-tls\") pod \"thanos-querier-84956f9864-64zbb\" (UID: \"086aa7f2-f4e0-44d1-8d28-be8fba79787b\") " pod="openshift-monitoring/thanos-querier-84956f9864-64zbb" Apr 23 16:37:57.273854 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.273832 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/086aa7f2-f4e0-44d1-8d28-be8fba79787b-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-84956f9864-64zbb\" (UID: \"086aa7f2-f4e0-44d1-8d28-be8fba79787b\") " pod="openshift-monitoring/thanos-querier-84956f9864-64zbb" Apr 23 16:37:57.278090 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.278053 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mw2w\" (UniqueName: \"kubernetes.io/projected/086aa7f2-f4e0-44d1-8d28-be8fba79787b-kube-api-access-8mw2w\") pod \"thanos-querier-84956f9864-64zbb\" (UID: \"086aa7f2-f4e0-44d1-8d28-be8fba79787b\") " pod="openshift-monitoring/thanos-querier-84956f9864-64zbb" Apr 23 16:37:57.404465 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.404432 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-84956f9864-64zbb" Apr 23 16:37:57.603324 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.603206 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-84956f9864-64zbb"] Apr 23 16:37:57.642906 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:37:57.642871 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod086aa7f2_f4e0_44d1_8d28_be8fba79787b.slice/crio-1a39e6cc5263c920d676b199d4c7fb1fd0768ac1dfe38922ffb048ad07c428b5 WatchSource:0}: Error finding container 1a39e6cc5263c920d676b199d4c7fb1fd0768ac1dfe38922ffb048ad07c428b5: Status 404 returned error can't find the container with id 1a39e6cc5263c920d676b199d4c7fb1fd0768ac1dfe38922ffb048ad07c428b5 Apr 23 16:37:57.925289 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.925196 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-qhxfp" event={"ID":"711d34ad-5d72-4eec-b9be-535d9896a2f6","Type":"ContainerStarted","Data":"fe451082cc6853cd18b6db5ff82b142e233bdffddb294e3cbd3ddb7a53c7aa1e"} Apr 23 16:37:57.925289 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.925242 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-qhxfp" event={"ID":"711d34ad-5d72-4eec-b9be-535d9896a2f6","Type":"ContainerStarted","Data":"baebc63cfcd04409bdd9bcd71b1a7e1ef61b48fa55683aa9255dfc6ecab0f06f"} Apr 23 16:37:57.925289 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.925256 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-qhxfp" event={"ID":"711d34ad-5d72-4eec-b9be-535d9896a2f6","Type":"ContainerStarted","Data":"261b87132ae906438be78fe08cb879a1d1e09eda6d16f666e22a1b29a82a1a83"} Apr 23 16:37:57.926299 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.926256 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-84956f9864-64zbb" event={"ID":"086aa7f2-f4e0-44d1-8d28-be8fba79787b","Type":"ContainerStarted","Data":"1a39e6cc5263c920d676b199d4c7fb1fd0768ac1dfe38922ffb048ad07c428b5"} Apr 23 16:37:57.927787 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.927757 2569 generic.go:358] "Generic (PLEG): container finished" podID="7b73fcba-01db-4fdf-b70e-16248a785061" containerID="508dcd74a5705488d5505c5b0bec95cbbc8badca60f6a6f17752590543af5fa0" exitCode=0 Apr 23 16:37:57.927887 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.927851 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-b7ln8" event={"ID":"7b73fcba-01db-4fdf-b70e-16248a785061","Type":"ContainerDied","Data":"508dcd74a5705488d5505c5b0bec95cbbc8badca60f6a6f17752590543af5fa0"} Apr 23 16:37:57.929613 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.929581 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jfzk8" event={"ID":"e2c88812-4055-43aa-8e5a-25b432f9041d","Type":"ContainerStarted","Data":"27d1725bd37bfb8809f9b95b9d4be2291c3856dbab89682b46586d68c7dce8fa"} Apr 23 16:37:57.929613 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.929610 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jfzk8" event={"ID":"e2c88812-4055-43aa-8e5a-25b432f9041d","Type":"ContainerStarted","Data":"d864e53fbe51a9501a6de7ebfc77b30e13f8106c6ea87440428d9083c688f088"} Apr 23 16:37:57.929838 ip-10-0-128-102 
kubenswrapper[2569]: I0423 16:37:57.929819 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-jfzk8" Apr 23 16:37:57.947427 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.947375 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-qhxfp" podStartSLOduration=1.231469795 podStartE2EDuration="2.947360196s" podCreationTimestamp="2026-04-23 16:37:55 +0000 UTC" firstStartedPulling="2026-04-23 16:37:55.578638784 +0000 UTC m=+160.915901726" lastFinishedPulling="2026-04-23 16:37:57.294529169 +0000 UTC m=+162.631792127" observedRunningTime="2026-04-23 16:37:57.946165553 +0000 UTC m=+163.283428544" watchObservedRunningTime="2026-04-23 16:37:57.947360196 +0000 UTC m=+163.284623161" Apr 23 16:37:57.986401 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:57.986342 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-jfzk8" podStartSLOduration=128.979864474 podStartE2EDuration="2m10.986327834s" podCreationTimestamp="2026-04-23 16:35:47 +0000 UTC" firstStartedPulling="2026-04-23 16:37:55.28693712 +0000 UTC m=+160.624200061" lastFinishedPulling="2026-04-23 16:37:57.29340048 +0000 UTC m=+162.630663421" observedRunningTime="2026-04-23 16:37:57.984935591 +0000 UTC m=+163.322198559" watchObservedRunningTime="2026-04-23 16:37:57.986327834 +0000 UTC m=+163.323590797" Apr 23 16:37:58.934956 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:58.934914 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-b7ln8" event={"ID":"7b73fcba-01db-4fdf-b70e-16248a785061","Type":"ContainerStarted","Data":"fe1fea5d220449c52109872bdca91888ac3ed40bc7f907f8bfa5cf0762195997"} Apr 23 16:37:58.934956 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:58.934958 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-b7ln8" event={"ID":"7b73fcba-01db-4fdf-b70e-16248a785061","Type":"ContainerStarted","Data":"96908e62d83ac2397d75b675e4424dda6f93c9c38e5d01ffa3c86fa86cbffd1b"} Apr 23 16:37:58.936886 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:58.936853 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6674d685f7-kkh7x" event={"ID":"c5ea87ff-a22d-4952-8409-ae584643f659","Type":"ContainerStarted","Data":"8ec20bf5affb0cce1182e436a0682900148e476b1aa7f5d2c72ea3b88b67ccee"} Apr 23 16:37:58.959306 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:58.959257 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-b7ln8" podStartSLOduration=2.133104037 podStartE2EDuration="3.959241511s" podCreationTimestamp="2026-04-23 16:37:55 +0000 UTC" firstStartedPulling="2026-04-23 16:37:55.465406553 +0000 UTC m=+160.802669494" lastFinishedPulling="2026-04-23 16:37:57.291544021 +0000 UTC m=+162.628806968" observedRunningTime="2026-04-23 16:37:58.957867437 +0000 UTC m=+164.295130436" watchObservedRunningTime="2026-04-23 16:37:58.959241511 +0000 UTC m=+164.296504475" Apr 23 16:37:58.982497 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:37:58.982453 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6674d685f7-kkh7x" podStartSLOduration=0.871214219 podStartE2EDuration="3.982435885s" podCreationTimestamp="2026-04-23 16:37:55 +0000 UTC" firstStartedPulling="2026-04-23 16:37:55.709926365 +0000 UTC m=+161.047189309" lastFinishedPulling="2026-04-23 
16:37:58.821148033 +0000 UTC m=+164.158410975" observedRunningTime="2026-04-23 16:37:58.981941126 +0000 UTC m=+164.319204113" watchObservedRunningTime="2026-04-23 16:37:58.982435885 +0000 UTC m=+164.319698848" Apr 23 16:38:00.379653 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:00.379629 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-558bdd47f8-ck8sf"] Apr 23 16:38:00.383260 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:00.383236 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-558bdd47f8-ck8sf" Apr 23 16:38:00.386291 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:00.386267 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 23 16:38:00.386383 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:00.386368 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 23 16:38:00.386781 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:00.386762 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 23 16:38:00.386968 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:00.386879 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 23 16:38:00.387081 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:00.387043 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 23 16:38:00.387206 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:00.387187 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-hxbpq\"" Apr 23 16:38:00.395100 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:00.395076 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 23 16:38:00.398071 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:00.398047 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-558bdd47f8-ck8sf"] Apr 23 16:38:00.506535 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:00.506410 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/120bc9ec-9a39-4aa8-862f-b8b2867ca401-federate-client-tls\") pod \"telemeter-client-558bdd47f8-ck8sf\" (UID: \"120bc9ec-9a39-4aa8-862f-b8b2867ca401\") " pod="openshift-monitoring/telemeter-client-558bdd47f8-ck8sf" Apr 23 16:38:00.506816 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:00.506528 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/120bc9ec-9a39-4aa8-862f-b8b2867ca401-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-558bdd47f8-ck8sf\" (UID: \"120bc9ec-9a39-4aa8-862f-b8b2867ca401\") " pod="openshift-monitoring/telemeter-client-558bdd47f8-ck8sf" Apr 23 16:38:00.506816 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:00.506578 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" 
(UniqueName: \"kubernetes.io/configmap/120bc9ec-9a39-4aa8-862f-b8b2867ca401-metrics-client-ca\") pod \"telemeter-client-558bdd47f8-ck8sf\" (UID: \"120bc9ec-9a39-4aa8-862f-b8b2867ca401\") " pod="openshift-monitoring/telemeter-client-558bdd47f8-ck8sf" Apr 23 16:38:00.506816 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:00.506642 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/120bc9ec-9a39-4aa8-862f-b8b2867ca401-secret-telemeter-client\") pod \"telemeter-client-558bdd47f8-ck8sf\" (UID: \"120bc9ec-9a39-4aa8-862f-b8b2867ca401\") " pod="openshift-monitoring/telemeter-client-558bdd47f8-ck8sf" Apr 23 16:38:00.506816 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:00.506743 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/120bc9ec-9a39-4aa8-862f-b8b2867ca401-telemeter-trusted-ca-bundle\") pod \"telemeter-client-558bdd47f8-ck8sf\" (UID: \"120bc9ec-9a39-4aa8-862f-b8b2867ca401\") " pod="openshift-monitoring/telemeter-client-558bdd47f8-ck8sf" Apr 23 16:38:00.506816 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:00.506774 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/120bc9ec-9a39-4aa8-862f-b8b2867ca401-serving-certs-ca-bundle\") pod \"telemeter-client-558bdd47f8-ck8sf\" (UID: \"120bc9ec-9a39-4aa8-862f-b8b2867ca401\") " pod="openshift-monitoring/telemeter-client-558bdd47f8-ck8sf" Apr 23 16:38:00.506816 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:00.506813 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8jwz\" (UniqueName: \"kubernetes.io/projected/120bc9ec-9a39-4aa8-862f-b8b2867ca401-kube-api-access-h8jwz\") pod \"telemeter-client-558bdd47f8-ck8sf\" (UID: \"120bc9ec-9a39-4aa8-862f-b8b2867ca401\") " pod="openshift-monitoring/telemeter-client-558bdd47f8-ck8sf" Apr 23 16:38:00.507126 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:00.506857 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/120bc9ec-9a39-4aa8-862f-b8b2867ca401-telemeter-client-tls\") pod \"telemeter-client-558bdd47f8-ck8sf\" (UID: \"120bc9ec-9a39-4aa8-862f-b8b2867ca401\") " pod="openshift-monitoring/telemeter-client-558bdd47f8-ck8sf" Apr 23 16:38:00.607379 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:00.607340 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/120bc9ec-9a39-4aa8-862f-b8b2867ca401-telemeter-trusted-ca-bundle\") pod \"telemeter-client-558bdd47f8-ck8sf\" (UID: \"120bc9ec-9a39-4aa8-862f-b8b2867ca401\") " pod="openshift-monitoring/telemeter-client-558bdd47f8-ck8sf" Apr 23 16:38:00.607531 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:00.607385 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/120bc9ec-9a39-4aa8-862f-b8b2867ca401-serving-certs-ca-bundle\") pod \"telemeter-client-558bdd47f8-ck8sf\" (UID: \"120bc9ec-9a39-4aa8-862f-b8b2867ca401\") " pod="openshift-monitoring/telemeter-client-558bdd47f8-ck8sf" Apr 23 16:38:00.607531 ip-10-0-128-102 kubenswrapper[2569]: I0423 
16:38:00.607429 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h8jwz\" (UniqueName: \"kubernetes.io/projected/120bc9ec-9a39-4aa8-862f-b8b2867ca401-kube-api-access-h8jwz\") pod \"telemeter-client-558bdd47f8-ck8sf\" (UID: \"120bc9ec-9a39-4aa8-862f-b8b2867ca401\") " pod="openshift-monitoring/telemeter-client-558bdd47f8-ck8sf" Apr 23 16:38:00.607531 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:00.607460 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/120bc9ec-9a39-4aa8-862f-b8b2867ca401-telemeter-client-tls\") pod \"telemeter-client-558bdd47f8-ck8sf\" (UID: \"120bc9ec-9a39-4aa8-862f-b8b2867ca401\") " pod="openshift-monitoring/telemeter-client-558bdd47f8-ck8sf" Apr 23 16:38:00.607531 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:00.607527 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/120bc9ec-9a39-4aa8-862f-b8b2867ca401-federate-client-tls\") pod \"telemeter-client-558bdd47f8-ck8sf\" (UID: \"120bc9ec-9a39-4aa8-862f-b8b2867ca401\") " pod="openshift-monitoring/telemeter-client-558bdd47f8-ck8sf" Apr 23 16:38:00.607785 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:00.607569 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/120bc9ec-9a39-4aa8-862f-b8b2867ca401-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-558bdd47f8-ck8sf\" (UID: \"120bc9ec-9a39-4aa8-862f-b8b2867ca401\") " pod="openshift-monitoring/telemeter-client-558bdd47f8-ck8sf" Apr 23 16:38:00.607785 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:00.607604 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/120bc9ec-9a39-4aa8-862f-b8b2867ca401-metrics-client-ca\") pod \"telemeter-client-558bdd47f8-ck8sf\" (UID: \"120bc9ec-9a39-4aa8-862f-b8b2867ca401\") " pod="openshift-monitoring/telemeter-client-558bdd47f8-ck8sf" Apr 23 16:38:00.607785 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:00.607642 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/120bc9ec-9a39-4aa8-862f-b8b2867ca401-secret-telemeter-client\") pod \"telemeter-client-558bdd47f8-ck8sf\" (UID: \"120bc9ec-9a39-4aa8-862f-b8b2867ca401\") " pod="openshift-monitoring/telemeter-client-558bdd47f8-ck8sf" Apr 23 16:38:00.608323 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:00.608260 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/120bc9ec-9a39-4aa8-862f-b8b2867ca401-telemeter-trusted-ca-bundle\") pod \"telemeter-client-558bdd47f8-ck8sf\" (UID: \"120bc9ec-9a39-4aa8-862f-b8b2867ca401\") " pod="openshift-monitoring/telemeter-client-558bdd47f8-ck8sf" Apr 23 16:38:00.608323 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:00.608316 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/120bc9ec-9a39-4aa8-862f-b8b2867ca401-serving-certs-ca-bundle\") pod \"telemeter-client-558bdd47f8-ck8sf\" (UID: \"120bc9ec-9a39-4aa8-862f-b8b2867ca401\") " pod="openshift-monitoring/telemeter-client-558bdd47f8-ck8sf" Apr 23 16:38:00.608509 ip-10-0-128-102 
kubenswrapper[2569]: I0423 16:38:00.608398 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/120bc9ec-9a39-4aa8-862f-b8b2867ca401-metrics-client-ca\") pod \"telemeter-client-558bdd47f8-ck8sf\" (UID: \"120bc9ec-9a39-4aa8-862f-b8b2867ca401\") " pod="openshift-monitoring/telemeter-client-558bdd47f8-ck8sf" Apr 23 16:38:00.609979 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:00.609948 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/120bc9ec-9a39-4aa8-862f-b8b2867ca401-telemeter-client-tls\") pod \"telemeter-client-558bdd47f8-ck8sf\" (UID: \"120bc9ec-9a39-4aa8-862f-b8b2867ca401\") " pod="openshift-monitoring/telemeter-client-558bdd47f8-ck8sf" Apr 23 16:38:00.610076 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:00.610038 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/120bc9ec-9a39-4aa8-862f-b8b2867ca401-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-558bdd47f8-ck8sf\" (UID: \"120bc9ec-9a39-4aa8-862f-b8b2867ca401\") " pod="openshift-monitoring/telemeter-client-558bdd47f8-ck8sf" Apr 23 16:38:00.610397 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:00.610376 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/120bc9ec-9a39-4aa8-862f-b8b2867ca401-secret-telemeter-client\") pod \"telemeter-client-558bdd47f8-ck8sf\" (UID: \"120bc9ec-9a39-4aa8-862f-b8b2867ca401\") " pod="openshift-monitoring/telemeter-client-558bdd47f8-ck8sf" Apr 23 16:38:00.610432 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:00.610400 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/120bc9ec-9a39-4aa8-862f-b8b2867ca401-federate-client-tls\") pod \"telemeter-client-558bdd47f8-ck8sf\" (UID: \"120bc9ec-9a39-4aa8-862f-b8b2867ca401\") " pod="openshift-monitoring/telemeter-client-558bdd47f8-ck8sf" Apr 23 16:38:00.615551 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:00.615528 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8jwz\" (UniqueName: \"kubernetes.io/projected/120bc9ec-9a39-4aa8-862f-b8b2867ca401-kube-api-access-h8jwz\") pod \"telemeter-client-558bdd47f8-ck8sf\" (UID: \"120bc9ec-9a39-4aa8-862f-b8b2867ca401\") " pod="openshift-monitoring/telemeter-client-558bdd47f8-ck8sf" Apr 23 16:38:00.709075 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:00.709042 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-558bdd47f8-ck8sf" Apr 23 16:38:00.840248 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:00.840204 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-558bdd47f8-ck8sf"] Apr 23 16:38:00.842203 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:38:00.842178 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod120bc9ec_9a39_4aa8_862f_b8b2867ca401.slice/crio-2c65a855256a542c5ae84894bcbd843322cc3d9c20f2874613911ea057826f87 WatchSource:0}: Error finding container 2c65a855256a542c5ae84894bcbd843322cc3d9c20f2874613911ea057826f87: Status 404 returned error can't find the container with id 2c65a855256a542c5ae84894bcbd843322cc3d9c20f2874613911ea057826f87 Apr 23 16:38:00.944521 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:00.944483 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-84956f9864-64zbb" event={"ID":"086aa7f2-f4e0-44d1-8d28-be8fba79787b","Type":"ContainerStarted","Data":"698fc51a8e6579121bff1d54cdbfde953ea1bd91ec190d8eb3d8a9ca47407651"} Apr 23 16:38:00.944521 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:00.944519 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-84956f9864-64zbb" event={"ID":"086aa7f2-f4e0-44d1-8d28-be8fba79787b","Type":"ContainerStarted","Data":"77388af70c7e94d6703db2bae6066487482dde740af02a3aa742c6e639947c44"} Apr 23 16:38:00.944726 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:00.944529 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-84956f9864-64zbb" event={"ID":"086aa7f2-f4e0-44d1-8d28-be8fba79787b","Type":"ContainerStarted","Data":"817b418cd13e2be00c55aae3955486e158c8dc6d244396e7d0ef267215ed6e15"} Apr 23 16:38:00.945411 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:00.945389 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-558bdd47f8-ck8sf" event={"ID":"120bc9ec-9a39-4aa8-862f-b8b2867ca401","Type":"ContainerStarted","Data":"2c65a855256a542c5ae84894bcbd843322cc3d9c20f2874613911ea057826f87"} Apr 23 16:38:01.230216 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.230183 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-56dbdcb664-wth5t"] Apr 23 16:38:01.234155 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.234083 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56dbdcb664-wth5t" Apr 23 16:38:01.245447 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.245423 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56dbdcb664-wth5t"] Apr 23 16:38:01.299579 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.299551 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bsgxs" Apr 23 16:38:01.302698 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.302566 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-s982g\"" Apr 23 16:38:01.310564 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.310541 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bsgxs" Apr 23 16:38:01.314352 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.314324 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rczvw\" (UniqueName: \"kubernetes.io/projected/cddb0dbb-053c-4682-bdea-dbe11df4a62e-kube-api-access-rczvw\") pod \"console-56dbdcb664-wth5t\" (UID: \"cddb0dbb-053c-4682-bdea-dbe11df4a62e\") " pod="openshift-console/console-56dbdcb664-wth5t" Apr 23 16:38:01.314467 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.314365 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cddb0dbb-053c-4682-bdea-dbe11df4a62e-console-oauth-config\") pod \"console-56dbdcb664-wth5t\" (UID: \"cddb0dbb-053c-4682-bdea-dbe11df4a62e\") " pod="openshift-console/console-56dbdcb664-wth5t" Apr 23 16:38:01.314467 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.314415 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cddb0dbb-053c-4682-bdea-dbe11df4a62e-console-serving-cert\") pod \"console-56dbdcb664-wth5t\" (UID: \"cddb0dbb-053c-4682-bdea-dbe11df4a62e\") " pod="openshift-console/console-56dbdcb664-wth5t" Apr 23 16:38:01.314595 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.314473 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cddb0dbb-053c-4682-bdea-dbe11df4a62e-service-ca\") pod \"console-56dbdcb664-wth5t\" (UID: \"cddb0dbb-053c-4682-bdea-dbe11df4a62e\") " pod="openshift-console/console-56dbdcb664-wth5t" Apr 23 16:38:01.314595 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.314497 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cddb0dbb-053c-4682-bdea-dbe11df4a62e-console-config\") pod \"console-56dbdcb664-wth5t\" (UID: \"cddb0dbb-053c-4682-bdea-dbe11df4a62e\") " pod="openshift-console/console-56dbdcb664-wth5t" Apr 23 16:38:01.314595 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.314542 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cddb0dbb-053c-4682-bdea-dbe11df4a62e-oauth-serving-cert\") pod \"console-56dbdcb664-wth5t\" (UID: \"cddb0dbb-053c-4682-bdea-dbe11df4a62e\") " pod="openshift-console/console-56dbdcb664-wth5t" Apr 23 16:38:01.314595 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.314565 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cddb0dbb-053c-4682-bdea-dbe11df4a62e-trusted-ca-bundle\") pod \"console-56dbdcb664-wth5t\" (UID: \"cddb0dbb-053c-4682-bdea-dbe11df4a62e\") " pod="openshift-console/console-56dbdcb664-wth5t" Apr 23 16:38:01.416276 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.415474 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cddb0dbb-053c-4682-bdea-dbe11df4a62e-oauth-serving-cert\") pod \"console-56dbdcb664-wth5t\" (UID: \"cddb0dbb-053c-4682-bdea-dbe11df4a62e\") " pod="openshift-console/console-56dbdcb664-wth5t" Apr 23 16:38:01.416276 
ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.415535 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cddb0dbb-053c-4682-bdea-dbe11df4a62e-trusted-ca-bundle\") pod \"console-56dbdcb664-wth5t\" (UID: \"cddb0dbb-053c-4682-bdea-dbe11df4a62e\") " pod="openshift-console/console-56dbdcb664-wth5t" Apr 23 16:38:01.416276 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.415567 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rczvw\" (UniqueName: \"kubernetes.io/projected/cddb0dbb-053c-4682-bdea-dbe11df4a62e-kube-api-access-rczvw\") pod \"console-56dbdcb664-wth5t\" (UID: \"cddb0dbb-053c-4682-bdea-dbe11df4a62e\") " pod="openshift-console/console-56dbdcb664-wth5t" Apr 23 16:38:01.416276 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.415617 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cddb0dbb-053c-4682-bdea-dbe11df4a62e-console-oauth-config\") pod \"console-56dbdcb664-wth5t\" (UID: \"cddb0dbb-053c-4682-bdea-dbe11df4a62e\") " pod="openshift-console/console-56dbdcb664-wth5t" Apr 23 16:38:01.416276 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.415700 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cddb0dbb-053c-4682-bdea-dbe11df4a62e-console-serving-cert\") pod \"console-56dbdcb664-wth5t\" (UID: \"cddb0dbb-053c-4682-bdea-dbe11df4a62e\") " pod="openshift-console/console-56dbdcb664-wth5t" Apr 23 16:38:01.416276 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.415739 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cddb0dbb-053c-4682-bdea-dbe11df4a62e-service-ca\") pod \"console-56dbdcb664-wth5t\" (UID: \"cddb0dbb-053c-4682-bdea-dbe11df4a62e\") " pod="openshift-console/console-56dbdcb664-wth5t" Apr 23 16:38:01.416276 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.415777 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cddb0dbb-053c-4682-bdea-dbe11df4a62e-console-config\") pod \"console-56dbdcb664-wth5t\" (UID: \"cddb0dbb-053c-4682-bdea-dbe11df4a62e\") " pod="openshift-console/console-56dbdcb664-wth5t" Apr 23 16:38:01.417830 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.416946 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cddb0dbb-053c-4682-bdea-dbe11df4a62e-oauth-serving-cert\") pod \"console-56dbdcb664-wth5t\" (UID: \"cddb0dbb-053c-4682-bdea-dbe11df4a62e\") " pod="openshift-console/console-56dbdcb664-wth5t" Apr 23 16:38:01.417830 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.417199 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cddb0dbb-053c-4682-bdea-dbe11df4a62e-console-config\") pod \"console-56dbdcb664-wth5t\" (UID: \"cddb0dbb-053c-4682-bdea-dbe11df4a62e\") " pod="openshift-console/console-56dbdcb664-wth5t" Apr 23 16:38:01.417830 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.417522 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cddb0dbb-053c-4682-bdea-dbe11df4a62e-trusted-ca-bundle\") pod 
\"console-56dbdcb664-wth5t\" (UID: \"cddb0dbb-053c-4682-bdea-dbe11df4a62e\") " pod="openshift-console/console-56dbdcb664-wth5t" Apr 23 16:38:01.417830 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.417804 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cddb0dbb-053c-4682-bdea-dbe11df4a62e-service-ca\") pod \"console-56dbdcb664-wth5t\" (UID: \"cddb0dbb-053c-4682-bdea-dbe11df4a62e\") " pod="openshift-console/console-56dbdcb664-wth5t" Apr 23 16:38:01.420480 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.420434 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cddb0dbb-053c-4682-bdea-dbe11df4a62e-console-serving-cert\") pod \"console-56dbdcb664-wth5t\" (UID: \"cddb0dbb-053c-4682-bdea-dbe11df4a62e\") " pod="openshift-console/console-56dbdcb664-wth5t" Apr 23 16:38:01.420639 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.420590 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cddb0dbb-053c-4682-bdea-dbe11df4a62e-console-oauth-config\") pod \"console-56dbdcb664-wth5t\" (UID: \"cddb0dbb-053c-4682-bdea-dbe11df4a62e\") " pod="openshift-console/console-56dbdcb664-wth5t" Apr 23 16:38:01.436580 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.436550 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczvw\" (UniqueName: \"kubernetes.io/projected/cddb0dbb-053c-4682-bdea-dbe11df4a62e-kube-api-access-rczvw\") pod \"console-56dbdcb664-wth5t\" (UID: \"cddb0dbb-053c-4682-bdea-dbe11df4a62e\") " pod="openshift-console/console-56dbdcb664-wth5t" Apr 23 16:38:01.452619 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.452517 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 16:38:01.457895 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.457872 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.466619 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.466504 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 23 16:38:01.466719 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.466684 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 23 16:38:01.466719 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.466700 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 23 16:38:01.466833 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.466795 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 23 16:38:01.467749 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.467454 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 23 16:38:01.467749 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.467511 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 23 16:38:01.467749 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.467568 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 23 16:38:01.467952 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.467789 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 23 16:38:01.467952 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.467840 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-d1jk41q0nm03v\"" Apr 23 16:38:01.467952 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.467872 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 23 16:38:01.467952 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.467897 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 23 16:38:01.468245 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.468230 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-b4cl2\"" Apr 23 16:38:01.470544 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.470493 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 23 16:38:01.485604 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.485584 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 23 16:38:01.488173 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.488151 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 23 16:38:01.494808 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.494784 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bsgxs"] Apr 23 
16:38:01.497924 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:38:01.497895 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bfd1dfe_900e_4260_b0fc_9dc05d2c604c.slice/crio-49b1548466966b9b6a39cade39d5f0fd44ad67fcb4c830a5b57d0736f3aff7fe WatchSource:0}: Error finding container 49b1548466966b9b6a39cade39d5f0fd44ad67fcb4c830a5b57d0736f3aff7fe: Status 404 returned error can't find the container with id 49b1548466966b9b6a39cade39d5f0fd44ad67fcb4c830a5b57d0736f3aff7fe Apr 23 16:38:01.516571 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.516543 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ad39d609-b577-48de-b485-a28556d6f1a1-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.516645 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.516593 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad39d609-b577-48de-b485-a28556d6f1a1-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.516695 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.516651 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-web-config\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.516746 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.516731 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcf9r\" (UniqueName: \"kubernetes.io/projected/ad39d609-b577-48de-b485-a28556d6f1a1-kube-api-access-wcf9r\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.516784 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.516761 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.516784 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.516779 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ad39d609-b577-48de-b485-a28556d6f1a1-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.516842 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.516801 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-config\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.516842 ip-10-0-128-102 kubenswrapper[2569]: I0423 
16:38:01.516826 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ad39d609-b577-48de-b485-a28556d6f1a1-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.516897 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.516878 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.516930 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.516902 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ad39d609-b577-48de-b485-a28556d6f1a1-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.516960 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.516936 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad39d609-b577-48de-b485-a28556d6f1a1-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.516991 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.516971 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ad39d609-b577-48de-b485-a28556d6f1a1-config-out\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.517032 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.516994 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.517032 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.517022 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.517091 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.517046 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad39d609-b577-48de-b485-a28556d6f1a1-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.517132 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.517115 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.517182 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.517166 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.517228 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.517214 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.535956 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.535900 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 16:38:01.547424 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.547402 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56dbdcb664-wth5t" Apr 23 16:38:01.619137 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.617689 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ad39d609-b577-48de-b485-a28556d6f1a1-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.619137 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.617747 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad39d609-b577-48de-b485-a28556d6f1a1-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.619137 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.617783 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ad39d609-b577-48de-b485-a28556d6f1a1-config-out\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.619137 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.617809 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.619137 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.617838 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-secret-metrics-client-certs\") pod 
\"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.619137 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.617863 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad39d609-b577-48de-b485-a28556d6f1a1-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.619137 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.617929 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.619137 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.617976 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.619137 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.618019 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.619137 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.618054 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ad39d609-b577-48de-b485-a28556d6f1a1-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.619137 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.618091 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad39d609-b577-48de-b485-a28556d6f1a1-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.619137 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.618128 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-web-config\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.619137 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.618186 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wcf9r\" (UniqueName: \"kubernetes.io/projected/ad39d609-b577-48de-b485-a28556d6f1a1-kube-api-access-wcf9r\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.619137 ip-10-0-128-102 kubenswrapper[2569]: I0423 
16:38:01.618222 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.619137 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.618249 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ad39d609-b577-48de-b485-a28556d6f1a1-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.619137 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.618280 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-config\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.619137 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.618304 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ad39d609-b577-48de-b485-a28556d6f1a1-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.620086 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.618344 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.620086 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.619172 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ad39d609-b577-48de-b485-a28556d6f1a1-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.620239 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.620214 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ad39d609-b577-48de-b485-a28556d6f1a1-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.621189 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.620936 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad39d609-b577-48de-b485-a28556d6f1a1-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.621278 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.621184 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad39d609-b577-48de-b485-a28556d6f1a1-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.621985 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.621732 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad39d609-b577-48de-b485-a28556d6f1a1-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.626704 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.625505 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.626704 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.625765 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ad39d609-b577-48de-b485-a28556d6f1a1-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.629555 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.628394 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ad39d609-b577-48de-b485-a28556d6f1a1-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.629555 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.628632 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.629555 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.628817 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.629555 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.628843 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.629555 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.629128 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.629555 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.629308 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-web-config\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.629555 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.629520 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ad39d609-b577-48de-b485-a28556d6f1a1-config-out\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.630432 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.630376 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.630848 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.630826 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-config\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.631726 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.631684 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcf9r\" (UniqueName: \"kubernetes.io/projected/ad39d609-b577-48de-b485-a28556d6f1a1-kube-api-access-wcf9r\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.632454 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.632422 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.706218 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.706180 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56dbdcb664-wth5t"] Apr 23 16:38:01.714475 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:38:01.714433 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcddb0dbb_053c_4682_bdea_dbe11df4a62e.slice/crio-7633d8932a8ce2f9675fbe0ebe53c5c7602ea70fd5f5288cc231fb23450adf67 WatchSource:0}: Error finding container 7633d8932a8ce2f9675fbe0ebe53c5c7602ea70fd5f5288cc231fb23450adf67: Status 404 returned error can't find the container with id 7633d8932a8ce2f9675fbe0ebe53c5c7602ea70fd5f5288cc231fb23450adf67 Apr 23 16:38:01.769936 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.769821 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:01.951087 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.950847 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56dbdcb664-wth5t" event={"ID":"cddb0dbb-053c-4682-bdea-dbe11df4a62e","Type":"ContainerStarted","Data":"83f34f114c87b79654a0f78e46399b8cc62286d6d903c68e251af04fa5785af0"} Apr 23 16:38:01.951087 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.950890 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56dbdcb664-wth5t" event={"ID":"cddb0dbb-053c-4682-bdea-dbe11df4a62e","Type":"ContainerStarted","Data":"7633d8932a8ce2f9675fbe0ebe53c5c7602ea70fd5f5288cc231fb23450adf67"} Apr 23 16:38:01.954966 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.954885 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-84956f9864-64zbb" event={"ID":"086aa7f2-f4e0-44d1-8d28-be8fba79787b","Type":"ContainerStarted","Data":"4470bdf0b401917cac7c1f9a4906f151e5fcaf93166c06b53efe672d893d05d7"} Apr 23 16:38:01.954966 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.954927 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-84956f9864-64zbb" event={"ID":"086aa7f2-f4e0-44d1-8d28-be8fba79787b","Type":"ContainerStarted","Data":"a11f00c2a8c5839c27d1265e3c714e2729104e545abf7dc78dbea7c4d5d71f11"} Apr 23 16:38:01.954966 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.954944 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-84956f9864-64zbb" event={"ID":"086aa7f2-f4e0-44d1-8d28-be8fba79787b","Type":"ContainerStarted","Data":"dec187addac9dae4b526b0b6949947f0716290a723a4246369c71d3c5fa9e083"} Apr 23 16:38:01.955234 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.955164 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-84956f9864-64zbb" Apr 23 16:38:01.956992 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.956966 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bsgxs" event={"ID":"3bfd1dfe-900e-4260-b0fc-9dc05d2c604c","Type":"ContainerStarted","Data":"49b1548466966b9b6a39cade39d5f0fd44ad67fcb4c830a5b57d0736f3aff7fe"} Apr 23 16:38:01.985237 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.985190 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 16:38:01.986951 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:01.986455 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-56dbdcb664-wth5t" podStartSLOduration=0.986439409 podStartE2EDuration="986.439409ms" podCreationTimestamp="2026-04-23 16:38:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:38:01.983136883 +0000 UTC m=+167.320399848" watchObservedRunningTime="2026-04-23 16:38:01.986439409 +0000 UTC m=+167.323702376" Apr 23 16:38:01.986951 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:38:01.986928 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad39d609_b577_48de_b485_a28556d6f1a1.slice/crio-6fa9351bdc924b9d8d2a4e7cd9dc5028ed43203c0d042e344c7421c133210ccd WatchSource:0}: Error finding container 
6fa9351bdc924b9d8d2a4e7cd9dc5028ed43203c0d042e344c7421c133210ccd: Status 404 returned error can't find the container with id 6fa9351bdc924b9d8d2a4e7cd9dc5028ed43203c0d042e344c7421c133210ccd Apr 23 16:38:02.010648 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:02.010527 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-84956f9864-64zbb" podStartSLOduration=1.294494215 podStartE2EDuration="5.010511578s" podCreationTimestamp="2026-04-23 16:37:57 +0000 UTC" firstStartedPulling="2026-04-23 16:37:57.645472059 +0000 UTC m=+162.982735004" lastFinishedPulling="2026-04-23 16:38:01.361489405 +0000 UTC m=+166.698752367" observedRunningTime="2026-04-23 16:38:02.008176643 +0000 UTC m=+167.345439629" watchObservedRunningTime="2026-04-23 16:38:02.010511578 +0000 UTC m=+167.347774546" Apr 23 16:38:02.962930 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:02.962883 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ad39d609-b577-48de-b485-a28556d6f1a1","Type":"ContainerStarted","Data":"6fa9351bdc924b9d8d2a4e7cd9dc5028ed43203c0d042e344c7421c133210ccd"} Apr 23 16:38:03.968361 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:03.968295 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-558bdd47f8-ck8sf" event={"ID":"120bc9ec-9a39-4aa8-862f-b8b2867ca401","Type":"ContainerStarted","Data":"0f2bb21e843d5cadef4c7069b1e0eb89219a73ea773a2edaf51efb7e6fb56cfc"} Apr 23 16:38:03.970151 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:03.970119 2569 generic.go:358] "Generic (PLEG): container finished" podID="ad39d609-b577-48de-b485-a28556d6f1a1" containerID="185faac1df253a6e5a74de092e85abfde931063b4b56507d27764ec425a2124b" exitCode=0 Apr 23 16:38:03.970262 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:03.970158 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ad39d609-b577-48de-b485-a28556d6f1a1","Type":"ContainerDied","Data":"185faac1df253a6e5a74de092e85abfde931063b4b56507d27764ec425a2124b"} Apr 23 16:38:03.971700 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:03.971674 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bsgxs" event={"ID":"3bfd1dfe-900e-4260-b0fc-9dc05d2c604c","Type":"ContainerStarted","Data":"c35470115b220f645365260f49742031313073fce1258e5e18d58b129ed51cbf"} Apr 23 16:38:04.035122 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:04.034012 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-bsgxs" podStartSLOduration=134.7625966 podStartE2EDuration="2m17.033989069s" podCreationTimestamp="2026-04-23 16:35:47 +0000 UTC" firstStartedPulling="2026-04-23 16:38:01.500016869 +0000 UTC m=+166.837279815" lastFinishedPulling="2026-04-23 16:38:03.771409336 +0000 UTC m=+169.108672284" observedRunningTime="2026-04-23 16:38:04.03220262 +0000 UTC m=+169.369465585" watchObservedRunningTime="2026-04-23 16:38:04.033989069 +0000 UTC m=+169.371252033" Apr 23 16:38:04.300215 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:04.300132 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7n97" Apr 23 16:38:04.982950 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:04.982906 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-558bdd47f8-ck8sf" event={"ID":"120bc9ec-9a39-4aa8-862f-b8b2867ca401","Type":"ContainerStarted","Data":"a24d8bdaf7ca57b5143a706d1d1e4c5d563e38eb19b30b73b845865442d261f8"} Apr 23 16:38:04.982950 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:04.982952 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-558bdd47f8-ck8sf" event={"ID":"120bc9ec-9a39-4aa8-862f-b8b2867ca401","Type":"ContainerStarted","Data":"7f9ba3414de1407d2a527bf044a4c6e833b467af2eaaa41f2eb8642507eeabd6"} Apr 23 16:38:05.005575 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:05.005523 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-558bdd47f8-ck8sf" podStartSLOduration=2.0798128240000002 podStartE2EDuration="5.005504859s" podCreationTimestamp="2026-04-23 16:38:00 +0000 UTC" firstStartedPulling="2026-04-23 16:38:00.844017145 +0000 UTC m=+166.181280101" lastFinishedPulling="2026-04-23 16:38:03.769709179 +0000 UTC m=+169.106972136" observedRunningTime="2026-04-23 16:38:05.003818536 +0000 UTC m=+170.341081500" watchObservedRunningTime="2026-04-23 16:38:05.005504859 +0000 UTC m=+170.342767843" Apr 23 16:38:05.582079 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:05.581497 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6674d685f7-kkh7x" Apr 23 16:38:05.582079 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:05.581540 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6674d685f7-kkh7x" Apr 23 16:38:05.587783 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:05.587752 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6674d685f7-kkh7x" Apr 23 16:38:05.641786 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:05.639791 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6674d685f7-kkh7x"] Apr 23 16:38:05.990673 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:05.990633 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6674d685f7-kkh7x" Apr 23 16:38:07.939489 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:07.939460 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-jfzk8" Apr 23 16:38:07.969482 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:07.969448 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-84956f9864-64zbb" Apr 23 16:38:07.997367 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:07.997321 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ad39d609-b577-48de-b485-a28556d6f1a1","Type":"ContainerStarted","Data":"4c29b67348cbb86fa1d969236980dadbdbd1b00fb44ecfcb6ac6ed1533fcb73b"} Apr 23 16:38:07.997367 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:07.997368 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ad39d609-b577-48de-b485-a28556d6f1a1","Type":"ContainerStarted","Data":"8ad48c584b2e15f7b64e57c87b4665a3807f021c680932e118861919623029e3"} Apr 23 16:38:07.997590 
ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:07.997381 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ad39d609-b577-48de-b485-a28556d6f1a1","Type":"ContainerStarted","Data":"8b6f78c6930bce920e2204b25adbb35a3cf9847f528533b19d7b02ffebf8b891"} Apr 23 16:38:07.997590 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:07.997395 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ad39d609-b577-48de-b485-a28556d6f1a1","Type":"ContainerStarted","Data":"25d4812ab1484ebbcc32f2c601f3a0248754354728f191a3e5103cff55d956bb"} Apr 23 16:38:07.997590 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:07.997409 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ad39d609-b577-48de-b485-a28556d6f1a1","Type":"ContainerStarted","Data":"93bfa28b8e0fa7c089f896274532039b6eb07a597d8b251c368f70aaadc161f2"} Apr 23 16:38:07.997590 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:07.997422 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ad39d609-b577-48de-b485-a28556d6f1a1","Type":"ContainerStarted","Data":"cccf5db571ae535719e3c5b597ee0ec050c1fe97ec686e3e3c77319f8747430c"} Apr 23 16:38:08.031384 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:08.031208 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.032940959 podStartE2EDuration="7.031187926s" podCreationTimestamp="2026-04-23 16:38:01 +0000 UTC" firstStartedPulling="2026-04-23 16:38:01.98965596 +0000 UTC m=+167.326918908" lastFinishedPulling="2026-04-23 16:38:06.987902916 +0000 UTC m=+172.325165875" observedRunningTime="2026-04-23 16:38:08.030095369 +0000 UTC m=+173.367358333" watchObservedRunningTime="2026-04-23 16:38:08.031187926 +0000 UTC m=+173.368450891" Apr 23 16:38:11.548238 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:11.548199 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-56dbdcb664-wth5t" Apr 23 16:38:11.548238 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:11.548244 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-56dbdcb664-wth5t" Apr 23 16:38:11.553077 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:11.553054 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-56dbdcb664-wth5t" Apr 23 16:38:11.770211 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:11.770172 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:12.013362 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:12.013334 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-56dbdcb664-wth5t" Apr 23 16:38:16.288726 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:16.288691 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-56dbdcb664-wth5t"] Apr 23 16:38:27.058685 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:27.058583 2569 generic.go:358] "Generic (PLEG): container finished" podID="5e7b64cf-d8ab-48a3-86f5-9ea5db912782" containerID="1decf4b96c31c1bbe9fa25e2b9762d81352d673cb1fd5d5f32e30c43b835856b" exitCode=0 Apr 23 16:38:27.058685 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:27.058654 2569 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-hbwp6" event={"ID":"5e7b64cf-d8ab-48a3-86f5-9ea5db912782","Type":"ContainerDied","Data":"1decf4b96c31c1bbe9fa25e2b9762d81352d673cb1fd5d5f32e30c43b835856b"} Apr 23 16:38:27.059116 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:27.058978 2569 scope.go:117] "RemoveContainer" containerID="1decf4b96c31c1bbe9fa25e2b9762d81352d673cb1fd5d5f32e30c43b835856b" Apr 23 16:38:28.063074 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:28.063042 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-hbwp6" event={"ID":"5e7b64cf-d8ab-48a3-86f5-9ea5db912782","Type":"ContainerStarted","Data":"ae1697cc33ae5bd2e0a956c0c0c3daeb5b0dc86fc8ec775591591fb3315a962d"} Apr 23 16:38:32.009150 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:32.009104 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6674d685f7-kkh7x" podUID="c5ea87ff-a22d-4952-8409-ae584643f659" containerName="console" containerID="cri-o://8ec20bf5affb0cce1182e436a0682900148e476b1aa7f5d2c72ea3b88b67ccee" gracePeriod=15 Apr 23 16:38:32.245127 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:32.245104 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6674d685f7-kkh7x_c5ea87ff-a22d-4952-8409-ae584643f659/console/0.log" Apr 23 16:38:32.245238 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:32.245176 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6674d685f7-kkh7x" Apr 23 16:38:32.304836 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:32.304753 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c5ea87ff-a22d-4952-8409-ae584643f659-service-ca\") pod \"c5ea87ff-a22d-4952-8409-ae584643f659\" (UID: \"c5ea87ff-a22d-4952-8409-ae584643f659\") " Apr 23 16:38:32.304836 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:32.304813 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c5ea87ff-a22d-4952-8409-ae584643f659-console-config\") pod \"c5ea87ff-a22d-4952-8409-ae584643f659\" (UID: \"c5ea87ff-a22d-4952-8409-ae584643f659\") " Apr 23 16:38:32.305056 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:32.304866 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5ea87ff-a22d-4952-8409-ae584643f659-console-serving-cert\") pod \"c5ea87ff-a22d-4952-8409-ae584643f659\" (UID: \"c5ea87ff-a22d-4952-8409-ae584643f659\") " Apr 23 16:38:32.305056 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:32.304890 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlgg2\" (UniqueName: \"kubernetes.io/projected/c5ea87ff-a22d-4952-8409-ae584643f659-kube-api-access-mlgg2\") pod \"c5ea87ff-a22d-4952-8409-ae584643f659\" (UID: \"c5ea87ff-a22d-4952-8409-ae584643f659\") " Apr 23 16:38:32.305056 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:32.304947 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c5ea87ff-a22d-4952-8409-ae584643f659-console-oauth-config\") pod \"c5ea87ff-a22d-4952-8409-ae584643f659\" (UID: \"c5ea87ff-a22d-4952-8409-ae584643f659\") " Apr 23 16:38:32.305056 ip-10-0-128-102 
kubenswrapper[2569]: I0423 16:38:32.304979 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5ea87ff-a22d-4952-8409-ae584643f659-trusted-ca-bundle\") pod \"c5ea87ff-a22d-4952-8409-ae584643f659\" (UID: \"c5ea87ff-a22d-4952-8409-ae584643f659\") " Apr 23 16:38:32.305056 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:32.305006 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c5ea87ff-a22d-4952-8409-ae584643f659-oauth-serving-cert\") pod \"c5ea87ff-a22d-4952-8409-ae584643f659\" (UID: \"c5ea87ff-a22d-4952-8409-ae584643f659\") " Apr 23 16:38:32.305334 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:32.305307 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5ea87ff-a22d-4952-8409-ae584643f659-console-config" (OuterVolumeSpecName: "console-config") pod "c5ea87ff-a22d-4952-8409-ae584643f659" (UID: "c5ea87ff-a22d-4952-8409-ae584643f659"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:38:32.305438 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:32.305419 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5ea87ff-a22d-4952-8409-ae584643f659-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c5ea87ff-a22d-4952-8409-ae584643f659" (UID: "c5ea87ff-a22d-4952-8409-ae584643f659"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:38:32.305489 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:32.305407 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5ea87ff-a22d-4952-8409-ae584643f659-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c5ea87ff-a22d-4952-8409-ae584643f659" (UID: "c5ea87ff-a22d-4952-8409-ae584643f659"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:38:32.305543 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:32.305503 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5ea87ff-a22d-4952-8409-ae584643f659-service-ca" (OuterVolumeSpecName: "service-ca") pod "c5ea87ff-a22d-4952-8409-ae584643f659" (UID: "c5ea87ff-a22d-4952-8409-ae584643f659"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:38:32.307245 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:32.307214 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5ea87ff-a22d-4952-8409-ae584643f659-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c5ea87ff-a22d-4952-8409-ae584643f659" (UID: "c5ea87ff-a22d-4952-8409-ae584643f659"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:38:32.307245 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:32.307231 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5ea87ff-a22d-4952-8409-ae584643f659-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c5ea87ff-a22d-4952-8409-ae584643f659" (UID: "c5ea87ff-a22d-4952-8409-ae584643f659"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:38:32.307417 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:32.307344 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5ea87ff-a22d-4952-8409-ae584643f659-kube-api-access-mlgg2" (OuterVolumeSpecName: "kube-api-access-mlgg2") pod "c5ea87ff-a22d-4952-8409-ae584643f659" (UID: "c5ea87ff-a22d-4952-8409-ae584643f659"). InnerVolumeSpecName "kube-api-access-mlgg2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:38:32.405682 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:32.405630 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c5ea87ff-a22d-4952-8409-ae584643f659-console-config\") on node \"ip-10-0-128-102.ec2.internal\" DevicePath \"\"" Apr 23 16:38:32.405682 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:32.405678 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5ea87ff-a22d-4952-8409-ae584643f659-console-serving-cert\") on node \"ip-10-0-128-102.ec2.internal\" DevicePath \"\"" Apr 23 16:38:32.405682 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:32.405692 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mlgg2\" (UniqueName: \"kubernetes.io/projected/c5ea87ff-a22d-4952-8409-ae584643f659-kube-api-access-mlgg2\") on node \"ip-10-0-128-102.ec2.internal\" DevicePath \"\"" Apr 23 16:38:32.405909 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:32.405702 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c5ea87ff-a22d-4952-8409-ae584643f659-console-oauth-config\") on node \"ip-10-0-128-102.ec2.internal\" DevicePath \"\"" Apr 23 16:38:32.405909 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:32.405711 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5ea87ff-a22d-4952-8409-ae584643f659-trusted-ca-bundle\") on node \"ip-10-0-128-102.ec2.internal\" DevicePath \"\"" Apr 23 16:38:32.405909 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:32.405721 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c5ea87ff-a22d-4952-8409-ae584643f659-oauth-serving-cert\") on node \"ip-10-0-128-102.ec2.internal\" DevicePath \"\"" Apr 23 16:38:32.405909 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:32.405730 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c5ea87ff-a22d-4952-8409-ae584643f659-service-ca\") on node \"ip-10-0-128-102.ec2.internal\" DevicePath \"\"" Apr 23 16:38:33.083753 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:33.083721 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6674d685f7-kkh7x_c5ea87ff-a22d-4952-8409-ae584643f659/console/0.log" Apr 23 16:38:33.084199 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:33.083764 2569 generic.go:358] "Generic (PLEG): container finished" podID="c5ea87ff-a22d-4952-8409-ae584643f659" containerID="8ec20bf5affb0cce1182e436a0682900148e476b1aa7f5d2c72ea3b88b67ccee" exitCode=2 Apr 23 16:38:33.084199 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:33.083846 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6674d685f7-kkh7x" Apr 23 16:38:33.084199 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:33.083844 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6674d685f7-kkh7x" event={"ID":"c5ea87ff-a22d-4952-8409-ae584643f659","Type":"ContainerDied","Data":"8ec20bf5affb0cce1182e436a0682900148e476b1aa7f5d2c72ea3b88b67ccee"} Apr 23 16:38:33.084199 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:33.083963 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6674d685f7-kkh7x" event={"ID":"c5ea87ff-a22d-4952-8409-ae584643f659","Type":"ContainerDied","Data":"d8ea10106081da33f9b63522cf345c76c7513d8e41ddaddac559d9befb093951"} Apr 23 16:38:33.084199 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:33.083986 2569 scope.go:117] "RemoveContainer" containerID="8ec20bf5affb0cce1182e436a0682900148e476b1aa7f5d2c72ea3b88b67ccee" Apr 23 16:38:33.092612 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:33.092591 2569 scope.go:117] "RemoveContainer" containerID="8ec20bf5affb0cce1182e436a0682900148e476b1aa7f5d2c72ea3b88b67ccee" Apr 23 16:38:33.092865 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:38:33.092846 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ec20bf5affb0cce1182e436a0682900148e476b1aa7f5d2c72ea3b88b67ccee\": container with ID starting with 8ec20bf5affb0cce1182e436a0682900148e476b1aa7f5d2c72ea3b88b67ccee not found: ID does not exist" containerID="8ec20bf5affb0cce1182e436a0682900148e476b1aa7f5d2c72ea3b88b67ccee" Apr 23 16:38:33.092930 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:33.092872 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ec20bf5affb0cce1182e436a0682900148e476b1aa7f5d2c72ea3b88b67ccee"} err="failed to get container status \"8ec20bf5affb0cce1182e436a0682900148e476b1aa7f5d2c72ea3b88b67ccee\": rpc error: code = NotFound desc = could not find container \"8ec20bf5affb0cce1182e436a0682900148e476b1aa7f5d2c72ea3b88b67ccee\": container with ID starting with 8ec20bf5affb0cce1182e436a0682900148e476b1aa7f5d2c72ea3b88b67ccee not found: ID does not exist" Apr 23 16:38:33.109874 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:33.109851 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6674d685f7-kkh7x"] Apr 23 16:38:33.113105 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:33.113083 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6674d685f7-kkh7x"] Apr 23 16:38:33.303426 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:33.303384 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5ea87ff-a22d-4952-8409-ae584643f659" path="/var/lib/kubelet/pods/c5ea87ff-a22d-4952-8409-ae584643f659/volumes" Apr 23 16:38:41.309982 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:41.309917 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-56dbdcb664-wth5t" podUID="cddb0dbb-053c-4682-bdea-dbe11df4a62e" containerName="console" containerID="cri-o://83f34f114c87b79654a0f78e46399b8cc62286d6d903c68e251af04fa5785af0" gracePeriod=15 Apr 23 16:38:41.550354 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:41.550328 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56dbdcb664-wth5t_cddb0dbb-053c-4682-bdea-dbe11df4a62e/console/0.log" Apr 23 16:38:41.550473 ip-10-0-128-102 
kubenswrapper[2569]: I0423 16:38:41.550394 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56dbdcb664-wth5t" Apr 23 16:38:41.691042 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:41.691008 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rczvw\" (UniqueName: \"kubernetes.io/projected/cddb0dbb-053c-4682-bdea-dbe11df4a62e-kube-api-access-rczvw\") pod \"cddb0dbb-053c-4682-bdea-dbe11df4a62e\" (UID: \"cddb0dbb-053c-4682-bdea-dbe11df4a62e\") " Apr 23 16:38:41.691205 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:41.691058 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cddb0dbb-053c-4682-bdea-dbe11df4a62e-trusted-ca-bundle\") pod \"cddb0dbb-053c-4682-bdea-dbe11df4a62e\" (UID: \"cddb0dbb-053c-4682-bdea-dbe11df4a62e\") " Apr 23 16:38:41.691205 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:41.691131 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cddb0dbb-053c-4682-bdea-dbe11df4a62e-console-serving-cert\") pod \"cddb0dbb-053c-4682-bdea-dbe11df4a62e\" (UID: \"cddb0dbb-053c-4682-bdea-dbe11df4a62e\") " Apr 23 16:38:41.691205 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:41.691175 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cddb0dbb-053c-4682-bdea-dbe11df4a62e-oauth-serving-cert\") pod \"cddb0dbb-053c-4682-bdea-dbe11df4a62e\" (UID: \"cddb0dbb-053c-4682-bdea-dbe11df4a62e\") " Apr 23 16:38:41.691338 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:41.691219 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cddb0dbb-053c-4682-bdea-dbe11df4a62e-console-oauth-config\") pod \"cddb0dbb-053c-4682-bdea-dbe11df4a62e\" (UID: \"cddb0dbb-053c-4682-bdea-dbe11df4a62e\") " Apr 23 16:38:41.691338 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:41.691247 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cddb0dbb-053c-4682-bdea-dbe11df4a62e-console-config\") pod \"cddb0dbb-053c-4682-bdea-dbe11df4a62e\" (UID: \"cddb0dbb-053c-4682-bdea-dbe11df4a62e\") " Apr 23 16:38:41.691338 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:41.691298 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cddb0dbb-053c-4682-bdea-dbe11df4a62e-service-ca\") pod \"cddb0dbb-053c-4682-bdea-dbe11df4a62e\" (UID: \"cddb0dbb-053c-4682-bdea-dbe11df4a62e\") " Apr 23 16:38:41.691652 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:41.691601 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cddb0dbb-053c-4682-bdea-dbe11df4a62e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "cddb0dbb-053c-4682-bdea-dbe11df4a62e" (UID: "cddb0dbb-053c-4682-bdea-dbe11df4a62e"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:38:41.691805 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:41.691744 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cddb0dbb-053c-4682-bdea-dbe11df4a62e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "cddb0dbb-053c-4682-bdea-dbe11df4a62e" (UID: "cddb0dbb-053c-4682-bdea-dbe11df4a62e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:38:41.691899 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:41.691867 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cddb0dbb-053c-4682-bdea-dbe11df4a62e-console-config" (OuterVolumeSpecName: "console-config") pod "cddb0dbb-053c-4682-bdea-dbe11df4a62e" (UID: "cddb0dbb-053c-4682-bdea-dbe11df4a62e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:38:41.691997 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:41.691924 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cddb0dbb-053c-4682-bdea-dbe11df4a62e-service-ca" (OuterVolumeSpecName: "service-ca") pod "cddb0dbb-053c-4682-bdea-dbe11df4a62e" (UID: "cddb0dbb-053c-4682-bdea-dbe11df4a62e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:38:41.693511 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:41.693460 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cddb0dbb-053c-4682-bdea-dbe11df4a62e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "cddb0dbb-053c-4682-bdea-dbe11df4a62e" (UID: "cddb0dbb-053c-4682-bdea-dbe11df4a62e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:38:41.693702 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:41.693644 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cddb0dbb-053c-4682-bdea-dbe11df4a62e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "cddb0dbb-053c-4682-bdea-dbe11df4a62e" (UID: "cddb0dbb-053c-4682-bdea-dbe11df4a62e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:38:41.693702 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:41.693682 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cddb0dbb-053c-4682-bdea-dbe11df4a62e-kube-api-access-rczvw" (OuterVolumeSpecName: "kube-api-access-rczvw") pod "cddb0dbb-053c-4682-bdea-dbe11df4a62e" (UID: "cddb0dbb-053c-4682-bdea-dbe11df4a62e"). InnerVolumeSpecName "kube-api-access-rczvw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:38:41.792193 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:41.792162 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cddb0dbb-053c-4682-bdea-dbe11df4a62e-oauth-serving-cert\") on node \"ip-10-0-128-102.ec2.internal\" DevicePath \"\"" Apr 23 16:38:41.792193 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:41.792189 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cddb0dbb-053c-4682-bdea-dbe11df4a62e-console-oauth-config\") on node \"ip-10-0-128-102.ec2.internal\" DevicePath \"\"" Apr 23 16:38:41.792193 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:41.792199 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cddb0dbb-053c-4682-bdea-dbe11df4a62e-console-config\") on node \"ip-10-0-128-102.ec2.internal\" DevicePath \"\"" Apr 23 16:38:41.792406 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:41.792208 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cddb0dbb-053c-4682-bdea-dbe11df4a62e-service-ca\") on node \"ip-10-0-128-102.ec2.internal\" DevicePath \"\"" Apr 23 16:38:41.792406 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:41.792217 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rczvw\" (UniqueName: \"kubernetes.io/projected/cddb0dbb-053c-4682-bdea-dbe11df4a62e-kube-api-access-rczvw\") on node \"ip-10-0-128-102.ec2.internal\" DevicePath \"\"" Apr 23 16:38:41.792406 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:41.792225 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cddb0dbb-053c-4682-bdea-dbe11df4a62e-trusted-ca-bundle\") on node \"ip-10-0-128-102.ec2.internal\" DevicePath \"\"" Apr 23 16:38:41.792406 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:41.792234 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cddb0dbb-053c-4682-bdea-dbe11df4a62e-console-serving-cert\") on node \"ip-10-0-128-102.ec2.internal\" DevicePath \"\"" Apr 23 16:38:42.110616 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:42.110588 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56dbdcb664-wth5t_cddb0dbb-053c-4682-bdea-dbe11df4a62e/console/0.log" Apr 23 16:38:42.110790 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:42.110628 2569 generic.go:358] "Generic (PLEG): container finished" podID="cddb0dbb-053c-4682-bdea-dbe11df4a62e" containerID="83f34f114c87b79654a0f78e46399b8cc62286d6d903c68e251af04fa5785af0" exitCode=2 Apr 23 16:38:42.110790 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:42.110710 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-56dbdcb664-wth5t" Apr 23 16:38:42.110790 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:42.110728 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56dbdcb664-wth5t" event={"ID":"cddb0dbb-053c-4682-bdea-dbe11df4a62e","Type":"ContainerDied","Data":"83f34f114c87b79654a0f78e46399b8cc62286d6d903c68e251af04fa5785af0"} Apr 23 16:38:42.110790 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:42.110769 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56dbdcb664-wth5t" event={"ID":"cddb0dbb-053c-4682-bdea-dbe11df4a62e","Type":"ContainerDied","Data":"7633d8932a8ce2f9675fbe0ebe53c5c7602ea70fd5f5288cc231fb23450adf67"} Apr 23 16:38:42.110790 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:42.110786 2569 scope.go:117] "RemoveContainer" containerID="83f34f114c87b79654a0f78e46399b8cc62286d6d903c68e251af04fa5785af0" Apr 23 16:38:42.118821 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:42.118801 2569 scope.go:117] "RemoveContainer" containerID="83f34f114c87b79654a0f78e46399b8cc62286d6d903c68e251af04fa5785af0" Apr 23 16:38:42.119069 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:38:42.119049 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83f34f114c87b79654a0f78e46399b8cc62286d6d903c68e251af04fa5785af0\": container with ID starting with 83f34f114c87b79654a0f78e46399b8cc62286d6d903c68e251af04fa5785af0 not found: ID does not exist" containerID="83f34f114c87b79654a0f78e46399b8cc62286d6d903c68e251af04fa5785af0" Apr 23 16:38:42.119146 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:42.119075 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83f34f114c87b79654a0f78e46399b8cc62286d6d903c68e251af04fa5785af0"} err="failed to get container status \"83f34f114c87b79654a0f78e46399b8cc62286d6d903c68e251af04fa5785af0\": rpc error: code = NotFound desc = could not find container \"83f34f114c87b79654a0f78e46399b8cc62286d6d903c68e251af04fa5785af0\": container with ID starting with 83f34f114c87b79654a0f78e46399b8cc62286d6d903c68e251af04fa5785af0 not found: ID does not exist" Apr 23 16:38:42.131202 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:42.131183 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-56dbdcb664-wth5t"] Apr 23 16:38:42.134784 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:42.134761 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-56dbdcb664-wth5t"] Apr 23 16:38:43.303997 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:38:43.303952 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cddb0dbb-053c-4682-bdea-dbe11df4a62e" path="/var/lib/kubelet/pods/cddb0dbb-053c-4682-bdea-dbe11df4a62e/volumes" Apr 23 16:39:01.770647 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:01.770612 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:01.787411 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:01.787381 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:02.183895 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:02.183865 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:19.758601 ip-10-0-128-102 kubenswrapper[2569]: I0423 
16:39:19.758555 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 16:39:19.759357 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:19.759297 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="ad39d609-b577-48de-b485-a28556d6f1a1" containerName="kube-rbac-proxy-thanos" containerID="cri-o://4c29b67348cbb86fa1d969236980dadbdbd1b00fb44ecfcb6ac6ed1533fcb73b" gracePeriod=600 Apr 23 16:39:19.759907 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:19.759877 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="ad39d609-b577-48de-b485-a28556d6f1a1" containerName="kube-rbac-proxy-web" containerID="cri-o://8b6f78c6930bce920e2204b25adbb35a3cf9847f528533b19d7b02ffebf8b891" gracePeriod=600 Apr 23 16:39:19.760032 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:19.759998 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="ad39d609-b577-48de-b485-a28556d6f1a1" containerName="kube-rbac-proxy" containerID="cri-o://8ad48c584b2e15f7b64e57c87b4665a3807f021c680932e118861919623029e3" gracePeriod=600 Apr 23 16:39:19.760144 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:19.760109 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="ad39d609-b577-48de-b485-a28556d6f1a1" containerName="thanos-sidecar" containerID="cri-o://25d4812ab1484ebbcc32f2c601f3a0248754354728f191a3e5103cff55d956bb" gracePeriod=600 Apr 23 16:39:19.760211 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:19.760189 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="ad39d609-b577-48de-b485-a28556d6f1a1" containerName="config-reloader" containerID="cri-o://93bfa28b8e0fa7c089f896274532039b6eb07a597d8b251c368f70aaadc161f2" gracePeriod=600 Apr 23 16:39:19.761957 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:19.761734 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="ad39d609-b577-48de-b485-a28556d6f1a1" containerName="prometheus" containerID="cri-o://cccf5db571ae535719e3c5b597ee0ec050c1fe97ec686e3e3c77319f8747430c" gracePeriod=600 Apr 23 16:39:20.017395 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.017319 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:20.118365 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.118331 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-secret-grpc-tls\") pod \"ad39d609-b577-48de-b485-a28556d6f1a1\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " Apr 23 16:39:20.118365 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.118368 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-thanos-prometheus-http-client-file\") pod \"ad39d609-b577-48de-b485-a28556d6f1a1\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " Apr 23 16:39:20.118593 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.118407 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"ad39d609-b577-48de-b485-a28556d6f1a1\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " Apr 23 16:39:20.118593 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.118428 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"ad39d609-b577-48de-b485-a28556d6f1a1\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " Apr 23 16:39:20.118593 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.118456 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcf9r\" (UniqueName: \"kubernetes.io/projected/ad39d609-b577-48de-b485-a28556d6f1a1-kube-api-access-wcf9r\") pod \"ad39d609-b577-48de-b485-a28556d6f1a1\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " Apr 23 16:39:20.118593 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.118472 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ad39d609-b577-48de-b485-a28556d6f1a1-prometheus-k8s-rulefiles-0\") pod \"ad39d609-b577-48de-b485-a28556d6f1a1\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " Apr 23 16:39:20.118593 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.118499 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad39d609-b577-48de-b485-a28556d6f1a1-configmap-kubelet-serving-ca-bundle\") pod \"ad39d609-b577-48de-b485-a28556d6f1a1\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " Apr 23 16:39:20.118593 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.118546 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ad39d609-b577-48de-b485-a28556d6f1a1-prometheus-k8s-db\") pod \"ad39d609-b577-48de-b485-a28556d6f1a1\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " Apr 23 16:39:20.119819 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.119152 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad39d609-b577-48de-b485-a28556d6f1a1-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: 
"configmap-kubelet-serving-ca-bundle") pod "ad39d609-b577-48de-b485-a28556d6f1a1" (UID: "ad39d609-b577-48de-b485-a28556d6f1a1"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:39:20.119819 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.119249 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ad39d609-b577-48de-b485-a28556d6f1a1-config-out\") pod \"ad39d609-b577-48de-b485-a28556d6f1a1\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " Apr 23 16:39:20.119819 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.119288 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-secret-metrics-client-certs\") pod \"ad39d609-b577-48de-b485-a28556d6f1a1\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " Apr 23 16:39:20.119819 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.119358 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ad39d609-b577-48de-b485-a28556d6f1a1-configmap-metrics-client-ca\") pod \"ad39d609-b577-48de-b485-a28556d6f1a1\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " Apr 23 16:39:20.119819 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.119382 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-web-config\") pod \"ad39d609-b577-48de-b485-a28556d6f1a1\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " Apr 23 16:39:20.119819 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.119406 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad39d609-b577-48de-b485-a28556d6f1a1-prometheus-trusted-ca-bundle\") pod \"ad39d609-b577-48de-b485-a28556d6f1a1\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " Apr 23 16:39:20.119819 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.119449 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ad39d609-b577-48de-b485-a28556d6f1a1-tls-assets\") pod \"ad39d609-b577-48de-b485-a28556d6f1a1\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " Apr 23 16:39:20.119819 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.119478 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-secret-kube-rbac-proxy\") pod \"ad39d609-b577-48de-b485-a28556d6f1a1\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " Apr 23 16:39:20.119819 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.119511 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad39d609-b577-48de-b485-a28556d6f1a1-configmap-serving-certs-ca-bundle\") pod \"ad39d609-b577-48de-b485-a28556d6f1a1\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " Apr 23 16:39:20.119819 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.119536 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-config\") pod \"ad39d609-b577-48de-b485-a28556d6f1a1\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " Apr 23 16:39:20.119819 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.119553 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad39d609-b577-48de-b485-a28556d6f1a1-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "ad39d609-b577-48de-b485-a28556d6f1a1" (UID: "ad39d609-b577-48de-b485-a28556d6f1a1"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:39:20.119819 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.119560 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-secret-prometheus-k8s-tls\") pod \"ad39d609-b577-48de-b485-a28556d6f1a1\" (UID: \"ad39d609-b577-48de-b485-a28556d6f1a1\") " Apr 23 16:39:20.120675 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.119899 2569 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad39d609-b577-48de-b485-a28556d6f1a1-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-128-102.ec2.internal\" DevicePath \"\"" Apr 23 16:39:20.120675 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.119921 2569 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ad39d609-b577-48de-b485-a28556d6f1a1-prometheus-k8s-db\") on node \"ip-10-0-128-102.ec2.internal\" DevicePath \"\"" Apr 23 16:39:20.120675 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.120228 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad39d609-b577-48de-b485-a28556d6f1a1-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "ad39d609-b577-48de-b485-a28556d6f1a1" (UID: "ad39d609-b577-48de-b485-a28556d6f1a1"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:39:20.120675 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.120246 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad39d609-b577-48de-b485-a28556d6f1a1-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "ad39d609-b577-48de-b485-a28556d6f1a1" (UID: "ad39d609-b577-48de-b485-a28556d6f1a1"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:39:20.121208 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.121121 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "ad39d609-b577-48de-b485-a28556d6f1a1" (UID: "ad39d609-b577-48de-b485-a28556d6f1a1"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:39:20.121312 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.121283 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad39d609-b577-48de-b485-a28556d6f1a1-kube-api-access-wcf9r" (OuterVolumeSpecName: "kube-api-access-wcf9r") pod "ad39d609-b577-48de-b485-a28556d6f1a1" (UID: "ad39d609-b577-48de-b485-a28556d6f1a1"). InnerVolumeSpecName "kube-api-access-wcf9r". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:39:20.121711 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.121388 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "ad39d609-b577-48de-b485-a28556d6f1a1" (UID: "ad39d609-b577-48de-b485-a28556d6f1a1"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:39:20.121711 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.121647 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "ad39d609-b577-48de-b485-a28556d6f1a1" (UID: "ad39d609-b577-48de-b485-a28556d6f1a1"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:39:20.121711 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.121684 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad39d609-b577-48de-b485-a28556d6f1a1-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "ad39d609-b577-48de-b485-a28556d6f1a1" (UID: "ad39d609-b577-48de-b485-a28556d6f1a1"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:39:20.122058 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.122027 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad39d609-b577-48de-b485-a28556d6f1a1-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "ad39d609-b577-48de-b485-a28556d6f1a1" (UID: "ad39d609-b577-48de-b485-a28556d6f1a1"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:39:20.122252 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.122227 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "ad39d609-b577-48de-b485-a28556d6f1a1" (UID: "ad39d609-b577-48de-b485-a28556d6f1a1"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:39:20.122535 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.122483 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "ad39d609-b577-48de-b485-a28556d6f1a1" (UID: "ad39d609-b577-48de-b485-a28556d6f1a1"). InnerVolumeSpecName "secret-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:39:20.122764 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.122728 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad39d609-b577-48de-b485-a28556d6f1a1-config-out" (OuterVolumeSpecName: "config-out") pod "ad39d609-b577-48de-b485-a28556d6f1a1" (UID: "ad39d609-b577-48de-b485-a28556d6f1a1"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:39:20.122855 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.122816 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "ad39d609-b577-48de-b485-a28556d6f1a1" (UID: "ad39d609-b577-48de-b485-a28556d6f1a1"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:39:20.123114 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.123090 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad39d609-b577-48de-b485-a28556d6f1a1-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "ad39d609-b577-48de-b485-a28556d6f1a1" (UID: "ad39d609-b577-48de-b485-a28556d6f1a1"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:39:20.123654 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.123624 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-config" (OuterVolumeSpecName: "config") pod "ad39d609-b577-48de-b485-a28556d6f1a1" (UID: "ad39d609-b577-48de-b485-a28556d6f1a1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:39:20.123941 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.123920 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "ad39d609-b577-48de-b485-a28556d6f1a1" (UID: "ad39d609-b577-48de-b485-a28556d6f1a1"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:39:20.134042 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.134021 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-web-config" (OuterVolumeSpecName: "web-config") pod "ad39d609-b577-48de-b485-a28556d6f1a1" (UID: "ad39d609-b577-48de-b485-a28556d6f1a1"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:39:20.218214 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.218183 2569 generic.go:358] "Generic (PLEG): container finished" podID="ad39d609-b577-48de-b485-a28556d6f1a1" containerID="4c29b67348cbb86fa1d969236980dadbdbd1b00fb44ecfcb6ac6ed1533fcb73b" exitCode=0 Apr 23 16:39:20.218214 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.218209 2569 generic.go:358] "Generic (PLEG): container finished" podID="ad39d609-b577-48de-b485-a28556d6f1a1" containerID="8ad48c584b2e15f7b64e57c87b4665a3807f021c680932e118861919623029e3" exitCode=0 Apr 23 16:39:20.218214 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.218216 2569 generic.go:358] "Generic (PLEG): container finished" podID="ad39d609-b577-48de-b485-a28556d6f1a1" containerID="8b6f78c6930bce920e2204b25adbb35a3cf9847f528533b19d7b02ffebf8b891" exitCode=0 Apr 23 16:39:20.218214 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.218222 2569 generic.go:358] "Generic (PLEG): container finished" podID="ad39d609-b577-48de-b485-a28556d6f1a1" containerID="25d4812ab1484ebbcc32f2c601f3a0248754354728f191a3e5103cff55d956bb" exitCode=0 Apr 23 16:39:20.218531 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.218227 2569 generic.go:358] "Generic (PLEG): container finished" podID="ad39d609-b577-48de-b485-a28556d6f1a1" containerID="93bfa28b8e0fa7c089f896274532039b6eb07a597d8b251c368f70aaadc161f2" exitCode=0 Apr 23 16:39:20.218531 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.218233 2569 generic.go:358] "Generic (PLEG): container finished" podID="ad39d609-b577-48de-b485-a28556d6f1a1" containerID="cccf5db571ae535719e3c5b597ee0ec050c1fe97ec686e3e3c77319f8747430c" exitCode=0 Apr 23 16:39:20.218531 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.218282 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:20.218531 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.218276 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ad39d609-b577-48de-b485-a28556d6f1a1","Type":"ContainerDied","Data":"4c29b67348cbb86fa1d969236980dadbdbd1b00fb44ecfcb6ac6ed1533fcb73b"} Apr 23 16:39:20.218531 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.218399 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ad39d609-b577-48de-b485-a28556d6f1a1","Type":"ContainerDied","Data":"8ad48c584b2e15f7b64e57c87b4665a3807f021c680932e118861919623029e3"} Apr 23 16:39:20.218531 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.218417 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ad39d609-b577-48de-b485-a28556d6f1a1","Type":"ContainerDied","Data":"8b6f78c6930bce920e2204b25adbb35a3cf9847f528533b19d7b02ffebf8b891"} Apr 23 16:39:20.218531 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.218434 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ad39d609-b577-48de-b485-a28556d6f1a1","Type":"ContainerDied","Data":"25d4812ab1484ebbcc32f2c601f3a0248754354728f191a3e5103cff55d956bb"} Apr 23 16:39:20.218531 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.218448 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ad39d609-b577-48de-b485-a28556d6f1a1","Type":"ContainerDied","Data":"93bfa28b8e0fa7c089f896274532039b6eb07a597d8b251c368f70aaadc161f2"} Apr 23 16:39:20.218531 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.218460 2569 scope.go:117] "RemoveContainer" containerID="4c29b67348cbb86fa1d969236980dadbdbd1b00fb44ecfcb6ac6ed1533fcb73b" Apr 23 16:39:20.218531 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.218463 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ad39d609-b577-48de-b485-a28556d6f1a1","Type":"ContainerDied","Data":"cccf5db571ae535719e3c5b597ee0ec050c1fe97ec686e3e3c77319f8747430c"} Apr 23 16:39:20.218531 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.218479 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ad39d609-b577-48de-b485-a28556d6f1a1","Type":"ContainerDied","Data":"6fa9351bdc924b9d8d2a4e7cd9dc5028ed43203c0d042e344c7421c133210ccd"} Apr 23 16:39:20.220336 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.220315 2569 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ad39d609-b577-48de-b485-a28556d6f1a1-configmap-metrics-client-ca\") on node \"ip-10-0-128-102.ec2.internal\" DevicePath \"\"" Apr 23 16:39:20.220336 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.220337 2569 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-web-config\") on node \"ip-10-0-128-102.ec2.internal\" DevicePath \"\"" Apr 23 16:39:20.220471 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.220348 2569 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad39d609-b577-48de-b485-a28556d6f1a1-prometheus-trusted-ca-bundle\") on node \"ip-10-0-128-102.ec2.internal\" DevicePath 
\"\"" Apr 23 16:39:20.220471 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.220357 2569 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ad39d609-b577-48de-b485-a28556d6f1a1-tls-assets\") on node \"ip-10-0-128-102.ec2.internal\" DevicePath \"\"" Apr 23 16:39:20.220471 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.220371 2569 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-secret-kube-rbac-proxy\") on node \"ip-10-0-128-102.ec2.internal\" DevicePath \"\"" Apr 23 16:39:20.220471 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.220384 2569 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad39d609-b577-48de-b485-a28556d6f1a1-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-128-102.ec2.internal\" DevicePath \"\"" Apr 23 16:39:20.220471 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.220393 2569 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-config\") on node \"ip-10-0-128-102.ec2.internal\" DevicePath \"\"" Apr 23 16:39:20.220471 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.220401 2569 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-secret-prometheus-k8s-tls\") on node \"ip-10-0-128-102.ec2.internal\" DevicePath \"\"" Apr 23 16:39:20.220471 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.220414 2569 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-secret-grpc-tls\") on node \"ip-10-0-128-102.ec2.internal\" DevicePath \"\"" Apr 23 16:39:20.220471 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.220426 2569 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-thanos-prometheus-http-client-file\") on node \"ip-10-0-128-102.ec2.internal\" DevicePath \"\"" Apr 23 16:39:20.220471 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.220434 2569 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-128-102.ec2.internal\" DevicePath \"\"" Apr 23 16:39:20.220471 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.220443 2569 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-128-102.ec2.internal\" DevicePath \"\"" Apr 23 16:39:20.220471 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.220452 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wcf9r\" (UniqueName: \"kubernetes.io/projected/ad39d609-b577-48de-b485-a28556d6f1a1-kube-api-access-wcf9r\") on node \"ip-10-0-128-102.ec2.internal\" DevicePath \"\"" Apr 23 16:39:20.220471 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.220461 2569 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/ad39d609-b577-48de-b485-a28556d6f1a1-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-128-102.ec2.internal\" DevicePath \"\"" Apr 23 16:39:20.220471 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.220470 2569 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ad39d609-b577-48de-b485-a28556d6f1a1-config-out\") on node \"ip-10-0-128-102.ec2.internal\" DevicePath \"\"" Apr 23 16:39:20.220471 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.220478 2569 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ad39d609-b577-48de-b485-a28556d6f1a1-secret-metrics-client-certs\") on node \"ip-10-0-128-102.ec2.internal\" DevicePath \"\"" Apr 23 16:39:20.226820 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.226638 2569 scope.go:117] "RemoveContainer" containerID="8ad48c584b2e15f7b64e57c87b4665a3807f021c680932e118861919623029e3" Apr 23 16:39:20.233390 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.233373 2569 scope.go:117] "RemoveContainer" containerID="8b6f78c6930bce920e2204b25adbb35a3cf9847f528533b19d7b02ffebf8b891" Apr 23 16:39:20.239494 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.239478 2569 scope.go:117] "RemoveContainer" containerID="25d4812ab1484ebbcc32f2c601f3a0248754354728f191a3e5103cff55d956bb" Apr 23 16:39:20.243174 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.243153 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 16:39:20.245811 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.245797 2569 scope.go:117] "RemoveContainer" containerID="93bfa28b8e0fa7c089f896274532039b6eb07a597d8b251c368f70aaadc161f2" Apr 23 16:39:20.250192 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.250172 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 16:39:20.252388 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.252367 2569 scope.go:117] "RemoveContainer" containerID="cccf5db571ae535719e3c5b597ee0ec050c1fe97ec686e3e3c77319f8747430c" Apr 23 16:39:20.258908 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.258892 2569 scope.go:117] "RemoveContainer" containerID="185faac1df253a6e5a74de092e85abfde931063b4b56507d27764ec425a2124b" Apr 23 16:39:20.264837 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.264821 2569 scope.go:117] "RemoveContainer" containerID="4c29b67348cbb86fa1d969236980dadbdbd1b00fb44ecfcb6ac6ed1533fcb73b" Apr 23 16:39:20.265083 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:39:20.265066 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c29b67348cbb86fa1d969236980dadbdbd1b00fb44ecfcb6ac6ed1533fcb73b\": container with ID starting with 4c29b67348cbb86fa1d969236980dadbdbd1b00fb44ecfcb6ac6ed1533fcb73b not found: ID does not exist" containerID="4c29b67348cbb86fa1d969236980dadbdbd1b00fb44ecfcb6ac6ed1533fcb73b" Apr 23 16:39:20.265127 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.265092 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c29b67348cbb86fa1d969236980dadbdbd1b00fb44ecfcb6ac6ed1533fcb73b"} err="failed to get container status \"4c29b67348cbb86fa1d969236980dadbdbd1b00fb44ecfcb6ac6ed1533fcb73b\": rpc error: code = NotFound desc = could not find container \"4c29b67348cbb86fa1d969236980dadbdbd1b00fb44ecfcb6ac6ed1533fcb73b\": container with ID starting with 
4c29b67348cbb86fa1d969236980dadbdbd1b00fb44ecfcb6ac6ed1533fcb73b not found: ID does not exist" Apr 23 16:39:20.265127 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.265111 2569 scope.go:117] "RemoveContainer" containerID="8ad48c584b2e15f7b64e57c87b4665a3807f021c680932e118861919623029e3" Apr 23 16:39:20.265343 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:39:20.265322 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ad48c584b2e15f7b64e57c87b4665a3807f021c680932e118861919623029e3\": container with ID starting with 8ad48c584b2e15f7b64e57c87b4665a3807f021c680932e118861919623029e3 not found: ID does not exist" containerID="8ad48c584b2e15f7b64e57c87b4665a3807f021c680932e118861919623029e3" Apr 23 16:39:20.265379 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.265349 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ad48c584b2e15f7b64e57c87b4665a3807f021c680932e118861919623029e3"} err="failed to get container status \"8ad48c584b2e15f7b64e57c87b4665a3807f021c680932e118861919623029e3\": rpc error: code = NotFound desc = could not find container \"8ad48c584b2e15f7b64e57c87b4665a3807f021c680932e118861919623029e3\": container with ID starting with 8ad48c584b2e15f7b64e57c87b4665a3807f021c680932e118861919623029e3 not found: ID does not exist" Apr 23 16:39:20.265379 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.265366 2569 scope.go:117] "RemoveContainer" containerID="8b6f78c6930bce920e2204b25adbb35a3cf9847f528533b19d7b02ffebf8b891" Apr 23 16:39:20.265588 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:39:20.265571 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b6f78c6930bce920e2204b25adbb35a3cf9847f528533b19d7b02ffebf8b891\": container with ID starting with 8b6f78c6930bce920e2204b25adbb35a3cf9847f528533b19d7b02ffebf8b891 not found: ID does not exist" containerID="8b6f78c6930bce920e2204b25adbb35a3cf9847f528533b19d7b02ffebf8b891" Apr 23 16:39:20.265638 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.265592 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b6f78c6930bce920e2204b25adbb35a3cf9847f528533b19d7b02ffebf8b891"} err="failed to get container status \"8b6f78c6930bce920e2204b25adbb35a3cf9847f528533b19d7b02ffebf8b891\": rpc error: code = NotFound desc = could not find container \"8b6f78c6930bce920e2204b25adbb35a3cf9847f528533b19d7b02ffebf8b891\": container with ID starting with 8b6f78c6930bce920e2204b25adbb35a3cf9847f528533b19d7b02ffebf8b891 not found: ID does not exist" Apr 23 16:39:20.265638 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.265608 2569 scope.go:117] "RemoveContainer" containerID="25d4812ab1484ebbcc32f2c601f3a0248754354728f191a3e5103cff55d956bb" Apr 23 16:39:20.265833 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:39:20.265814 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25d4812ab1484ebbcc32f2c601f3a0248754354728f191a3e5103cff55d956bb\": container with ID starting with 25d4812ab1484ebbcc32f2c601f3a0248754354728f191a3e5103cff55d956bb not found: ID does not exist" containerID="25d4812ab1484ebbcc32f2c601f3a0248754354728f191a3e5103cff55d956bb" Apr 23 16:39:20.265887 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.265836 2569 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"25d4812ab1484ebbcc32f2c601f3a0248754354728f191a3e5103cff55d956bb"} err="failed to get container status \"25d4812ab1484ebbcc32f2c601f3a0248754354728f191a3e5103cff55d956bb\": rpc error: code = NotFound desc = could not find container \"25d4812ab1484ebbcc32f2c601f3a0248754354728f191a3e5103cff55d956bb\": container with ID starting with 25d4812ab1484ebbcc32f2c601f3a0248754354728f191a3e5103cff55d956bb not found: ID does not exist" Apr 23 16:39:20.265887 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.265850 2569 scope.go:117] "RemoveContainer" containerID="93bfa28b8e0fa7c089f896274532039b6eb07a597d8b251c368f70aaadc161f2" Apr 23 16:39:20.266050 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:39:20.266036 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93bfa28b8e0fa7c089f896274532039b6eb07a597d8b251c368f70aaadc161f2\": container with ID starting with 93bfa28b8e0fa7c089f896274532039b6eb07a597d8b251c368f70aaadc161f2 not found: ID does not exist" containerID="93bfa28b8e0fa7c089f896274532039b6eb07a597d8b251c368f70aaadc161f2" Apr 23 16:39:20.266090 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.266054 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93bfa28b8e0fa7c089f896274532039b6eb07a597d8b251c368f70aaadc161f2"} err="failed to get container status \"93bfa28b8e0fa7c089f896274532039b6eb07a597d8b251c368f70aaadc161f2\": rpc error: code = NotFound desc = could not find container \"93bfa28b8e0fa7c089f896274532039b6eb07a597d8b251c368f70aaadc161f2\": container with ID starting with 93bfa28b8e0fa7c089f896274532039b6eb07a597d8b251c368f70aaadc161f2 not found: ID does not exist" Apr 23 16:39:20.266090 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.266067 2569 scope.go:117] "RemoveContainer" containerID="cccf5db571ae535719e3c5b597ee0ec050c1fe97ec686e3e3c77319f8747430c" Apr 23 16:39:20.266280 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:39:20.266265 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cccf5db571ae535719e3c5b597ee0ec050c1fe97ec686e3e3c77319f8747430c\": container with ID starting with cccf5db571ae535719e3c5b597ee0ec050c1fe97ec686e3e3c77319f8747430c not found: ID does not exist" containerID="cccf5db571ae535719e3c5b597ee0ec050c1fe97ec686e3e3c77319f8747430c" Apr 23 16:39:20.266316 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.266288 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cccf5db571ae535719e3c5b597ee0ec050c1fe97ec686e3e3c77319f8747430c"} err="failed to get container status \"cccf5db571ae535719e3c5b597ee0ec050c1fe97ec686e3e3c77319f8747430c\": rpc error: code = NotFound desc = could not find container \"cccf5db571ae535719e3c5b597ee0ec050c1fe97ec686e3e3c77319f8747430c\": container with ID starting with cccf5db571ae535719e3c5b597ee0ec050c1fe97ec686e3e3c77319f8747430c not found: ID does not exist" Apr 23 16:39:20.266316 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.266302 2569 scope.go:117] "RemoveContainer" containerID="185faac1df253a6e5a74de092e85abfde931063b4b56507d27764ec425a2124b" Apr 23 16:39:20.266502 ip-10-0-128-102 kubenswrapper[2569]: E0423 16:39:20.266488 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"185faac1df253a6e5a74de092e85abfde931063b4b56507d27764ec425a2124b\": container with ID starting 
with 185faac1df253a6e5a74de092e85abfde931063b4b56507d27764ec425a2124b not found: ID does not exist" containerID="185faac1df253a6e5a74de092e85abfde931063b4b56507d27764ec425a2124b" Apr 23 16:39:20.266541 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.266505 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"185faac1df253a6e5a74de092e85abfde931063b4b56507d27764ec425a2124b"} err="failed to get container status \"185faac1df253a6e5a74de092e85abfde931063b4b56507d27764ec425a2124b\": rpc error: code = NotFound desc = could not find container \"185faac1df253a6e5a74de092e85abfde931063b4b56507d27764ec425a2124b\": container with ID starting with 185faac1df253a6e5a74de092e85abfde931063b4b56507d27764ec425a2124b not found: ID does not exist" Apr 23 16:39:20.266541 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.266521 2569 scope.go:117] "RemoveContainer" containerID="4c29b67348cbb86fa1d969236980dadbdbd1b00fb44ecfcb6ac6ed1533fcb73b" Apr 23 16:39:20.266742 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.266722 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c29b67348cbb86fa1d969236980dadbdbd1b00fb44ecfcb6ac6ed1533fcb73b"} err="failed to get container status \"4c29b67348cbb86fa1d969236980dadbdbd1b00fb44ecfcb6ac6ed1533fcb73b\": rpc error: code = NotFound desc = could not find container \"4c29b67348cbb86fa1d969236980dadbdbd1b00fb44ecfcb6ac6ed1533fcb73b\": container with ID starting with 4c29b67348cbb86fa1d969236980dadbdbd1b00fb44ecfcb6ac6ed1533fcb73b not found: ID does not exist" Apr 23 16:39:20.266787 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.266744 2569 scope.go:117] "RemoveContainer" containerID="8ad48c584b2e15f7b64e57c87b4665a3807f021c680932e118861919623029e3" Apr 23 16:39:20.266957 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.266937 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ad48c584b2e15f7b64e57c87b4665a3807f021c680932e118861919623029e3"} err="failed to get container status \"8ad48c584b2e15f7b64e57c87b4665a3807f021c680932e118861919623029e3\": rpc error: code = NotFound desc = could not find container \"8ad48c584b2e15f7b64e57c87b4665a3807f021c680932e118861919623029e3\": container with ID starting with 8ad48c584b2e15f7b64e57c87b4665a3807f021c680932e118861919623029e3 not found: ID does not exist" Apr 23 16:39:20.267000 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.266958 2569 scope.go:117] "RemoveContainer" containerID="8b6f78c6930bce920e2204b25adbb35a3cf9847f528533b19d7b02ffebf8b891" Apr 23 16:39:20.267150 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.267134 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b6f78c6930bce920e2204b25adbb35a3cf9847f528533b19d7b02ffebf8b891"} err="failed to get container status \"8b6f78c6930bce920e2204b25adbb35a3cf9847f528533b19d7b02ffebf8b891\": rpc error: code = NotFound desc = could not find container \"8b6f78c6930bce920e2204b25adbb35a3cf9847f528533b19d7b02ffebf8b891\": container with ID starting with 8b6f78c6930bce920e2204b25adbb35a3cf9847f528533b19d7b02ffebf8b891 not found: ID does not exist" Apr 23 16:39:20.267198 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.267152 2569 scope.go:117] "RemoveContainer" containerID="25d4812ab1484ebbcc32f2c601f3a0248754354728f191a3e5103cff55d956bb" Apr 23 16:39:20.267360 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.267339 2569 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"25d4812ab1484ebbcc32f2c601f3a0248754354728f191a3e5103cff55d956bb"} err="failed to get container status \"25d4812ab1484ebbcc32f2c601f3a0248754354728f191a3e5103cff55d956bb\": rpc error: code = NotFound desc = could not find container \"25d4812ab1484ebbcc32f2c601f3a0248754354728f191a3e5103cff55d956bb\": container with ID starting with 25d4812ab1484ebbcc32f2c601f3a0248754354728f191a3e5103cff55d956bb not found: ID does not exist" Apr 23 16:39:20.267427 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.267361 2569 scope.go:117] "RemoveContainer" containerID="93bfa28b8e0fa7c089f896274532039b6eb07a597d8b251c368f70aaadc161f2" Apr 23 16:39:20.267563 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.267544 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93bfa28b8e0fa7c089f896274532039b6eb07a597d8b251c368f70aaadc161f2"} err="failed to get container status \"93bfa28b8e0fa7c089f896274532039b6eb07a597d8b251c368f70aaadc161f2\": rpc error: code = NotFound desc = could not find container \"93bfa28b8e0fa7c089f896274532039b6eb07a597d8b251c368f70aaadc161f2\": container with ID starting with 93bfa28b8e0fa7c089f896274532039b6eb07a597d8b251c368f70aaadc161f2 not found: ID does not exist" Apr 23 16:39:20.267609 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.267563 2569 scope.go:117] "RemoveContainer" containerID="cccf5db571ae535719e3c5b597ee0ec050c1fe97ec686e3e3c77319f8747430c" Apr 23 16:39:20.267774 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.267752 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cccf5db571ae535719e3c5b597ee0ec050c1fe97ec686e3e3c77319f8747430c"} err="failed to get container status \"cccf5db571ae535719e3c5b597ee0ec050c1fe97ec686e3e3c77319f8747430c\": rpc error: code = NotFound desc = could not find container \"cccf5db571ae535719e3c5b597ee0ec050c1fe97ec686e3e3c77319f8747430c\": container with ID starting with cccf5db571ae535719e3c5b597ee0ec050c1fe97ec686e3e3c77319f8747430c not found: ID does not exist" Apr 23 16:39:20.267812 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.267775 2569 scope.go:117] "RemoveContainer" containerID="185faac1df253a6e5a74de092e85abfde931063b4b56507d27764ec425a2124b" Apr 23 16:39:20.268001 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.267983 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"185faac1df253a6e5a74de092e85abfde931063b4b56507d27764ec425a2124b"} err="failed to get container status \"185faac1df253a6e5a74de092e85abfde931063b4b56507d27764ec425a2124b\": rpc error: code = NotFound desc = could not find container \"185faac1df253a6e5a74de092e85abfde931063b4b56507d27764ec425a2124b\": container with ID starting with 185faac1df253a6e5a74de092e85abfde931063b4b56507d27764ec425a2124b not found: ID does not exist" Apr 23 16:39:20.268059 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.268002 2569 scope.go:117] "RemoveContainer" containerID="4c29b67348cbb86fa1d969236980dadbdbd1b00fb44ecfcb6ac6ed1533fcb73b" Apr 23 16:39:20.268226 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.268209 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c29b67348cbb86fa1d969236980dadbdbd1b00fb44ecfcb6ac6ed1533fcb73b"} err="failed to get container status \"4c29b67348cbb86fa1d969236980dadbdbd1b00fb44ecfcb6ac6ed1533fcb73b\": rpc error: code = NotFound desc = could not find container 
\"4c29b67348cbb86fa1d969236980dadbdbd1b00fb44ecfcb6ac6ed1533fcb73b\": container with ID starting with 4c29b67348cbb86fa1d969236980dadbdbd1b00fb44ecfcb6ac6ed1533fcb73b not found: ID does not exist" Apr 23 16:39:20.268267 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.268227 2569 scope.go:117] "RemoveContainer" containerID="8ad48c584b2e15f7b64e57c87b4665a3807f021c680932e118861919623029e3" Apr 23 16:39:20.268409 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.268390 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ad48c584b2e15f7b64e57c87b4665a3807f021c680932e118861919623029e3"} err="failed to get container status \"8ad48c584b2e15f7b64e57c87b4665a3807f021c680932e118861919623029e3\": rpc error: code = NotFound desc = could not find container \"8ad48c584b2e15f7b64e57c87b4665a3807f021c680932e118861919623029e3\": container with ID starting with 8ad48c584b2e15f7b64e57c87b4665a3807f021c680932e118861919623029e3 not found: ID does not exist" Apr 23 16:39:20.268450 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.268410 2569 scope.go:117] "RemoveContainer" containerID="8b6f78c6930bce920e2204b25adbb35a3cf9847f528533b19d7b02ffebf8b891" Apr 23 16:39:20.268601 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.268585 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b6f78c6930bce920e2204b25adbb35a3cf9847f528533b19d7b02ffebf8b891"} err="failed to get container status \"8b6f78c6930bce920e2204b25adbb35a3cf9847f528533b19d7b02ffebf8b891\": rpc error: code = NotFound desc = could not find container \"8b6f78c6930bce920e2204b25adbb35a3cf9847f528533b19d7b02ffebf8b891\": container with ID starting with 8b6f78c6930bce920e2204b25adbb35a3cf9847f528533b19d7b02ffebf8b891 not found: ID does not exist" Apr 23 16:39:20.268646 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.268601 2569 scope.go:117] "RemoveContainer" containerID="25d4812ab1484ebbcc32f2c601f3a0248754354728f191a3e5103cff55d956bb" Apr 23 16:39:20.268814 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.268797 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25d4812ab1484ebbcc32f2c601f3a0248754354728f191a3e5103cff55d956bb"} err="failed to get container status \"25d4812ab1484ebbcc32f2c601f3a0248754354728f191a3e5103cff55d956bb\": rpc error: code = NotFound desc = could not find container \"25d4812ab1484ebbcc32f2c601f3a0248754354728f191a3e5103cff55d956bb\": container with ID starting with 25d4812ab1484ebbcc32f2c601f3a0248754354728f191a3e5103cff55d956bb not found: ID does not exist" Apr 23 16:39:20.268814 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.268813 2569 scope.go:117] "RemoveContainer" containerID="93bfa28b8e0fa7c089f896274532039b6eb07a597d8b251c368f70aaadc161f2" Apr 23 16:39:20.269013 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.268991 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93bfa28b8e0fa7c089f896274532039b6eb07a597d8b251c368f70aaadc161f2"} err="failed to get container status \"93bfa28b8e0fa7c089f896274532039b6eb07a597d8b251c368f70aaadc161f2\": rpc error: code = NotFound desc = could not find container \"93bfa28b8e0fa7c089f896274532039b6eb07a597d8b251c368f70aaadc161f2\": container with ID starting with 93bfa28b8e0fa7c089f896274532039b6eb07a597d8b251c368f70aaadc161f2 not found: ID does not exist" Apr 23 16:39:20.269056 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.269014 2569 scope.go:117] 
"RemoveContainer" containerID="cccf5db571ae535719e3c5b597ee0ec050c1fe97ec686e3e3c77319f8747430c" Apr 23 16:39:20.269238 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.269223 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cccf5db571ae535719e3c5b597ee0ec050c1fe97ec686e3e3c77319f8747430c"} err="failed to get container status \"cccf5db571ae535719e3c5b597ee0ec050c1fe97ec686e3e3c77319f8747430c\": rpc error: code = NotFound desc = could not find container \"cccf5db571ae535719e3c5b597ee0ec050c1fe97ec686e3e3c77319f8747430c\": container with ID starting with cccf5db571ae535719e3c5b597ee0ec050c1fe97ec686e3e3c77319f8747430c not found: ID does not exist" Apr 23 16:39:20.269291 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.269240 2569 scope.go:117] "RemoveContainer" containerID="185faac1df253a6e5a74de092e85abfde931063b4b56507d27764ec425a2124b" Apr 23 16:39:20.269455 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.269438 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"185faac1df253a6e5a74de092e85abfde931063b4b56507d27764ec425a2124b"} err="failed to get container status \"185faac1df253a6e5a74de092e85abfde931063b4b56507d27764ec425a2124b\": rpc error: code = NotFound desc = could not find container \"185faac1df253a6e5a74de092e85abfde931063b4b56507d27764ec425a2124b\": container with ID starting with 185faac1df253a6e5a74de092e85abfde931063b4b56507d27764ec425a2124b not found: ID does not exist" Apr 23 16:39:20.269503 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.269455 2569 scope.go:117] "RemoveContainer" containerID="4c29b67348cbb86fa1d969236980dadbdbd1b00fb44ecfcb6ac6ed1533fcb73b" Apr 23 16:39:20.269752 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.269649 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c29b67348cbb86fa1d969236980dadbdbd1b00fb44ecfcb6ac6ed1533fcb73b"} err="failed to get container status \"4c29b67348cbb86fa1d969236980dadbdbd1b00fb44ecfcb6ac6ed1533fcb73b\": rpc error: code = NotFound desc = could not find container \"4c29b67348cbb86fa1d969236980dadbdbd1b00fb44ecfcb6ac6ed1533fcb73b\": container with ID starting with 4c29b67348cbb86fa1d969236980dadbdbd1b00fb44ecfcb6ac6ed1533fcb73b not found: ID does not exist" Apr 23 16:39:20.269831 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.269757 2569 scope.go:117] "RemoveContainer" containerID="8ad48c584b2e15f7b64e57c87b4665a3807f021c680932e118861919623029e3" Apr 23 16:39:20.269997 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.269980 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ad48c584b2e15f7b64e57c87b4665a3807f021c680932e118861919623029e3"} err="failed to get container status \"8ad48c584b2e15f7b64e57c87b4665a3807f021c680932e118861919623029e3\": rpc error: code = NotFound desc = could not find container \"8ad48c584b2e15f7b64e57c87b4665a3807f021c680932e118861919623029e3\": container with ID starting with 8ad48c584b2e15f7b64e57c87b4665a3807f021c680932e118861919623029e3 not found: ID does not exist" Apr 23 16:39:20.270044 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.269997 2569 scope.go:117] "RemoveContainer" containerID="8b6f78c6930bce920e2204b25adbb35a3cf9847f528533b19d7b02ffebf8b891" Apr 23 16:39:20.270210 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.270192 2569 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8b6f78c6930bce920e2204b25adbb35a3cf9847f528533b19d7b02ffebf8b891"} err="failed to get container status \"8b6f78c6930bce920e2204b25adbb35a3cf9847f528533b19d7b02ffebf8b891\": rpc error: code = NotFound desc = could not find container \"8b6f78c6930bce920e2204b25adbb35a3cf9847f528533b19d7b02ffebf8b891\": container with ID starting with 8b6f78c6930bce920e2204b25adbb35a3cf9847f528533b19d7b02ffebf8b891 not found: ID does not exist" Apr 23 16:39:20.270254 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.270211 2569 scope.go:117] "RemoveContainer" containerID="25d4812ab1484ebbcc32f2c601f3a0248754354728f191a3e5103cff55d956bb" Apr 23 16:39:20.270404 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.270387 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25d4812ab1484ebbcc32f2c601f3a0248754354728f191a3e5103cff55d956bb"} err="failed to get container status \"25d4812ab1484ebbcc32f2c601f3a0248754354728f191a3e5103cff55d956bb\": rpc error: code = NotFound desc = could not find container \"25d4812ab1484ebbcc32f2c601f3a0248754354728f191a3e5103cff55d956bb\": container with ID starting with 25d4812ab1484ebbcc32f2c601f3a0248754354728f191a3e5103cff55d956bb not found: ID does not exist" Apr 23 16:39:20.270469 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.270407 2569 scope.go:117] "RemoveContainer" containerID="93bfa28b8e0fa7c089f896274532039b6eb07a597d8b251c368f70aaadc161f2" Apr 23 16:39:20.270618 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.270602 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93bfa28b8e0fa7c089f896274532039b6eb07a597d8b251c368f70aaadc161f2"} err="failed to get container status \"93bfa28b8e0fa7c089f896274532039b6eb07a597d8b251c368f70aaadc161f2\": rpc error: code = NotFound desc = could not find container \"93bfa28b8e0fa7c089f896274532039b6eb07a597d8b251c368f70aaadc161f2\": container with ID starting with 93bfa28b8e0fa7c089f896274532039b6eb07a597d8b251c368f70aaadc161f2 not found: ID does not exist" Apr 23 16:39:20.270694 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.270618 2569 scope.go:117] "RemoveContainer" containerID="cccf5db571ae535719e3c5b597ee0ec050c1fe97ec686e3e3c77319f8747430c" Apr 23 16:39:20.270825 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.270809 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cccf5db571ae535719e3c5b597ee0ec050c1fe97ec686e3e3c77319f8747430c"} err="failed to get container status \"cccf5db571ae535719e3c5b597ee0ec050c1fe97ec686e3e3c77319f8747430c\": rpc error: code = NotFound desc = could not find container \"cccf5db571ae535719e3c5b597ee0ec050c1fe97ec686e3e3c77319f8747430c\": container with ID starting with cccf5db571ae535719e3c5b597ee0ec050c1fe97ec686e3e3c77319f8747430c not found: ID does not exist" Apr 23 16:39:20.270893 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.270827 2569 scope.go:117] "RemoveContainer" containerID="185faac1df253a6e5a74de092e85abfde931063b4b56507d27764ec425a2124b" Apr 23 16:39:20.271046 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.271029 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"185faac1df253a6e5a74de092e85abfde931063b4b56507d27764ec425a2124b"} err="failed to get container status \"185faac1df253a6e5a74de092e85abfde931063b4b56507d27764ec425a2124b\": rpc error: code = NotFound desc = could not find container 
\"185faac1df253a6e5a74de092e85abfde931063b4b56507d27764ec425a2124b\": container with ID starting with 185faac1df253a6e5a74de092e85abfde931063b4b56507d27764ec425a2124b not found: ID does not exist" Apr 23 16:39:20.271092 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.271047 2569 scope.go:117] "RemoveContainer" containerID="4c29b67348cbb86fa1d969236980dadbdbd1b00fb44ecfcb6ac6ed1533fcb73b" Apr 23 16:39:20.271266 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.271250 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c29b67348cbb86fa1d969236980dadbdbd1b00fb44ecfcb6ac6ed1533fcb73b"} err="failed to get container status \"4c29b67348cbb86fa1d969236980dadbdbd1b00fb44ecfcb6ac6ed1533fcb73b\": rpc error: code = NotFound desc = could not find container \"4c29b67348cbb86fa1d969236980dadbdbd1b00fb44ecfcb6ac6ed1533fcb73b\": container with ID starting with 4c29b67348cbb86fa1d969236980dadbdbd1b00fb44ecfcb6ac6ed1533fcb73b not found: ID does not exist" Apr 23 16:39:20.271330 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.271268 2569 scope.go:117] "RemoveContainer" containerID="8ad48c584b2e15f7b64e57c87b4665a3807f021c680932e118861919623029e3" Apr 23 16:39:20.271482 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.271460 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ad48c584b2e15f7b64e57c87b4665a3807f021c680932e118861919623029e3"} err="failed to get container status \"8ad48c584b2e15f7b64e57c87b4665a3807f021c680932e118861919623029e3\": rpc error: code = NotFound desc = could not find container \"8ad48c584b2e15f7b64e57c87b4665a3807f021c680932e118861919623029e3\": container with ID starting with 8ad48c584b2e15f7b64e57c87b4665a3807f021c680932e118861919623029e3 not found: ID does not exist" Apr 23 16:39:20.271482 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.271481 2569 scope.go:117] "RemoveContainer" containerID="8b6f78c6930bce920e2204b25adbb35a3cf9847f528533b19d7b02ffebf8b891" Apr 23 16:39:20.271751 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.271711 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b6f78c6930bce920e2204b25adbb35a3cf9847f528533b19d7b02ffebf8b891"} err="failed to get container status \"8b6f78c6930bce920e2204b25adbb35a3cf9847f528533b19d7b02ffebf8b891\": rpc error: code = NotFound desc = could not find container \"8b6f78c6930bce920e2204b25adbb35a3cf9847f528533b19d7b02ffebf8b891\": container with ID starting with 8b6f78c6930bce920e2204b25adbb35a3cf9847f528533b19d7b02ffebf8b891 not found: ID does not exist" Apr 23 16:39:20.271751 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.271742 2569 scope.go:117] "RemoveContainer" containerID="25d4812ab1484ebbcc32f2c601f3a0248754354728f191a3e5103cff55d956bb" Apr 23 16:39:20.272086 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.272065 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25d4812ab1484ebbcc32f2c601f3a0248754354728f191a3e5103cff55d956bb"} err="failed to get container status \"25d4812ab1484ebbcc32f2c601f3a0248754354728f191a3e5103cff55d956bb\": rpc error: code = NotFound desc = could not find container \"25d4812ab1484ebbcc32f2c601f3a0248754354728f191a3e5103cff55d956bb\": container with ID starting with 25d4812ab1484ebbcc32f2c601f3a0248754354728f191a3e5103cff55d956bb not found: ID does not exist" Apr 23 16:39:20.272086 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.272085 2569 scope.go:117] 
"RemoveContainer" containerID="93bfa28b8e0fa7c089f896274532039b6eb07a597d8b251c368f70aaadc161f2" Apr 23 16:39:20.272314 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.272292 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93bfa28b8e0fa7c089f896274532039b6eb07a597d8b251c368f70aaadc161f2"} err="failed to get container status \"93bfa28b8e0fa7c089f896274532039b6eb07a597d8b251c368f70aaadc161f2\": rpc error: code = NotFound desc = could not find container \"93bfa28b8e0fa7c089f896274532039b6eb07a597d8b251c368f70aaadc161f2\": container with ID starting with 93bfa28b8e0fa7c089f896274532039b6eb07a597d8b251c368f70aaadc161f2 not found: ID does not exist" Apr 23 16:39:20.272369 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.272316 2569 scope.go:117] "RemoveContainer" containerID="cccf5db571ae535719e3c5b597ee0ec050c1fe97ec686e3e3c77319f8747430c" Apr 23 16:39:20.272515 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.272497 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cccf5db571ae535719e3c5b597ee0ec050c1fe97ec686e3e3c77319f8747430c"} err="failed to get container status \"cccf5db571ae535719e3c5b597ee0ec050c1fe97ec686e3e3c77319f8747430c\": rpc error: code = NotFound desc = could not find container \"cccf5db571ae535719e3c5b597ee0ec050c1fe97ec686e3e3c77319f8747430c\": container with ID starting with cccf5db571ae535719e3c5b597ee0ec050c1fe97ec686e3e3c77319f8747430c not found: ID does not exist" Apr 23 16:39:20.272562 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.272516 2569 scope.go:117] "RemoveContainer" containerID="185faac1df253a6e5a74de092e85abfde931063b4b56507d27764ec425a2124b" Apr 23 16:39:20.272725 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.272708 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"185faac1df253a6e5a74de092e85abfde931063b4b56507d27764ec425a2124b"} err="failed to get container status \"185faac1df253a6e5a74de092e85abfde931063b4b56507d27764ec425a2124b\": rpc error: code = NotFound desc = could not find container \"185faac1df253a6e5a74de092e85abfde931063b4b56507d27764ec425a2124b\": container with ID starting with 185faac1df253a6e5a74de092e85abfde931063b4b56507d27764ec425a2124b not found: ID does not exist" Apr 23 16:39:20.272777 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.272724 2569 scope.go:117] "RemoveContainer" containerID="4c29b67348cbb86fa1d969236980dadbdbd1b00fb44ecfcb6ac6ed1533fcb73b" Apr 23 16:39:20.272868 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.272854 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c29b67348cbb86fa1d969236980dadbdbd1b00fb44ecfcb6ac6ed1533fcb73b"} err="failed to get container status \"4c29b67348cbb86fa1d969236980dadbdbd1b00fb44ecfcb6ac6ed1533fcb73b\": rpc error: code = NotFound desc = could not find container \"4c29b67348cbb86fa1d969236980dadbdbd1b00fb44ecfcb6ac6ed1533fcb73b\": container with ID starting with 4c29b67348cbb86fa1d969236980dadbdbd1b00fb44ecfcb6ac6ed1533fcb73b not found: ID does not exist" Apr 23 16:39:20.272913 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.272868 2569 scope.go:117] "RemoveContainer" containerID="8ad48c584b2e15f7b64e57c87b4665a3807f021c680932e118861919623029e3" Apr 23 16:39:20.273073 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.273058 2569 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8ad48c584b2e15f7b64e57c87b4665a3807f021c680932e118861919623029e3"} err="failed to get container status \"8ad48c584b2e15f7b64e57c87b4665a3807f021c680932e118861919623029e3\": rpc error: code = NotFound desc = could not find container \"8ad48c584b2e15f7b64e57c87b4665a3807f021c680932e118861919623029e3\": container with ID starting with 8ad48c584b2e15f7b64e57c87b4665a3807f021c680932e118861919623029e3 not found: ID does not exist" Apr 23 16:39:20.273117 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.273073 2569 scope.go:117] "RemoveContainer" containerID="8b6f78c6930bce920e2204b25adbb35a3cf9847f528533b19d7b02ffebf8b891" Apr 23 16:39:20.273281 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.273262 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b6f78c6930bce920e2204b25adbb35a3cf9847f528533b19d7b02ffebf8b891"} err="failed to get container status \"8b6f78c6930bce920e2204b25adbb35a3cf9847f528533b19d7b02ffebf8b891\": rpc error: code = NotFound desc = could not find container \"8b6f78c6930bce920e2204b25adbb35a3cf9847f528533b19d7b02ffebf8b891\": container with ID starting with 8b6f78c6930bce920e2204b25adbb35a3cf9847f528533b19d7b02ffebf8b891 not found: ID does not exist" Apr 23 16:39:20.273322 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.273282 2569 scope.go:117] "RemoveContainer" containerID="25d4812ab1484ebbcc32f2c601f3a0248754354728f191a3e5103cff55d956bb" Apr 23 16:39:20.273467 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.273444 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25d4812ab1484ebbcc32f2c601f3a0248754354728f191a3e5103cff55d956bb"} err="failed to get container status \"25d4812ab1484ebbcc32f2c601f3a0248754354728f191a3e5103cff55d956bb\": rpc error: code = NotFound desc = could not find container \"25d4812ab1484ebbcc32f2c601f3a0248754354728f191a3e5103cff55d956bb\": container with ID starting with 25d4812ab1484ebbcc32f2c601f3a0248754354728f191a3e5103cff55d956bb not found: ID does not exist" Apr 23 16:39:20.273510 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.273468 2569 scope.go:117] "RemoveContainer" containerID="93bfa28b8e0fa7c089f896274532039b6eb07a597d8b251c368f70aaadc161f2" Apr 23 16:39:20.273704 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.273685 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93bfa28b8e0fa7c089f896274532039b6eb07a597d8b251c368f70aaadc161f2"} err="failed to get container status \"93bfa28b8e0fa7c089f896274532039b6eb07a597d8b251c368f70aaadc161f2\": rpc error: code = NotFound desc = could not find container \"93bfa28b8e0fa7c089f896274532039b6eb07a597d8b251c368f70aaadc161f2\": container with ID starting with 93bfa28b8e0fa7c089f896274532039b6eb07a597d8b251c368f70aaadc161f2 not found: ID does not exist" Apr 23 16:39:20.273773 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.273704 2569 scope.go:117] "RemoveContainer" containerID="cccf5db571ae535719e3c5b597ee0ec050c1fe97ec686e3e3c77319f8747430c" Apr 23 16:39:20.273915 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.273900 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cccf5db571ae535719e3c5b597ee0ec050c1fe97ec686e3e3c77319f8747430c"} err="failed to get container status \"cccf5db571ae535719e3c5b597ee0ec050c1fe97ec686e3e3c77319f8747430c\": rpc error: code = NotFound desc = could not find container 
\"cccf5db571ae535719e3c5b597ee0ec050c1fe97ec686e3e3c77319f8747430c\": container with ID starting with cccf5db571ae535719e3c5b597ee0ec050c1fe97ec686e3e3c77319f8747430c not found: ID does not exist" Apr 23 16:39:20.273965 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.273915 2569 scope.go:117] "RemoveContainer" containerID="185faac1df253a6e5a74de092e85abfde931063b4b56507d27764ec425a2124b" Apr 23 16:39:20.274110 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.274085 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"185faac1df253a6e5a74de092e85abfde931063b4b56507d27764ec425a2124b"} err="failed to get container status \"185faac1df253a6e5a74de092e85abfde931063b4b56507d27764ec425a2124b\": rpc error: code = NotFound desc = could not find container \"185faac1df253a6e5a74de092e85abfde931063b4b56507d27764ec425a2124b\": container with ID starting with 185faac1df253a6e5a74de092e85abfde931063b4b56507d27764ec425a2124b not found: ID does not exist" Apr 23 16:39:20.281708 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.281688 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 16:39:20.282020 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.282007 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad39d609-b577-48de-b485-a28556d6f1a1" containerName="kube-rbac-proxy" Apr 23 16:39:20.282068 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.282022 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad39d609-b577-48de-b485-a28556d6f1a1" containerName="kube-rbac-proxy" Apr 23 16:39:20.282068 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.282032 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad39d609-b577-48de-b485-a28556d6f1a1" containerName="kube-rbac-proxy-web" Apr 23 16:39:20.282068 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.282037 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad39d609-b577-48de-b485-a28556d6f1a1" containerName="kube-rbac-proxy-web" Apr 23 16:39:20.282068 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.282047 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5ea87ff-a22d-4952-8409-ae584643f659" containerName="console" Apr 23 16:39:20.282068 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.282052 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5ea87ff-a22d-4952-8409-ae584643f659" containerName="console" Apr 23 16:39:20.282068 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.282060 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad39d609-b577-48de-b485-a28556d6f1a1" containerName="kube-rbac-proxy-thanos" Apr 23 16:39:20.282068 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.282068 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad39d609-b577-48de-b485-a28556d6f1a1" containerName="kube-rbac-proxy-thanos" Apr 23 16:39:20.282273 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.282081 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad39d609-b577-48de-b485-a28556d6f1a1" containerName="prometheus" Apr 23 16:39:20.282273 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.282086 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad39d609-b577-48de-b485-a28556d6f1a1" containerName="prometheus" Apr 23 16:39:20.282273 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.282093 2569 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="ad39d609-b577-48de-b485-a28556d6f1a1" containerName="config-reloader" Apr 23 16:39:20.282273 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.282098 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad39d609-b577-48de-b485-a28556d6f1a1" containerName="config-reloader" Apr 23 16:39:20.282273 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.282107 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad39d609-b577-48de-b485-a28556d6f1a1" containerName="init-config-reloader" Apr 23 16:39:20.282273 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.282121 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad39d609-b577-48de-b485-a28556d6f1a1" containerName="init-config-reloader" Apr 23 16:39:20.282273 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.282133 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cddb0dbb-053c-4682-bdea-dbe11df4a62e" containerName="console" Apr 23 16:39:20.282273 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.282141 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="cddb0dbb-053c-4682-bdea-dbe11df4a62e" containerName="console" Apr 23 16:39:20.282273 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.282150 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad39d609-b577-48de-b485-a28556d6f1a1" containerName="thanos-sidecar" Apr 23 16:39:20.282273 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.282155 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad39d609-b577-48de-b485-a28556d6f1a1" containerName="thanos-sidecar" Apr 23 16:39:20.282273 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.282198 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad39d609-b577-48de-b485-a28556d6f1a1" containerName="thanos-sidecar" Apr 23 16:39:20.282273 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.282207 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad39d609-b577-48de-b485-a28556d6f1a1" containerName="kube-rbac-proxy-web" Apr 23 16:39:20.282273 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.282216 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad39d609-b577-48de-b485-a28556d6f1a1" containerName="prometheus" Apr 23 16:39:20.282273 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.282225 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad39d609-b577-48de-b485-a28556d6f1a1" containerName="kube-rbac-proxy-thanos" Apr 23 16:39:20.282273 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.282233 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad39d609-b577-48de-b485-a28556d6f1a1" containerName="kube-rbac-proxy" Apr 23 16:39:20.282273 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.282240 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad39d609-b577-48de-b485-a28556d6f1a1" containerName="config-reloader" Apr 23 16:39:20.282273 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.282246 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="c5ea87ff-a22d-4952-8409-ae584643f659" containerName="console" Apr 23 16:39:20.282273 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.282252 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="cddb0dbb-053c-4682-bdea-dbe11df4a62e" containerName="console" Apr 23 16:39:20.287278 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.287260 2569 util.go:30] "No sandbox for pod can be 
Apr 23 16:39:20.289827 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.289810 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 23 16:39:20.289827 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.289821 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 23 16:39:20.289974 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.289832 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-b4cl2\""
Apr 23 16:39:20.290114 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.290098 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 23 16:39:20.290185 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.290110 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 23 16:39:20.290237 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.290211 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 23 16:39:20.290437 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.290385 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 23 16:39:20.290542 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.290526 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 23 16:39:20.290783 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.290742 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 23 16:39:20.290890 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.290814 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 23 16:39:20.291070 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.291051 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-d1jk41q0nm03v\""
Apr 23 16:39:20.291252 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.291235 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 23 16:39:20.291252 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.291243 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 23 16:39:20.294958 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.294890 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 23 16:39:20.296129 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.296108 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 23 16:39:20.301041 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.301021 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 16:39:20.321852 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.321822 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/17c4d885-af9d-4f0a-bf16-4ec2e083444a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.321963 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.321865 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/17c4d885-af9d-4f0a-bf16-4ec2e083444a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.321963 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.321891 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17c4d885-af9d-4f0a-bf16-4ec2e083444a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.322071 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.321977 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/17c4d885-af9d-4f0a-bf16-4ec2e083444a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.322071 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.322002 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/17c4d885-af9d-4f0a-bf16-4ec2e083444a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.322071 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.322029 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/17c4d885-af9d-4f0a-bf16-4ec2e083444a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.322071 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.322059 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/17c4d885-af9d-4f0a-bf16-4ec2e083444a-web-config\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.322256 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.322092 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17c4d885-af9d-4f0a-bf16-4ec2e083444a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.322256 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.322117 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17c4d885-af9d-4f0a-bf16-4ec2e083444a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.322256 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.322150 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/17c4d885-af9d-4f0a-bf16-4ec2e083444a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.322256 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.322200 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/17c4d885-af9d-4f0a-bf16-4ec2e083444a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.322256 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.322229 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/17c4d885-af9d-4f0a-bf16-4ec2e083444a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.322490 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.322278 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/17c4d885-af9d-4f0a-bf16-4ec2e083444a-config\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.322490 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.322299 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/17c4d885-af9d-4f0a-bf16-4ec2e083444a-config-out\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.322490 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.322330 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/17c4d885-af9d-4f0a-bf16-4ec2e083444a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.322490 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.322354 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/17c4d885-af9d-4f0a-bf16-4ec2e083444a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.322490 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.322389 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/17c4d885-af9d-4f0a-bf16-4ec2e083444a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.322490 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.322425 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfvhp\" (UniqueName: \"kubernetes.io/projected/17c4d885-af9d-4f0a-bf16-4ec2e083444a-kube-api-access-hfvhp\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.423646 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.423613 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/17c4d885-af9d-4f0a-bf16-4ec2e083444a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.423646 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.423650 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/17c4d885-af9d-4f0a-bf16-4ec2e083444a-web-config\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.423900 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.423693 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17c4d885-af9d-4f0a-bf16-4ec2e083444a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.423900 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.423711 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17c4d885-af9d-4f0a-bf16-4ec2e083444a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.423900 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.423739 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/17c4d885-af9d-4f0a-bf16-4ec2e083444a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.423900 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.423768 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/17c4d885-af9d-4f0a-bf16-4ec2e083444a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.423900 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.423793 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/17c4d885-af9d-4f0a-bf16-4ec2e083444a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.423900 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.423821 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/17c4d885-af9d-4f0a-bf16-4ec2e083444a-config\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.423900 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.423845 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/17c4d885-af9d-4f0a-bf16-4ec2e083444a-config-out\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.423900 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.423870 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/17c4d885-af9d-4f0a-bf16-4ec2e083444a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.423900 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.423897 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/17c4d885-af9d-4f0a-bf16-4ec2e083444a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.424224 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.423932 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/17c4d885-af9d-4f0a-bf16-4ec2e083444a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.424224 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.423991 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hfvhp\" (UniqueName: \"kubernetes.io/projected/17c4d885-af9d-4f0a-bf16-4ec2e083444a-kube-api-access-hfvhp\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.424224 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.424084 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/17c4d885-af9d-4f0a-bf16-4ec2e083444a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.424224 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.424142 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/17c4d885-af9d-4f0a-bf16-4ec2e083444a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.424224 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.424169 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17c4d885-af9d-4f0a-bf16-4ec2e083444a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.424430 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.424205 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/17c4d885-af9d-4f0a-bf16-4ec2e083444a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.424430 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.424231 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/17c4d885-af9d-4f0a-bf16-4ec2e083444a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.424811 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.424685 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17c4d885-af9d-4f0a-bf16-4ec2e083444a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.424936 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.424874 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17c4d885-af9d-4f0a-bf16-4ec2e083444a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.425804 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.425476 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/17c4d885-af9d-4f0a-bf16-4ec2e083444a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.426817 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.426792 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/17c4d885-af9d-4f0a-bf16-4ec2e083444a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.426923 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.426812 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/17c4d885-af9d-4f0a-bf16-4ec2e083444a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.426991 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.426957 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/17c4d885-af9d-4f0a-bf16-4ec2e083444a-config\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.427614 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.427588 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/17c4d885-af9d-4f0a-bf16-4ec2e083444a-web-config\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.427719 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.427625 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/17c4d885-af9d-4f0a-bf16-4ec2e083444a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.427778 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.427731 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/17c4d885-af9d-4f0a-bf16-4ec2e083444a-config-out\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.428000 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.427972 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17c4d885-af9d-4f0a-bf16-4ec2e083444a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.428232 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.428208 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/17c4d885-af9d-4f0a-bf16-4ec2e083444a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.428309 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.428269 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/17c4d885-af9d-4f0a-bf16-4ec2e083444a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.428987 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.428959 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/17c4d885-af9d-4f0a-bf16-4ec2e083444a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.429601 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.429584 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/17c4d885-af9d-4f0a-bf16-4ec2e083444a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.429700 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.429684 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/17c4d885-af9d-4f0a-bf16-4ec2e083444a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.429880 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.429858 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/17c4d885-af9d-4f0a-bf16-4ec2e083444a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.430143 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.430128 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/17c4d885-af9d-4f0a-bf16-4ec2e083444a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.435109 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.435091 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfvhp\" (UniqueName: \"kubernetes.io/projected/17c4d885-af9d-4f0a-bf16-4ec2e083444a-kube-api-access-hfvhp\") pod \"prometheus-k8s-0\" (UID: \"17c4d885-af9d-4f0a-bf16-4ec2e083444a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.598311 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.598279 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:39:20.719509 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:20.719485 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 16:39:20.721580 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:39:20.721552 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17c4d885_af9d_4f0a_bf16_4ec2e083444a.slice/crio-a4f4982bd8c43d3d8363e1ba760232282ec61e9d03609d50f0c16c0a7751fc2a WatchSource:0}: Error finding container a4f4982bd8c43d3d8363e1ba760232282ec61e9d03609d50f0c16c0a7751fc2a: Status 404 returned error can't find the container with id a4f4982bd8c43d3d8363e1ba760232282ec61e9d03609d50f0c16c0a7751fc2a
Apr 23 16:39:21.223919 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:21.223889 2569 generic.go:358] "Generic (PLEG): container finished" podID="17c4d885-af9d-4f0a-bf16-4ec2e083444a" containerID="e08e6fd5fc52feecc4502c3e07720fbc1041f5194ce09fc9db4de8dc588db513" exitCode=0
Apr 23 16:39:21.224309 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:21.223972 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"17c4d885-af9d-4f0a-bf16-4ec2e083444a","Type":"ContainerDied","Data":"e08e6fd5fc52feecc4502c3e07720fbc1041f5194ce09fc9db4de8dc588db513"}
Apr 23 16:39:21.224309 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:21.223993 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"17c4d885-af9d-4f0a-bf16-4ec2e083444a","Type":"ContainerStarted","Data":"a4f4982bd8c43d3d8363e1ba760232282ec61e9d03609d50f0c16c0a7751fc2a"}
Apr 23 16:39:21.308752 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:21.308718 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad39d609-b577-48de-b485-a28556d6f1a1" path="/var/lib/kubelet/pods/ad39d609-b577-48de-b485-a28556d6f1a1/volumes"
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"17c4d885-af9d-4f0a-bf16-4ec2e083444a","Type":"ContainerStarted","Data":"7e3864a8c1fb11325767344b7ee391788dfb5027694da2152d9c80c8c706f09d"} Apr 23 16:39:22.230428 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:22.230430 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"17c4d885-af9d-4f0a-bf16-4ec2e083444a","Type":"ContainerStarted","Data":"53a6f32945e09b3b67fbadcee2126cea0298137a17baa3a1b67b25a827c194b2"} Apr 23 16:39:22.230876 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:22.230440 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"17c4d885-af9d-4f0a-bf16-4ec2e083444a","Type":"ContainerStarted","Data":"43761a823c05f49072ba33eb3313bf9411db8a5da9a45db68f85cf8812ab67a6"} Apr 23 16:39:22.230876 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:22.230449 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"17c4d885-af9d-4f0a-bf16-4ec2e083444a","Type":"ContainerStarted","Data":"14003dacd7459f4433f659f44ef0db21ffdc84737f19098b2ef30cae5f1b68ba"} Apr 23 16:39:22.230876 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:22.230457 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"17c4d885-af9d-4f0a-bf16-4ec2e083444a","Type":"ContainerStarted","Data":"c084b14e6a75e0ccf967b2bab8c92fb30199a4ea842c1b1084c8d08d66da7fa3"} Apr 23 16:39:22.230876 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:22.230465 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"17c4d885-af9d-4f0a-bf16-4ec2e083444a","Type":"ContainerStarted","Data":"393c26d833dbd6cd01bff399ec122731b3fb80eb895464d7538ff3ab0acdaf9c"} Apr 23 16:39:22.259631 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:22.259576 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.259560883 podStartE2EDuration="2.259560883s" podCreationTimestamp="2026-04-23 16:39:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:39:22.2576032 +0000 UTC m=+247.594866167" watchObservedRunningTime="2026-04-23 16:39:22.259560883 +0000 UTC m=+247.596823866" Apr 23 16:39:25.599110 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:25.599022 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:26.070995 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:26.070877 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1eabe990-f610-4e94-8a89-7cff1c9a6a23-metrics-certs\") pod \"network-metrics-daemon-k7n97\" (UID: \"1eabe990-f610-4e94-8a89-7cff1c9a6a23\") " pod="openshift-multus/network-metrics-daemon-k7n97" Apr 23 16:39:26.073159 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:26.073137 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1eabe990-f610-4e94-8a89-7cff1c9a6a23-metrics-certs\") pod \"network-metrics-daemon-k7n97\" (UID: \"1eabe990-f610-4e94-8a89-7cff1c9a6a23\") " pod="openshift-multus/network-metrics-daemon-k7n97" Apr 23 16:39:26.203210 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:26.203178 2569 
Apr 23 16:39:26.203210 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:26.203178 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-f89h4\""
Apr 23 16:39:26.211567 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:26.211547 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7n97"
Apr 23 16:39:26.535476 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:26.535445 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-k7n97"]
Apr 23 16:39:26.538817 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:39:26.538787 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1eabe990_f610_4e94_8a89_7cff1c9a6a23.slice/crio-49db8b896faa3414581c1f2ca72310835301b8ad4edfd1b6cd740ac78e26dfd3 WatchSource:0}: Error finding container 49db8b896faa3414581c1f2ca72310835301b8ad4edfd1b6cd740ac78e26dfd3: Status 404 returned error can't find the container with id 49db8b896faa3414581c1f2ca72310835301b8ad4edfd1b6cd740ac78e26dfd3
Apr 23 16:39:27.251990 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:27.251946 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-k7n97" event={"ID":"1eabe990-f610-4e94-8a89-7cff1c9a6a23","Type":"ContainerStarted","Data":"49db8b896faa3414581c1f2ca72310835301b8ad4edfd1b6cd740ac78e26dfd3"}
Apr 23 16:39:28.256782 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:28.256739 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-k7n97" event={"ID":"1eabe990-f610-4e94-8a89-7cff1c9a6a23","Type":"ContainerStarted","Data":"e7e053e047c736f912ac844e44dabb8b31fe67c83d78dd8d632524e3d9eedd90"}
Apr 23 16:39:28.256782 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:28.256781 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-k7n97" event={"ID":"1eabe990-f610-4e94-8a89-7cff1c9a6a23","Type":"ContainerStarted","Data":"e6e423109191be8b61b3ef41b5583f9aa0a9ad00224b7bcde16cdd4440763cae"}
Apr 23 16:39:28.274495 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:39:28.274448 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-k7n97" podStartSLOduration=252.228626228 podStartE2EDuration="4m13.274433025s" podCreationTimestamp="2026-04-23 16:35:15 +0000 UTC" firstStartedPulling="2026-04-23 16:39:26.540820492 +0000 UTC m=+251.878083434" lastFinishedPulling="2026-04-23 16:39:27.586627284 +0000 UTC m=+252.923890231" observedRunningTime="2026-04-23 16:39:28.27282909 +0000 UTC m=+253.610092054" watchObservedRunningTime="2026-04-23 16:39:28.274433025 +0000 UTC m=+253.611695989"
Apr 23 16:40:06.229585 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:40:06.229549 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-m2shj"]
Apr 23 16:40:06.232921 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:40:06.232902 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m2shj"
Apr 23 16:40:06.235148 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:40:06.235130 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 23 16:40:06.242199 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:40:06.242179 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-m2shj"]
Apr 23 16:40:06.304146 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:40:06.304109 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/79449e8d-d089-44a6-a94a-313bc1e4228d-dbus\") pod \"global-pull-secret-syncer-m2shj\" (UID: \"79449e8d-d089-44a6-a94a-313bc1e4228d\") " pod="kube-system/global-pull-secret-syncer-m2shj"
Apr 23 16:40:06.304146 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:40:06.304147 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/79449e8d-d089-44a6-a94a-313bc1e4228d-kubelet-config\") pod \"global-pull-secret-syncer-m2shj\" (UID: \"79449e8d-d089-44a6-a94a-313bc1e4228d\") " pod="kube-system/global-pull-secret-syncer-m2shj"
Apr 23 16:40:06.304352 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:40:06.304173 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/79449e8d-d089-44a6-a94a-313bc1e4228d-original-pull-secret\") pod \"global-pull-secret-syncer-m2shj\" (UID: \"79449e8d-d089-44a6-a94a-313bc1e4228d\") " pod="kube-system/global-pull-secret-syncer-m2shj"
Apr 23 16:40:06.404590 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:40:06.404556 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/79449e8d-d089-44a6-a94a-313bc1e4228d-dbus\") pod \"global-pull-secret-syncer-m2shj\" (UID: \"79449e8d-d089-44a6-a94a-313bc1e4228d\") " pod="kube-system/global-pull-secret-syncer-m2shj"
Apr 23 16:40:06.404963 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:40:06.404596 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/79449e8d-d089-44a6-a94a-313bc1e4228d-kubelet-config\") pod \"global-pull-secret-syncer-m2shj\" (UID: \"79449e8d-d089-44a6-a94a-313bc1e4228d\") " pod="kube-system/global-pull-secret-syncer-m2shj"
Apr 23 16:40:06.404963 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:40:06.404623 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/79449e8d-d089-44a6-a94a-313bc1e4228d-original-pull-secret\") pod \"global-pull-secret-syncer-m2shj\" (UID: \"79449e8d-d089-44a6-a94a-313bc1e4228d\") " pod="kube-system/global-pull-secret-syncer-m2shj"
Apr 23 16:40:06.404963 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:40:06.404712 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/79449e8d-d089-44a6-a94a-313bc1e4228d-kubelet-config\") pod \"global-pull-secret-syncer-m2shj\" (UID: \"79449e8d-d089-44a6-a94a-313bc1e4228d\") " pod="kube-system/global-pull-secret-syncer-m2shj"
Apr 23 16:40:06.404963 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:40:06.404773 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/79449e8d-d089-44a6-a94a-313bc1e4228d-dbus\") pod \"global-pull-secret-syncer-m2shj\" (UID: \"79449e8d-d089-44a6-a94a-313bc1e4228d\") " pod="kube-system/global-pull-secret-syncer-m2shj"
Apr 23 16:40:06.406922 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:40:06.406906 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/79449e8d-d089-44a6-a94a-313bc1e4228d-original-pull-secret\") pod \"global-pull-secret-syncer-m2shj\" (UID: \"79449e8d-d089-44a6-a94a-313bc1e4228d\") " pod="kube-system/global-pull-secret-syncer-m2shj"
Apr 23 16:40:06.541794 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:40:06.541710 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m2shj"
Apr 23 16:40:06.664732 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:40:06.664689 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-m2shj"]
Apr 23 16:40:06.667455 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:40:06.667427 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79449e8d_d089_44a6_a94a_313bc1e4228d.slice/crio-20b5879e1dce74ea991e713e44d0c1af7b42da4f9274780409109f08095420d4 WatchSource:0}: Error finding container 20b5879e1dce74ea991e713e44d0c1af7b42da4f9274780409109f08095420d4: Status 404 returned error can't find the container with id 20b5879e1dce74ea991e713e44d0c1af7b42da4f9274780409109f08095420d4
Apr 23 16:40:07.371653 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:40:07.371612 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-m2shj" event={"ID":"79449e8d-d089-44a6-a94a-313bc1e4228d","Type":"ContainerStarted","Data":"20b5879e1dce74ea991e713e44d0c1af7b42da4f9274780409109f08095420d4"}
Apr 23 16:40:11.385369 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:40:11.385332 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-m2shj" event={"ID":"79449e8d-d089-44a6-a94a-313bc1e4228d","Type":"ContainerStarted","Data":"c3a53ae6e65a2d62eabe71a7e918da20b9a2f44314ea959ce281114bfb011c7a"}
Apr 23 16:40:11.403718 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:40:11.403602 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-m2shj" podStartSLOduration=0.882740646 podStartE2EDuration="5.403581961s" podCreationTimestamp="2026-04-23 16:40:06 +0000 UTC" firstStartedPulling="2026-04-23 16:40:06.669053715 +0000 UTC m=+292.006316658" lastFinishedPulling="2026-04-23 16:40:11.189895027 +0000 UTC m=+296.527157973" observedRunningTime="2026-04-23 16:40:11.402772325 +0000 UTC m=+296.740035289" watchObservedRunningTime="2026-04-23 16:40:11.403581961 +0000 UTC m=+296.740844926"
Apr 23 16:40:15.180645 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:40:15.180617 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gfsdv_3481652d-e5fb-498e-84c3-e2c629340367/console-operator/1.log"
Apr 23 16:40:15.183487 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:40:15.183461 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddr2h_a063c6b7-80d4-45d7-815d-88d94693a0b1/ovn-acl-logging/0.log"
"Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gfsdv_3481652d-e5fb-498e-84c3-e2c629340367/console-operator/1.log" Apr 23 16:40:15.187860 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:40:15.187839 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddr2h_a063c6b7-80d4-45d7-815d-88d94693a0b1/ovn-acl-logging/0.log" Apr 23 16:40:15.191629 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:40:15.191612 2569 kubelet.go:1628] "Image garbage collection succeeded" Apr 23 16:40:20.599478 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:40:20.599442 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:40:20.614921 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:40:20.614894 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:40:21.428425 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:40:21.428395 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:43:40.143010 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:43:40.142975 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-c2gw8"] Apr 23 16:43:40.146517 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:43:40.146496 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-c2gw8" Apr 23 16:43:40.148524 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:43:40.148504 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 23 16:43:40.148653 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:43:40.148576 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 23 16:43:40.148653 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:43:40.148638 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-xrv5h\"" Apr 23 16:43:40.148913 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:43:40.148897 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 23 16:43:40.152960 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:43:40.152936 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-c2gw8"] Apr 23 16:43:40.204384 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:43:40.204346 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwpm4\" (UniqueName: \"kubernetes.io/projected/9953e210-80fc-47e3-9bde-7f00b37b8f53-kube-api-access-zwpm4\") pod \"s3-init-c2gw8\" (UID: \"9953e210-80fc-47e3-9bde-7f00b37b8f53\") " pod="kserve/s3-init-c2gw8" Apr 23 16:43:40.305205 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:43:40.305173 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zwpm4\" (UniqueName: \"kubernetes.io/projected/9953e210-80fc-47e3-9bde-7f00b37b8f53-kube-api-access-zwpm4\") pod \"s3-init-c2gw8\" (UID: \"9953e210-80fc-47e3-9bde-7f00b37b8f53\") " pod="kserve/s3-init-c2gw8" Apr 23 16:43:40.313197 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:43:40.313171 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwpm4\" (UniqueName: \"kubernetes.io/projected/9953e210-80fc-47e3-9bde-7f00b37b8f53-kube-api-access-zwpm4\") 
pod \"s3-init-c2gw8\" (UID: \"9953e210-80fc-47e3-9bde-7f00b37b8f53\") " pod="kserve/s3-init-c2gw8" Apr 23 16:43:40.469656 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:43:40.469556 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-c2gw8" Apr 23 16:43:40.591652 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:43:40.590996 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-c2gw8"] Apr 23 16:43:40.594023 ip-10-0-128-102 kubenswrapper[2569]: W0423 16:43:40.593985 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9953e210_80fc_47e3_9bde_7f00b37b8f53.slice/crio-7d831df9ae680fd7ca23ae4f77f2a24f75d979dfa8175a44e5e2b63892cf0090 WatchSource:0}: Error finding container 7d831df9ae680fd7ca23ae4f77f2a24f75d979dfa8175a44e5e2b63892cf0090: Status 404 returned error can't find the container with id 7d831df9ae680fd7ca23ae4f77f2a24f75d979dfa8175a44e5e2b63892cf0090 Apr 23 16:43:40.596318 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:43:40.596300 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 16:43:41.002928 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:43:41.002882 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-c2gw8" event={"ID":"9953e210-80fc-47e3-9bde-7f00b37b8f53","Type":"ContainerStarted","Data":"7d831df9ae680fd7ca23ae4f77f2a24f75d979dfa8175a44e5e2b63892cf0090"} Apr 23 16:43:46.020998 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:43:46.020956 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-c2gw8" event={"ID":"9953e210-80fc-47e3-9bde-7f00b37b8f53","Type":"ContainerStarted","Data":"3034fa039dfcd1eb1ec564bec3f1e2234e9e0995b18df29fe7530a0546164f9f"} Apr 23 16:43:49.031614 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:43:49.031580 2569 generic.go:358] "Generic (PLEG): container finished" podID="9953e210-80fc-47e3-9bde-7f00b37b8f53" containerID="3034fa039dfcd1eb1ec564bec3f1e2234e9e0995b18df29fe7530a0546164f9f" exitCode=0 Apr 23 16:43:49.031994 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:43:49.031655 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-c2gw8" event={"ID":"9953e210-80fc-47e3-9bde-7f00b37b8f53","Type":"ContainerDied","Data":"3034fa039dfcd1eb1ec564bec3f1e2234e9e0995b18df29fe7530a0546164f9f"} Apr 23 16:43:50.155532 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:43:50.155510 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-c2gw8" Apr 23 16:43:50.197777 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:43:50.197752 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwpm4\" (UniqueName: \"kubernetes.io/projected/9953e210-80fc-47e3-9bde-7f00b37b8f53-kube-api-access-zwpm4\") pod \"9953e210-80fc-47e3-9bde-7f00b37b8f53\" (UID: \"9953e210-80fc-47e3-9bde-7f00b37b8f53\") " Apr 23 16:43:50.199853 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:43:50.199830 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9953e210-80fc-47e3-9bde-7f00b37b8f53-kube-api-access-zwpm4" (OuterVolumeSpecName: "kube-api-access-zwpm4") pod "9953e210-80fc-47e3-9bde-7f00b37b8f53" (UID: "9953e210-80fc-47e3-9bde-7f00b37b8f53"). InnerVolumeSpecName "kube-api-access-zwpm4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:43:50.298970 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:43:50.298876 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zwpm4\" (UniqueName: \"kubernetes.io/projected/9953e210-80fc-47e3-9bde-7f00b37b8f53-kube-api-access-zwpm4\") on node \"ip-10-0-128-102.ec2.internal\" DevicePath \"\"" Apr 23 16:43:51.037900 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:43:51.037869 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-c2gw8" Apr 23 16:43:51.038106 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:43:51.037907 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-c2gw8" event={"ID":"9953e210-80fc-47e3-9bde-7f00b37b8f53","Type":"ContainerDied","Data":"7d831df9ae680fd7ca23ae4f77f2a24f75d979dfa8175a44e5e2b63892cf0090"} Apr 23 16:43:51.038106 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:43:51.037939 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d831df9ae680fd7ca23ae4f77f2a24f75d979dfa8175a44e5e2b63892cf0090" Apr 23 16:45:15.208981 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:45:15.208949 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gfsdv_3481652d-e5fb-498e-84c3-e2c629340367/console-operator/1.log" Apr 23 16:45:15.210901 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:45:15.210876 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gfsdv_3481652d-e5fb-498e-84c3-e2c629340367/console-operator/1.log" Apr 23 16:45:15.211693 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:45:15.211655 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddr2h_a063c6b7-80d4-45d7-815d-88d94693a0b1/ovn-acl-logging/0.log" Apr 23 16:45:15.213618 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:45:15.213598 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddr2h_a063c6b7-80d4-45d7-815d-88d94693a0b1/ovn-acl-logging/0.log" Apr 23 16:50:15.232689 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:50:15.232643 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gfsdv_3481652d-e5fb-498e-84c3-e2c629340367/console-operator/1.log" Apr 23 16:50:15.233221 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:50:15.232643 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gfsdv_3481652d-e5fb-498e-84c3-e2c629340367/console-operator/1.log" Apr 23 16:50:15.235688 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:50:15.235655 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddr2h_a063c6b7-80d4-45d7-815d-88d94693a0b1/ovn-acl-logging/0.log" Apr 23 16:50:15.235809 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:50:15.235654 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddr2h_a063c6b7-80d4-45d7-815d-88d94693a0b1/ovn-acl-logging/0.log" Apr 23 16:55:15.261025 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:55:15.260993 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gfsdv_3481652d-e5fb-498e-84c3-e2c629340367/console-operator/1.log" Apr 23 16:55:15.262600 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:55:15.262579 2569 
Apr 23 16:55:15.262600 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:55:15.262579 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gfsdv_3481652d-e5fb-498e-84c3-e2c629340367/console-operator/1.log"
Apr 23 16:55:15.263709 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:55:15.263692 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddr2h_a063c6b7-80d4-45d7-815d-88d94693a0b1/ovn-acl-logging/0.log"
Apr 23 16:55:15.265355 ip-10-0-128-102 kubenswrapper[2569]: I0423 16:55:15.265338 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddr2h_a063c6b7-80d4-45d7-815d-88d94693a0b1/ovn-acl-logging/0.log"
Apr 23 17:00:15.283549 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:00:15.283517 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gfsdv_3481652d-e5fb-498e-84c3-e2c629340367/console-operator/1.log"
Apr 23 17:00:15.286136 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:00:15.285107 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gfsdv_3481652d-e5fb-498e-84c3-e2c629340367/console-operator/1.log"
Apr 23 17:00:15.286291 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:00:15.286276 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddr2h_a063c6b7-80d4-45d7-815d-88d94693a0b1/ovn-acl-logging/0.log"
Apr 23 17:00:15.287729 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:00:15.287712 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddr2h_a063c6b7-80d4-45d7-815d-88d94693a0b1/ovn-acl-logging/0.log"
Apr 23 17:05:15.312023 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:05:15.311914 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gfsdv_3481652d-e5fb-498e-84c3-e2c629340367/console-operator/1.log"
Apr 23 17:05:15.317966 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:05:15.317442 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gfsdv_3481652d-e5fb-498e-84c3-e2c629340367/console-operator/1.log"
Apr 23 17:05:15.317966 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:05:15.317472 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddr2h_a063c6b7-80d4-45d7-815d-88d94693a0b1/ovn-acl-logging/0.log"
Apr 23 17:05:15.320513 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:05:15.320483 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddr2h_a063c6b7-80d4-45d7-815d-88d94693a0b1/ovn-acl-logging/0.log"
Apr 23 17:10:15.336311 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:10:15.336285 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gfsdv_3481652d-e5fb-498e-84c3-e2c629340367/console-operator/1.log"
Apr 23 17:10:15.339324 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:10:15.339305 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddr2h_a063c6b7-80d4-45d7-815d-88d94693a0b1/ovn-acl-logging/0.log"
Apr 23 17:10:15.339991 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:10:15.339972 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gfsdv_3481652d-e5fb-498e-84c3-e2c629340367/console-operator/1.log"
Apr 23 17:10:15.342744 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:10:15.342727 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddr2h_a063c6b7-80d4-45d7-815d-88d94693a0b1/ovn-acl-logging/0.log"
Apr 23 17:15:15.361502 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:15:15.361388 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gfsdv_3481652d-e5fb-498e-84c3-e2c629340367/console-operator/1.log"
Apr 23 17:15:15.365592 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:15:15.364317 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddr2h_a063c6b7-80d4-45d7-815d-88d94693a0b1/ovn-acl-logging/0.log"
Apr 23 17:15:15.365592 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:15:15.365499 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gfsdv_3481652d-e5fb-498e-84c3-e2c629340367/console-operator/1.log"
Apr 23 17:15:15.368315 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:15:15.368295 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddr2h_a063c6b7-80d4-45d7-815d-88d94693a0b1/ovn-acl-logging/0.log"
Apr 23 17:20:15.386650 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:20:15.386521 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gfsdv_3481652d-e5fb-498e-84c3-e2c629340367/console-operator/1.log"
Apr 23 17:20:15.390889 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:20:15.389200 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddr2h_a063c6b7-80d4-45d7-815d-88d94693a0b1/ovn-acl-logging/0.log"
Apr 23 17:20:15.391702 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:20:15.391680 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gfsdv_3481652d-e5fb-498e-84c3-e2c629340367/console-operator/1.log"
Apr 23 17:20:15.399849 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:20:15.399825 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddr2h_a063c6b7-80d4-45d7-815d-88d94693a0b1/ovn-acl-logging/0.log"
Apr 23 17:23:51.516059 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:23:51.516020 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tf72x/must-gather-dnxd8"]
Apr 23 17:23:51.516761 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:23:51.516529 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9953e210-80fc-47e3-9bde-7f00b37b8f53" containerName="s3-init"
Apr 23 17:23:51.516761 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:23:51.516548 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="9953e210-80fc-47e3-9bde-7f00b37b8f53" containerName="s3-init"
Apr 23 17:23:51.516761 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:23:51.516639 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="9953e210-80fc-47e3-9bde-7f00b37b8f53" containerName="s3-init"
Apr 23 17:23:51.519856 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:23:51.519835 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tf72x/must-gather-dnxd8"
Apr 23 17:23:51.521743 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:23:51.521716 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tf72x\"/\"kube-root-ca.crt\""
Apr 23 17:23:51.522212 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:23:51.522190 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tf72x\"/\"openshift-service-ca.crt\""
Apr 23 17:23:51.522313 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:23:51.522195 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-tf72x\"/\"default-dockercfg-bnmz4\""
Apr 23 17:23:51.532861 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:23:51.532834 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tf72x/must-gather-dnxd8"]
Apr 23 17:23:51.629770 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:23:51.629733 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7dmp\" (UniqueName: \"kubernetes.io/projected/b19fbb68-f0fa-4c09-94da-76960bbe359e-kube-api-access-h7dmp\") pod \"must-gather-dnxd8\" (UID: \"b19fbb68-f0fa-4c09-94da-76960bbe359e\") " pod="openshift-must-gather-tf72x/must-gather-dnxd8"
Apr 23 17:23:51.629770 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:23:51.629776 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b19fbb68-f0fa-4c09-94da-76960bbe359e-must-gather-output\") pod \"must-gather-dnxd8\" (UID: \"b19fbb68-f0fa-4c09-94da-76960bbe359e\") " pod="openshift-must-gather-tf72x/must-gather-dnxd8"
Apr 23 17:23:51.730898 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:23:51.730857 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h7dmp\" (UniqueName: \"kubernetes.io/projected/b19fbb68-f0fa-4c09-94da-76960bbe359e-kube-api-access-h7dmp\") pod \"must-gather-dnxd8\" (UID: \"b19fbb68-f0fa-4c09-94da-76960bbe359e\") " pod="openshift-must-gather-tf72x/must-gather-dnxd8"
Apr 23 17:23:51.730898 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:23:51.730898 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b19fbb68-f0fa-4c09-94da-76960bbe359e-must-gather-output\") pod \"must-gather-dnxd8\" (UID: \"b19fbb68-f0fa-4c09-94da-76960bbe359e\") " pod="openshift-must-gather-tf72x/must-gather-dnxd8"
Apr 23 17:23:51.731234 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:23:51.731206 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b19fbb68-f0fa-4c09-94da-76960bbe359e-must-gather-output\") pod \"must-gather-dnxd8\" (UID: \"b19fbb68-f0fa-4c09-94da-76960bbe359e\") " pod="openshift-must-gather-tf72x/must-gather-dnxd8"
Apr 23 17:23:51.738851 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:23:51.738823 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7dmp\" (UniqueName: \"kubernetes.io/projected/b19fbb68-f0fa-4c09-94da-76960bbe359e-kube-api-access-h7dmp\") pod \"must-gather-dnxd8\" (UID: \"b19fbb68-f0fa-4c09-94da-76960bbe359e\") " pod="openshift-must-gather-tf72x/must-gather-dnxd8"
Apr 23 17:23:51.845245 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:23:51.845196 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tf72x/must-gather-dnxd8"
Apr 23 17:23:51.969305 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:23:51.969274 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tf72x/must-gather-dnxd8"]
Apr 23 17:23:51.969652 ip-10-0-128-102 kubenswrapper[2569]: W0423 17:23:51.969630 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb19fbb68_f0fa_4c09_94da_76960bbe359e.slice/crio-26ad9b363609ebeb5ad2443d452b1b560790c7580f82e178846190a548a67ab1 WatchSource:0}: Error finding container 26ad9b363609ebeb5ad2443d452b1b560790c7580f82e178846190a548a67ab1: Status 404 returned error can't find the container with id 26ad9b363609ebeb5ad2443d452b1b560790c7580f82e178846190a548a67ab1
Apr 23 17:23:51.971335 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:23:51.971319 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 17:23:52.937641 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:23:52.937573 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tf72x/must-gather-dnxd8" event={"ID":"b19fbb68-f0fa-4c09-94da-76960bbe359e","Type":"ContainerStarted","Data":"26ad9b363609ebeb5ad2443d452b1b560790c7580f82e178846190a548a67ab1"}
Apr 23 17:23:56.956844 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:23:56.956803 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tf72x/must-gather-dnxd8" event={"ID":"b19fbb68-f0fa-4c09-94da-76960bbe359e","Type":"ContainerStarted","Data":"81e6945feecc2269f56fb96135b941d1fae4cd261661216a84586a3dc73fea4c"}
Apr 23 17:23:56.956844 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:23:56.956845 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tf72x/must-gather-dnxd8" event={"ID":"b19fbb68-f0fa-4c09-94da-76960bbe359e","Type":"ContainerStarted","Data":"04655e7648abe7c52b1e7971ffdc07baa15a49924a79714e2e11ad4f80d18219"}
Apr 23 17:23:56.973430 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:23:56.973373 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tf72x/must-gather-dnxd8" podStartSLOduration=1.492158539 podStartE2EDuration="5.97335704s" podCreationTimestamp="2026-04-23 17:23:51 +0000 UTC" firstStartedPulling="2026-04-23 17:23:51.97145093 +0000 UTC m=+2917.308713872" lastFinishedPulling="2026-04-23 17:23:56.452649418 +0000 UTC m=+2921.789912373" observedRunningTime="2026-04-23 17:23:56.971624013 +0000 UTC m=+2922.308886982" watchObservedRunningTime="2026-04-23 17:23:56.97335704 +0000 UTC m=+2922.310620004"
Apr 23 17:24:15.016558 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:15.016517 2569 generic.go:358] "Generic (PLEG): container finished" podID="b19fbb68-f0fa-4c09-94da-76960bbe359e" containerID="04655e7648abe7c52b1e7971ffdc07baa15a49924a79714e2e11ad4f80d18219" exitCode=0
Apr 23 17:24:15.017061 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:15.016591 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tf72x/must-gather-dnxd8" event={"ID":"b19fbb68-f0fa-4c09-94da-76960bbe359e","Type":"ContainerDied","Data":"04655e7648abe7c52b1e7971ffdc07baa15a49924a79714e2e11ad4f80d18219"}
Apr 23 17:24:15.017061 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:15.016948 2569 scope.go:117] "RemoveContainer" containerID="04655e7648abe7c52b1e7971ffdc07baa15a49924a79714e2e11ad4f80d18219"
Apr 23 17:24:15.594900 ip-10-0-128-102
kubenswrapper[2569]: I0423 17:24:15.594871 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-tf72x_must-gather-dnxd8_b19fbb68-f0fa-4c09-94da-76960bbe359e/gather/0.log" Apr 23 17:24:18.855113 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:18.855082 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-m2shj_79449e8d-d089-44a6-a94a-313bc1e4228d/global-pull-secret-syncer/0.log" Apr 23 17:24:18.952621 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:18.952586 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-dnws5_3d17f781-cff6-4f98-92ab-d090568476a4/konnectivity-agent/0.log" Apr 23 17:24:19.004187 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:19.004154 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-102.ec2.internal_a0c005bc2e76bc3454363e5204d0f408/haproxy/0.log" Apr 23 17:24:20.988324 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:20.988290 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-tf72x/must-gather-dnxd8"] Apr 23 17:24:20.988781 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:20.988507 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-tf72x/must-gather-dnxd8" podUID="b19fbb68-f0fa-4c09-94da-76960bbe359e" containerName="copy" containerID="cri-o://81e6945feecc2269f56fb96135b941d1fae4cd261661216a84586a3dc73fea4c" gracePeriod=2 Apr 23 17:24:20.992637 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:20.992005 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-tf72x/must-gather-dnxd8"] Apr 23 17:24:21.214914 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:21.214889 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-tf72x_must-gather-dnxd8_b19fbb68-f0fa-4c09-94da-76960bbe359e/copy/0.log" Apr 23 17:24:21.215228 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:21.215212 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tf72x/must-gather-dnxd8" Apr 23 17:24:21.216697 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:21.216668 2569 status_manager.go:895] "Failed to get status for pod" podUID="b19fbb68-f0fa-4c09-94da-76960bbe359e" pod="openshift-must-gather-tf72x/must-gather-dnxd8" err="pods \"must-gather-dnxd8\" is forbidden: User \"system:node:ip-10-0-128-102.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-tf72x\": no relationship found between node 'ip-10-0-128-102.ec2.internal' and this object" Apr 23 17:24:21.297141 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:21.297072 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b19fbb68-f0fa-4c09-94da-76960bbe359e-must-gather-output\") pod \"b19fbb68-f0fa-4c09-94da-76960bbe359e\" (UID: \"b19fbb68-f0fa-4c09-94da-76960bbe359e\") " Apr 23 17:24:21.297141 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:21.297127 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7dmp\" (UniqueName: \"kubernetes.io/projected/b19fbb68-f0fa-4c09-94da-76960bbe359e-kube-api-access-h7dmp\") pod \"b19fbb68-f0fa-4c09-94da-76960bbe359e\" (UID: \"b19fbb68-f0fa-4c09-94da-76960bbe359e\") " Apr 23 17:24:21.298495 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:21.298466 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b19fbb68-f0fa-4c09-94da-76960bbe359e-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "b19fbb68-f0fa-4c09-94da-76960bbe359e" (UID: "b19fbb68-f0fa-4c09-94da-76960bbe359e"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:24:21.299179 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:21.299147 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b19fbb68-f0fa-4c09-94da-76960bbe359e-kube-api-access-h7dmp" (OuterVolumeSpecName: "kube-api-access-h7dmp") pod "b19fbb68-f0fa-4c09-94da-76960bbe359e" (UID: "b19fbb68-f0fa-4c09-94da-76960bbe359e"). InnerVolumeSpecName "kube-api-access-h7dmp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:24:21.304166 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:21.304133 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b19fbb68-f0fa-4c09-94da-76960bbe359e" path="/var/lib/kubelet/pods/b19fbb68-f0fa-4c09-94da-76960bbe359e/volumes" Apr 23 17:24:21.397838 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:21.397814 2569 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b19fbb68-f0fa-4c09-94da-76960bbe359e-must-gather-output\") on node \"ip-10-0-128-102.ec2.internal\" DevicePath \"\"" Apr 23 17:24:21.397838 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:21.397836 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h7dmp\" (UniqueName: \"kubernetes.io/projected/b19fbb68-f0fa-4c09-94da-76960bbe359e-kube-api-access-h7dmp\") on node \"ip-10-0-128-102.ec2.internal\" DevicePath \"\"" Apr 23 17:24:22.036112 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:22.036085 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-tf72x_must-gather-dnxd8_b19fbb68-f0fa-4c09-94da-76960bbe359e/copy/0.log" Apr 23 17:24:22.036533 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:22.036407 2569 generic.go:358] "Generic (PLEG): container finished" podID="b19fbb68-f0fa-4c09-94da-76960bbe359e" containerID="81e6945feecc2269f56fb96135b941d1fae4cd261661216a84586a3dc73fea4c" exitCode=143 Apr 23 17:24:22.036533 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:22.036489 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tf72x/must-gather-dnxd8" Apr 23 17:24:22.036637 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:22.036501 2569 scope.go:117] "RemoveContainer" containerID="81e6945feecc2269f56fb96135b941d1fae4cd261661216a84586a3dc73fea4c" Apr 23 17:24:22.043813 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:22.043792 2569 scope.go:117] "RemoveContainer" containerID="04655e7648abe7c52b1e7971ffdc07baa15a49924a79714e2e11ad4f80d18219" Apr 23 17:24:22.055883 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:22.055866 2569 scope.go:117] "RemoveContainer" containerID="81e6945feecc2269f56fb96135b941d1fae4cd261661216a84586a3dc73fea4c" Apr 23 17:24:22.056141 ip-10-0-128-102 kubenswrapper[2569]: E0423 17:24:22.056119 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81e6945feecc2269f56fb96135b941d1fae4cd261661216a84586a3dc73fea4c\": container with ID starting with 81e6945feecc2269f56fb96135b941d1fae4cd261661216a84586a3dc73fea4c not found: ID does not exist" containerID="81e6945feecc2269f56fb96135b941d1fae4cd261661216a84586a3dc73fea4c" Apr 23 17:24:22.056192 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:22.056149 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81e6945feecc2269f56fb96135b941d1fae4cd261661216a84586a3dc73fea4c"} err="failed to get container status \"81e6945feecc2269f56fb96135b941d1fae4cd261661216a84586a3dc73fea4c\": rpc error: code = NotFound desc = could not find container \"81e6945feecc2269f56fb96135b941d1fae4cd261661216a84586a3dc73fea4c\": container with ID starting with 81e6945feecc2269f56fb96135b941d1fae4cd261661216a84586a3dc73fea4c not found: ID does not exist" Apr 23 17:24:22.056192 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:22.056168 2569 scope.go:117] "RemoveContainer" 
containerID="04655e7648abe7c52b1e7971ffdc07baa15a49924a79714e2e11ad4f80d18219" Apr 23 17:24:22.056415 ip-10-0-128-102 kubenswrapper[2569]: E0423 17:24:22.056394 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04655e7648abe7c52b1e7971ffdc07baa15a49924a79714e2e11ad4f80d18219\": container with ID starting with 04655e7648abe7c52b1e7971ffdc07baa15a49924a79714e2e11ad4f80d18219 not found: ID does not exist" containerID="04655e7648abe7c52b1e7971ffdc07baa15a49924a79714e2e11ad4f80d18219" Apr 23 17:24:22.056473 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:22.056424 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04655e7648abe7c52b1e7971ffdc07baa15a49924a79714e2e11ad4f80d18219"} err="failed to get container status \"04655e7648abe7c52b1e7971ffdc07baa15a49924a79714e2e11ad4f80d18219\": rpc error: code = NotFound desc = could not find container \"04655e7648abe7c52b1e7971ffdc07baa15a49924a79714e2e11ad4f80d18219\": container with ID starting with 04655e7648abe7c52b1e7971ffdc07baa15a49924a79714e2e11ad4f80d18219 not found: ID does not exist" Apr 23 17:24:22.674332 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:22.674257 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-qhxfp_711d34ad-5d72-4eec-b9be-535d9896a2f6/kube-state-metrics/0.log" Apr 23 17:24:22.698089 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:22.698055 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-qhxfp_711d34ad-5d72-4eec-b9be-535d9896a2f6/kube-rbac-proxy-main/0.log" Apr 23 17:24:22.722892 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:22.722871 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-qhxfp_711d34ad-5d72-4eec-b9be-535d9896a2f6/kube-rbac-proxy-self/0.log" Apr 23 17:24:22.860129 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:22.860099 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-b7ln8_7b73fcba-01db-4fdf-b70e-16248a785061/node-exporter/0.log" Apr 23 17:24:22.881572 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:22.881548 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-b7ln8_7b73fcba-01db-4fdf-b70e-16248a785061/kube-rbac-proxy/0.log" Apr 23 17:24:22.907023 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:22.907004 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-b7ln8_7b73fcba-01db-4fdf-b70e-16248a785061/init-textfile/0.log" Apr 23 17:24:23.085732 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:23.085706 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_17c4d885-af9d-4f0a-bf16-4ec2e083444a/prometheus/0.log" Apr 23 17:24:23.102209 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:23.102185 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_17c4d885-af9d-4f0a-bf16-4ec2e083444a/config-reloader/0.log" Apr 23 17:24:23.125472 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:23.125450 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_17c4d885-af9d-4f0a-bf16-4ec2e083444a/thanos-sidecar/0.log" Apr 23 17:24:23.144771 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:23.144750 2569 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_17c4d885-af9d-4f0a-bf16-4ec2e083444a/kube-rbac-proxy-web/0.log" Apr 23 17:24:23.164191 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:23.164170 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_17c4d885-af9d-4f0a-bf16-4ec2e083444a/kube-rbac-proxy/0.log" Apr 23 17:24:23.187279 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:23.187256 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_17c4d885-af9d-4f0a-bf16-4ec2e083444a/kube-rbac-proxy-thanos/0.log" Apr 23 17:24:23.210676 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:23.210644 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_17c4d885-af9d-4f0a-bf16-4ec2e083444a/init-config-reloader/0.log" Apr 23 17:24:23.238944 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:23.238913 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-72g6n_a848667a-8bb1-4689-8537-bfdcf691c441/prometheus-operator/0.log" Apr 23 17:24:23.259836 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:23.259806 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-72g6n_a848667a-8bb1-4689-8537-bfdcf691c441/kube-rbac-proxy/0.log" Apr 23 17:24:23.284000 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:23.283977 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-9cght_2259fa71-9594-41a1-afbb-cb25cb955aba/prometheus-operator-admission-webhook/0.log" Apr 23 17:24:23.313378 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:23.313355 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-558bdd47f8-ck8sf_120bc9ec-9a39-4aa8-862f-b8b2867ca401/telemeter-client/0.log" Apr 23 17:24:23.335471 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:23.335449 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-558bdd47f8-ck8sf_120bc9ec-9a39-4aa8-862f-b8b2867ca401/reload/0.log" Apr 23 17:24:23.357129 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:23.357075 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-558bdd47f8-ck8sf_120bc9ec-9a39-4aa8-862f-b8b2867ca401/kube-rbac-proxy/0.log" Apr 23 17:24:23.392843 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:23.392820 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-84956f9864-64zbb_086aa7f2-f4e0-44d1-8d28-be8fba79787b/thanos-query/0.log" Apr 23 17:24:23.413423 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:23.413404 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-84956f9864-64zbb_086aa7f2-f4e0-44d1-8d28-be8fba79787b/kube-rbac-proxy-web/0.log" Apr 23 17:24:23.436512 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:23.436490 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-84956f9864-64zbb_086aa7f2-f4e0-44d1-8d28-be8fba79787b/kube-rbac-proxy/0.log" Apr 23 17:24:23.456239 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:23.456217 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-84956f9864-64zbb_086aa7f2-f4e0-44d1-8d28-be8fba79787b/prom-label-proxy/0.log" Apr 23 
17:24:23.476278 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:23.476258 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-84956f9864-64zbb_086aa7f2-f4e0-44d1-8d28-be8fba79787b/kube-rbac-proxy-rules/0.log" Apr 23 17:24:23.496565 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:23.496543 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-84956f9864-64zbb_086aa7f2-f4e0-44d1-8d28-be8fba79787b/kube-rbac-proxy-metrics/0.log" Apr 23 17:24:25.009454 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:25.009421 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gfsdv_3481652d-e5fb-498e-84c3-e2c629340367/console-operator/1.log" Apr 23 17:24:25.013811 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:25.013788 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gfsdv_3481652d-e5fb-498e-84c3-e2c629340367/console-operator/2.log" Apr 23 17:24:25.720385 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:25.720308 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-g5f4s_7ad7b86f-e66c-4d5e-9d95-ce20650adeba/volume-data-source-validator/0.log" Apr 23 17:24:26.184188 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:26.184151 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jm5zz/perf-node-gather-daemonset-8ptmt"] Apr 23 17:24:26.184703 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:26.184689 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b19fbb68-f0fa-4c09-94da-76960bbe359e" containerName="gather" Apr 23 17:24:26.184790 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:26.184709 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b19fbb68-f0fa-4c09-94da-76960bbe359e" containerName="gather" Apr 23 17:24:26.184790 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:26.184722 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b19fbb68-f0fa-4c09-94da-76960bbe359e" containerName="copy" Apr 23 17:24:26.184790 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:26.184729 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b19fbb68-f0fa-4c09-94da-76960bbe359e" containerName="copy" Apr 23 17:24:26.184945 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:26.184819 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b19fbb68-f0fa-4c09-94da-76960bbe359e" containerName="copy" Apr 23 17:24:26.184945 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:26.184839 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b19fbb68-f0fa-4c09-94da-76960bbe359e" containerName="gather" Apr 23 17:24:26.190270 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:26.190245 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-8ptmt" Apr 23 17:24:26.192775 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:26.192749 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jm5zz\"/\"kube-root-ca.crt\"" Apr 23 17:24:26.193348 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:26.193328 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-jm5zz\"/\"default-dockercfg-rwbms\"" Apr 23 17:24:26.193509 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:26.193460 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jm5zz\"/\"openshift-service-ca.crt\"" Apr 23 17:24:26.194261 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:26.194241 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jm5zz/perf-node-gather-daemonset-8ptmt"] Apr 23 17:24:26.237434 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:26.237395 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb9t6\" (UniqueName: \"kubernetes.io/projected/20e0f4f2-a8f2-4c48-a647-3a0e73fda562-kube-api-access-rb9t6\") pod \"perf-node-gather-daemonset-8ptmt\" (UID: \"20e0f4f2-a8f2-4c48-a647-3a0e73fda562\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-8ptmt" Apr 23 17:24:26.237434 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:26.237437 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/20e0f4f2-a8f2-4c48-a647-3a0e73fda562-sys\") pod \"perf-node-gather-daemonset-8ptmt\" (UID: \"20e0f4f2-a8f2-4c48-a647-3a0e73fda562\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-8ptmt" Apr 23 17:24:26.237685 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:26.237454 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/20e0f4f2-a8f2-4c48-a647-3a0e73fda562-lib-modules\") pod \"perf-node-gather-daemonset-8ptmt\" (UID: \"20e0f4f2-a8f2-4c48-a647-3a0e73fda562\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-8ptmt" Apr 23 17:24:26.237685 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:26.237559 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/20e0f4f2-a8f2-4c48-a647-3a0e73fda562-podres\") pod \"perf-node-gather-daemonset-8ptmt\" (UID: \"20e0f4f2-a8f2-4c48-a647-3a0e73fda562\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-8ptmt" Apr 23 17:24:26.237685 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:26.237603 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/20e0f4f2-a8f2-4c48-a647-3a0e73fda562-proc\") pod \"perf-node-gather-daemonset-8ptmt\" (UID: \"20e0f4f2-a8f2-4c48-a647-3a0e73fda562\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-8ptmt" Apr 23 17:24:26.338830 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:26.338787 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rb9t6\" (UniqueName: \"kubernetes.io/projected/20e0f4f2-a8f2-4c48-a647-3a0e73fda562-kube-api-access-rb9t6\") pod \"perf-node-gather-daemonset-8ptmt\" (UID: \"20e0f4f2-a8f2-4c48-a647-3a0e73fda562\") 
" pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-8ptmt" Apr 23 17:24:26.338830 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:26.338831 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/20e0f4f2-a8f2-4c48-a647-3a0e73fda562-sys\") pod \"perf-node-gather-daemonset-8ptmt\" (UID: \"20e0f4f2-a8f2-4c48-a647-3a0e73fda562\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-8ptmt" Apr 23 17:24:26.339089 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:26.338848 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/20e0f4f2-a8f2-4c48-a647-3a0e73fda562-lib-modules\") pod \"perf-node-gather-daemonset-8ptmt\" (UID: \"20e0f4f2-a8f2-4c48-a647-3a0e73fda562\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-8ptmt" Apr 23 17:24:26.339089 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:26.338883 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/20e0f4f2-a8f2-4c48-a647-3a0e73fda562-podres\") pod \"perf-node-gather-daemonset-8ptmt\" (UID: \"20e0f4f2-a8f2-4c48-a647-3a0e73fda562\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-8ptmt" Apr 23 17:24:26.339089 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:26.338904 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/20e0f4f2-a8f2-4c48-a647-3a0e73fda562-proc\") pod \"perf-node-gather-daemonset-8ptmt\" (UID: \"20e0f4f2-a8f2-4c48-a647-3a0e73fda562\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-8ptmt" Apr 23 17:24:26.339089 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:26.338935 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/20e0f4f2-a8f2-4c48-a647-3a0e73fda562-sys\") pod \"perf-node-gather-daemonset-8ptmt\" (UID: \"20e0f4f2-a8f2-4c48-a647-3a0e73fda562\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-8ptmt" Apr 23 17:24:26.339089 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:26.338993 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/20e0f4f2-a8f2-4c48-a647-3a0e73fda562-proc\") pod \"perf-node-gather-daemonset-8ptmt\" (UID: \"20e0f4f2-a8f2-4c48-a647-3a0e73fda562\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-8ptmt" Apr 23 17:24:26.339089 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:26.339045 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/20e0f4f2-a8f2-4c48-a647-3a0e73fda562-lib-modules\") pod \"perf-node-gather-daemonset-8ptmt\" (UID: \"20e0f4f2-a8f2-4c48-a647-3a0e73fda562\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-8ptmt" Apr 23 17:24:26.339089 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:26.339052 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/20e0f4f2-a8f2-4c48-a647-3a0e73fda562-podres\") pod \"perf-node-gather-daemonset-8ptmt\" (UID: \"20e0f4f2-a8f2-4c48-a647-3a0e73fda562\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-8ptmt" Apr 23 17:24:26.345641 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:26.345611 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rb9t6\" (UniqueName: \"kubernetes.io/projected/20e0f4f2-a8f2-4c48-a647-3a0e73fda562-kube-api-access-rb9t6\") pod \"perf-node-gather-daemonset-8ptmt\" (UID: \"20e0f4f2-a8f2-4c48-a647-3a0e73fda562\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-8ptmt" Apr 23 17:24:26.408891 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:26.408862 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-jfzk8_e2c88812-4055-43aa-8e5a-25b432f9041d/dns/0.log" Apr 23 17:24:26.430830 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:26.430804 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-jfzk8_e2c88812-4055-43aa-8e5a-25b432f9041d/kube-rbac-proxy/0.log" Apr 23 17:24:26.470567 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:26.470475 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-fgwnc_486e65b1-cb27-4533-8ab9-9a91c79c58b1/dns-node-resolver/0.log" Apr 23 17:24:26.500129 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:26.500097 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-8ptmt" Apr 23 17:24:26.616291 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:26.616268 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jm5zz/perf-node-gather-daemonset-8ptmt"] Apr 23 17:24:26.618930 ip-10-0-128-102 kubenswrapper[2569]: W0423 17:24:26.618903 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod20e0f4f2_a8f2_4c48_a647_3a0e73fda562.slice/crio-335aebaf0653ffc8e5d01eb15d37c216f0f6b3a76c8d2229bc4244d4a2626cec WatchSource:0}: Error finding container 335aebaf0653ffc8e5d01eb15d37c216f0f6b3a76c8d2229bc4244d4a2626cec: Status 404 returned error can't find the container with id 335aebaf0653ffc8e5d01eb15d37c216f0f6b3a76c8d2229bc4244d4a2626cec Apr 23 17:24:26.906036 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:26.906004 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-pmktz_919d79c7-8b2d-41ad-b0ba-bf48e8815841/node-ca/0.log" Apr 23 17:24:27.052285 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:27.052200 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-8ptmt" event={"ID":"20e0f4f2-a8f2-4c48-a647-3a0e73fda562","Type":"ContainerStarted","Data":"1c3cd81ee5641b29cd0c0a8312b139738b6dbf9659c62d9f281dff0bf80df096"} Apr 23 17:24:27.052285 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:27.052241 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-8ptmt" event={"ID":"20e0f4f2-a8f2-4c48-a647-3a0e73fda562","Type":"ContainerStarted","Data":"335aebaf0653ffc8e5d01eb15d37c216f0f6b3a76c8d2229bc4244d4a2626cec"} Apr 23 17:24:27.052467 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:27.052358 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-8ptmt" Apr 23 17:24:27.066958 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:27.066913 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-8ptmt" podStartSLOduration=1.066898819 podStartE2EDuration="1.066898819s" podCreationTimestamp="2026-04-23 17:24:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-04-23 17:24:27.065203163 +0000 UTC m=+2952.402466127" watchObservedRunningTime="2026-04-23 17:24:27.066898819 +0000 UTC m=+2952.404161783" Apr 23 17:24:27.527643 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:27.527615 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-b88c6764-pc8q7_4983d124-dedd-4eec-8bdd-7d87844e7eaf/router/0.log" Apr 23 17:24:27.856012 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:27.855975 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-bsgxs_3bfd1dfe-900e-4260-b0fc-9dc05d2c604c/serve-healthcheck-canary/0.log" Apr 23 17:24:28.169836 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:28.169756 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-hbwp6_5e7b64cf-d8ab-48a3-86f5-9ea5db912782/insights-operator/0.log" Apr 23 17:24:28.170864 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:28.170844 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-hbwp6_5e7b64cf-d8ab-48a3-86f5-9ea5db912782/insights-operator/1.log" Apr 23 17:24:28.188322 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:28.188291 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-b5vnc_88e1f618-5f3b-4306-a6ad-52dec47aee87/kube-rbac-proxy/0.log" Apr 23 17:24:28.206654 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:28.206625 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-b5vnc_88e1f618-5f3b-4306-a6ad-52dec47aee87/exporter/0.log" Apr 23 17:24:28.224897 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:28.224869 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-b5vnc_88e1f618-5f3b-4306-a6ad-52dec47aee87/extractor/0.log" Apr 23 17:24:30.527911 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:30.527876 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-c2gw8_9953e210-80fc-47e3-9bde-7f00b37b8f53/s3-init/0.log" Apr 23 17:24:33.064805 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:33.064775 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-8ptmt" Apr 23 17:24:33.857376 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:33.857344 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-rwgzf_e972d7fb-a515-4ee1-9487-5af95477b522/migrator/0.log" Apr 23 17:24:33.875985 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:33.875960 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-rwgzf_e972d7fb-a515-4ee1-9487-5af95477b522/graceful-termination/0.log" Apr 23 17:24:35.018020 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:35.017990 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8slg2_63340da2-b7c8-4798-a1ed-d8a80bf900b6/kube-multus/0.log" Apr 23 17:24:35.318540 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:35.318511 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z8jnl_64f2e8d8-0a24-4b00-a66e-91dd67594081/kube-multus-additional-cni-plugins/0.log" Apr 23 17:24:35.339486 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:35.339453 2569 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z8jnl_64f2e8d8-0a24-4b00-a66e-91dd67594081/egress-router-binary-copy/0.log" Apr 23 17:24:35.360993 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:35.360975 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z8jnl_64f2e8d8-0a24-4b00-a66e-91dd67594081/cni-plugins/0.log" Apr 23 17:24:35.381737 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:35.381711 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z8jnl_64f2e8d8-0a24-4b00-a66e-91dd67594081/bond-cni-plugin/0.log" Apr 23 17:24:35.401779 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:35.401755 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z8jnl_64f2e8d8-0a24-4b00-a66e-91dd67594081/routeoverride-cni/0.log" Apr 23 17:24:35.424549 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:35.424525 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z8jnl_64f2e8d8-0a24-4b00-a66e-91dd67594081/whereabouts-cni-bincopy/0.log" Apr 23 17:24:35.444277 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:35.444259 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z8jnl_64f2e8d8-0a24-4b00-a66e-91dd67594081/whereabouts-cni/0.log" Apr 23 17:24:35.533405 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:35.533379 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-k7n97_1eabe990-f610-4e94-8a89-7cff1c9a6a23/network-metrics-daemon/0.log" Apr 23 17:24:35.553405 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:35.553386 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-k7n97_1eabe990-f610-4e94-8a89-7cff1c9a6a23/kube-rbac-proxy/0.log" Apr 23 17:24:36.286263 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:36.286238 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddr2h_a063c6b7-80d4-45d7-815d-88d94693a0b1/ovn-controller/0.log" Apr 23 17:24:36.304450 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:36.304419 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddr2h_a063c6b7-80d4-45d7-815d-88d94693a0b1/ovn-acl-logging/0.log" Apr 23 17:24:36.317107 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:36.317072 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddr2h_a063c6b7-80d4-45d7-815d-88d94693a0b1/ovn-acl-logging/1.log" Apr 23 17:24:36.334638 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:36.334606 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddr2h_a063c6b7-80d4-45d7-815d-88d94693a0b1/kube-rbac-proxy-node/0.log" Apr 23 17:24:36.353044 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:36.353008 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddr2h_a063c6b7-80d4-45d7-815d-88d94693a0b1/kube-rbac-proxy-ovn-metrics/0.log" Apr 23 17:24:36.369961 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:36.369938 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddr2h_a063c6b7-80d4-45d7-815d-88d94693a0b1/northd/0.log" Apr 23 17:24:36.388518 ip-10-0-128-102 
kubenswrapper[2569]: I0423 17:24:36.388484 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddr2h_a063c6b7-80d4-45d7-815d-88d94693a0b1/nbdb/0.log" Apr 23 17:24:36.407695 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:36.407651 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddr2h_a063c6b7-80d4-45d7-815d-88d94693a0b1/sbdb/0.log" Apr 23 17:24:36.505312 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:36.505271 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddr2h_a063c6b7-80d4-45d7-815d-88d94693a0b1/ovnkube-controller/0.log" Apr 23 17:24:37.993571 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:37.993544 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-ktsss_1ba421e3-97e2-473e-a145-bf072f2b9393/network-check-target-container/0.log" Apr 23 17:24:38.826293 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:38.826265 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-6xw6f_2c38e0bd-62a0-43a1-b0e2-05e9ca0f084b/iptables-alerter/0.log" Apr 23 17:24:39.485199 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:39.485165 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-s4jdv_14a8d0f4-f62f-4c14-8b41-bf2f7476215d/tuned/0.log" Apr 23 17:24:42.210607 ip-10-0-128-102 kubenswrapper[2569]: I0423 17:24:42.210575 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca_service-ca-865cb79987-txswz_7af67d0f-8d44-419f-b4a9-d9ac7d69ed85/service-ca-controller/0.log"