Apr 20 12:11:50.825022 ip-10-0-137-91 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 20 12:11:50.825039 ip-10-0-137-91 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 20 12:11:50.825051 ip-10-0-137-91 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 20 12:11:50.825385 ip-10-0-137-91 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 20 12:12:01.053286 ip-10-0-137-91 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 20 12:12:01.053300 ip-10-0-137-91 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot c986758d2d184a72bad8740a3665b63a --
Apr 20 12:14:26.163745 ip-10-0-137-91 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 12:14:26.580299 ip-10-0-137-91 kubenswrapper[2580]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 12:14:26.580299 ip-10-0-137-91 kubenswrapper[2580]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 12:14:26.580299 ip-10-0-137-91 kubenswrapper[2580]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 12:14:26.580299 ip-10-0-137-91 kubenswrapper[2580]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 12:14:26.580299 ip-10-0-137-91 kubenswrapper[2580]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 12:14:26.582054 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.581904    2580 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 12:14:26.584768 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584750    2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 12:14:26.584768 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584768    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 12:14:26.584836 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584773    2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 12:14:26.584836 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584777    2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 12:14:26.584836 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584780    2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 12:14:26.584836 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584783    2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 12:14:26.584836 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584787    2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 12:14:26.584836 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584790    2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 12:14:26.584836 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584793    2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 12:14:26.584836 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584795    2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 12:14:26.584836 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584805    2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 12:14:26.584836 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584808    2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 12:14:26.584836 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584811    2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 12:14:26.584836 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584813    2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 12:14:26.584836 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584816    2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 12:14:26.584836 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584819    2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 12:14:26.584836 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584821    2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 12:14:26.584836 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584824    2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 12:14:26.584836 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584827    2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 12:14:26.584836 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584829    2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 12:14:26.584836 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584832    2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 12:14:26.584836 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584834    2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 12:14:26.585318 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584837    2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 12:14:26.585318 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584839    2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 12:14:26.585318 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584842    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 12:14:26.585318 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584845    2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 12:14:26.585318 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584848    2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 12:14:26.585318 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584851    2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 12:14:26.585318 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584853    2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 12:14:26.585318 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584856    2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 12:14:26.585318 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584859    2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 12:14:26.585318 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584861    2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 12:14:26.585318 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584863    2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 12:14:26.585318 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584866    2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 12:14:26.585318 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584869    2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 12:14:26.585318 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584871    2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 12:14:26.585318 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584873    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 12:14:26.585318 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584876    2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 12:14:26.585318 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584879    2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 12:14:26.585318 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584882    2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 12:14:26.585318 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584885    2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 12:14:26.585318 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584887    2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 12:14:26.585832 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584890    2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 12:14:26.585832 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584893    2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 12:14:26.585832 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584896    2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 12:14:26.585832 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584898    2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 12:14:26.585832 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584901    2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 12:14:26.585832 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584903    2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 12:14:26.585832 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584906    2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 12:14:26.585832 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584910    2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 12:14:26.585832 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584914    2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 12:14:26.585832 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584917    2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 12:14:26.585832 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584920    2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 12:14:26.585832 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584922    2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 12:14:26.585832 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584926    2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 12:14:26.585832 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584931    2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 12:14:26.585832 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584934    2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 12:14:26.585832 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584937    2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 12:14:26.585832 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584940    2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 12:14:26.585832 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584944    2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 12:14:26.585832 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584947    2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 12:14:26.586291 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584949    2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 12:14:26.586291 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584952    2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 12:14:26.586291 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584954    2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 12:14:26.586291 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584957    2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 12:14:26.586291 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584960    2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 12:14:26.586291 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584962    2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 12:14:26.586291 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584965    2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 12:14:26.586291 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584967    2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 12:14:26.586291 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584970    2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 12:14:26.586291 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584973    2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 12:14:26.586291 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584976    2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 12:14:26.586291 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584978    2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 12:14:26.586291 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584981    2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 12:14:26.586291 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584984    2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 12:14:26.586291 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584987    2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 12:14:26.586291 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584990    2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 12:14:26.586291 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584992    2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 12:14:26.586291 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584995    2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 12:14:26.586291 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584997    2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 12:14:26.586291 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.584999    2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 12:14:26.586774 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585002    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 12:14:26.586774 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585005    2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 12:14:26.586774 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585008    2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 12:14:26.586774 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585011    2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 12:14:26.586774 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585014    2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 12:14:26.586774 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585445    2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 12:14:26.586774 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585451    2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 12:14:26.586774 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585453    2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 12:14:26.586774 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585456    2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 12:14:26.586774 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585459    2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 12:14:26.586774 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585462    2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 12:14:26.586774 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585464    2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 12:14:26.586774 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585467    2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 12:14:26.586774 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585471    2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 12:14:26.586774 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585474    2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 12:14:26.586774 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585477    2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 12:14:26.586774 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585480    2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 12:14:26.586774 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585482    2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 12:14:26.586774 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585485    2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 12:14:26.587229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585488    2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 12:14:26.587229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585491    2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 12:14:26.587229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585493    2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 12:14:26.587229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585496    2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 12:14:26.587229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585499    2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 12:14:26.587229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585501    2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 12:14:26.587229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585505    2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 12:14:26.587229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585508    2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 12:14:26.587229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585510    2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 12:14:26.587229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585513    2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 12:14:26.587229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585516    2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 12:14:26.587229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585518    2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 12:14:26.587229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585521    2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 12:14:26.587229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585523    2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 12:14:26.587229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585526    2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 12:14:26.587229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585529    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 12:14:26.587229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585531    2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 12:14:26.587229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585534    2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 12:14:26.587229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585537    2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 12:14:26.587229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585539    2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 12:14:26.587770 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585541    2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 12:14:26.587770 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585544    2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 12:14:26.587770 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585546    2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 12:14:26.587770 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585549    2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 12:14:26.587770 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585553    2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 12:14:26.587770 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585555    2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 12:14:26.587770 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585558    2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 12:14:26.587770 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585560    2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 12:14:26.587770 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585563    2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 12:14:26.587770 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585565    2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 12:14:26.587770 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585568    2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 12:14:26.587770 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585572    2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 12:14:26.587770 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585575    2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 12:14:26.587770 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585577    2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 12:14:26.587770 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585580    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 12:14:26.587770 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585583    2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 12:14:26.587770 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585585    2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 12:14:26.587770 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585588    2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 12:14:26.587770 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585590    2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 12:14:26.587770 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585594    2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 12:14:26.588252 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585596    2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 12:14:26.588252 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585598    2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 12:14:26.588252 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585602    2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 12:14:26.588252 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585604    2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 12:14:26.588252 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585607    2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 12:14:26.588252 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585610    2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 12:14:26.588252 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585612    2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 12:14:26.588252 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585615    2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 12:14:26.588252 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585617    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 12:14:26.588252 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585620    2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 12:14:26.588252 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585623    2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 12:14:26.588252 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585625    2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 12:14:26.588252 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585628    2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 12:14:26.588252 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585630    2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 12:14:26.588252 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585633    2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 12:14:26.588252 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585635    2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 12:14:26.588252 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585638    2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 12:14:26.588252 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585640    2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 12:14:26.588252 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585644    2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 12:14:26.588252 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585648    2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 12:14:26.588758 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585652    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 12:14:26.588758 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585655    2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 12:14:26.588758 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585658    2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 12:14:26.588758 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585661    2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 12:14:26.588758 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585663    2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 12:14:26.588758 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585666    2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 12:14:26.588758 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585669    2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 12:14:26.588758 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585671    2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 12:14:26.588758 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585674    2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 12:14:26.588758 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585677    2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 12:14:26.588758 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585679    2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 12:14:26.588758 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.585681    2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 12:14:26.588758 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586696    2580 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 12:14:26.588758 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586706    2580 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 12:14:26.588758 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586714    2580 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 12:14:26.588758 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586719    2580 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 12:14:26.588758 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586724    2580 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 12:14:26.588758 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586728    2580 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 12:14:26.588758 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586733    2580 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 12:14:26.588758 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586737    2580 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 12:14:26.588758 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586741    2580 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 12:14:26.589276 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586744    2580 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 12:14:26.589276 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586748    2580 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 12:14:26.589276 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586751    2580 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 12:14:26.589276 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586754    2580 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 12:14:26.589276 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586757    2580 flags.go:64] FLAG: --cgroup-root=""
Apr 20 12:14:26.589276 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586761    2580 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 12:14:26.589276 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586764    2580 flags.go:64] FLAG: --client-ca-file=""
Apr 20 12:14:26.589276 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586767    2580 flags.go:64] FLAG: --cloud-config=""
Apr 20 12:14:26.589276 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586769    2580 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 12:14:26.589276 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586773    2580 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 12:14:26.589276 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586777    2580 flags.go:64] FLAG: --cluster-domain=""
Apr 20 12:14:26.589276 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586780    2580 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 12:14:26.589276 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586784    2580 flags.go:64] FLAG: --config-dir=""
Apr 20 12:14:26.589276 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586787    2580 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 12:14:26.589276 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586790    2580 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 12:14:26.589276 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586794    2580 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 12:14:26.589276 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586797    2580 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 12:14:26.589276 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586801    2580 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 12:14:26.589276 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586804    2580 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 12:14:26.589276 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586807    2580 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 12:14:26.589276 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586810    2580 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 12:14:26.589276 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586813    2580 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 12:14:26.589276 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586816    2580 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 12:14:26.589276 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586819    2580 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 12:14:26.589276 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586825    2580 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 12:14:26.589922 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586828    2580 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 12:14:26.589922 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586832    2580 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 12:14:26.589922 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586835    2580 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 12:14:26.589922 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586838    2580 flags.go:64] FLAG: --enable-server="true"
Apr 20 12:14:26.589922 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586841    2580 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 12:14:26.589922 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586847    2580 flags.go:64] FLAG: --event-burst="100"
Apr 20 12:14:26.589922 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586850    2580 flags.go:64] FLAG: --event-qps="50"
Apr 20 12:14:26.589922 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586853    2580 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 12:14:26.589922 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586857    2580 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 12:14:26.589922 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586860    2580 flags.go:64] FLAG: --eviction-hard=""
Apr 20 12:14:26.589922 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586864    2580 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 12:14:26.589922 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586867    2580 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 12:14:26.589922 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586871    2580 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 12:14:26.589922 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586874    2580 flags.go:64] FLAG: --eviction-soft=""
Apr 20 12:14:26.589922 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586877    2580 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 12:14:26.589922 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586880    2580 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 12:14:26.589922 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586883    2580 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 12:14:26.589922 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586886    2580 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 12:14:26.589922 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586889    2580 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 20 12:14:26.589922 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586892    2580 flags.go:64] FLAG: --fail-swap-on="true"
Apr 20 12:14:26.589922 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586895    2580 flags.go:64] FLAG: --feature-gates=""
Apr 20 12:14:26.589922 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586899    2580 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 20 12:14:26.589922 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586902    2580 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 20 12:14:26.589922 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586905    2580 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 20 12:14:26.589922 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586908    2580 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 20 12:14:26.590593 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586911    2580 flags.go:64] FLAG: --healthz-port="10248"
Apr 20 12:14:26.590593 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586915    2580 flags.go:64] FLAG: --help="false"
Apr 20 12:14:26.590593 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586918    2580 flags.go:64] FLAG: --hostname-override="ip-10-0-137-91.ec2.internal"
Apr 20 12:14:26.590593 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586921    2580 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 20 12:14:26.590593 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586924    2580 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 20 12:14:26.590593 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586927    2580 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 20 12:14:26.590593 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586931    2580 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 20 12:14:26.590593 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586935    2580 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 20 12:14:26.590593 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586938    2580 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 20 12:14:26.590593 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586941    2580 flags.go:64] FLAG: --image-service-endpoint=""
Apr 20 12:14:26.590593 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586944    2580 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 20 12:14:26.590593 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586947    2580 flags.go:64] FLAG: --kube-api-burst="100"
Apr 20 12:14:26.590593 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586950    2580 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 20 12:14:26.590593 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586953    2580 flags.go:64] FLAG: --kube-api-qps="50"
Apr 20 12:14:26.590593 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586956    2580 flags.go:64] FLAG: --kube-reserved=""
Apr 20 12:14:26.590593 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586959    2580 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 20 12:14:26.590593 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586962    2580 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 20 12:14:26.590593 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586965    2580 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 20 12:14:26.590593 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586968    2580 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 20 12:14:26.590593 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586971    2580 flags.go:64] FLAG: --lock-file=""
Apr 20 12:14:26.590593 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586974    2580 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 20 12:14:26.590593 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586977    2580 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 20 12:14:26.590593 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586980    2580 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 20 12:14:26.590593 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586985    2580 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 20 12:14:26.591174 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586988    2580 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 20 12:14:26.591174 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586991    2580 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 20 12:14:26.591174 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586994    2580 flags.go:64] FLAG: --logging-format="text"
Apr 20 12:14:26.591174 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.586997    2580 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 20 12:14:26.591174 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587000    2580 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 20 12:14:26.591174 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587003    2580 flags.go:64] FLAG: --manifest-url=""
Apr 20 12:14:26.591174 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587006    2580 flags.go:64] FLAG: --manifest-url-header=""
Apr 20 12:14:26.591174 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587010    2580 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 20 12:14:26.591174 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587013    2580 flags.go:64] FLAG: --max-open-files="1000000"
Apr 20 12:14:26.591174 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587017    2580 flags.go:64] FLAG: --max-pods="110"
Apr 20 12:14:26.591174 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587020    2580 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 20 12:14:26.591174 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587026    2580 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 20 12:14:26.591174 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587029    2580 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 20 12:14:26.591174 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587032    2580 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 20 12:14:26.591174 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587035    2580 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 20 12:14:26.591174 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587039    2580 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 20 12:14:26.591174 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587042    2580 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 20 12:14:26.591174 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587050    2580 flags.go:64] FLAG: --node-status-max-images="50"
Apr 20 12:14:26.591174 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587053    2580 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 20 12:14:26.591174 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587056    2580 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 20 12:14:26.591174 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587060    2580 flags.go:64] FLAG: --pod-cidr=""
Apr 20 12:14:26.591174 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587063    2580 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 20 12:14:26.591174 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587069    2580 flags.go:64] FLAG: --pod-manifest-path=""
Apr 20 12:14:26.591769 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587072    2580 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 20 12:14:26.591769 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587075    2580 flags.go:64] FLAG: --pods-per-core="0"
Apr 20 12:14:26.591769 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587078    2580 flags.go:64] FLAG: --port="10250"
Apr 20 12:14:26.591769 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587081    2580 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 20 12:14:26.591769 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587084    2580 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-07e4fa8f2f7cdf211"
Apr 20 12:14:26.591769 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587088    2580 flags.go:64] FLAG: --qos-reserved=""
Apr 20 12:14:26.591769 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587090    2580 flags.go:64] FLAG: --read-only-port="10255"
Apr 20 12:14:26.591769 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587093    2580 flags.go:64] FLAG: --register-node="true"
Apr 20 12:14:26.591769 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587096    2580 flags.go:64] FLAG: --register-schedulable="true"
Apr 20 12:14:26.591769 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587099    2580 flags.go:64] FLAG: --register-with-taints=""
Apr 20 12:14:26.591769 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587103    2580 flags.go:64] FLAG: --registry-burst="10"
Apr 20 12:14:26.591769 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587106    2580 flags.go:64] FLAG: --registry-qps="5"
Apr 20 12:14:26.591769 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587108    2580 flags.go:64] FLAG: --reserved-cpus=""
Apr 20 12:14:26.591769 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587111    2580 flags.go:64] FLAG: --reserved-memory=""
Apr 20 12:14:26.591769 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587115    2580 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 20 12:14:26.591769 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587118    2580 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 20 12:14:26.591769 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587121    2580 flags.go:64] FLAG: --rotate-certificates="false"
Apr 20 12:14:26.591769 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587124    2580 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 20 12:14:26.591769 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587127    2580 flags.go:64] FLAG: --runonce="false"
Apr 20 12:14:26.591769 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587130    2580 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 20 12:14:26.591769 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587135    2580 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 20 12:14:26.591769 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587138    2580 flags.go:64] FLAG: --seccomp-default="false"
Apr 20 12:14:26.591769 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587141    2580 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 20 12:14:26.591769 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587144    2580 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 20 12:14:26.591769 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587147    2580 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 20 12:14:26.591769 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587151    2580 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 20 12:14:26.592423 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587155    2580 flags.go:64] FLAG: --storage-driver-password="root"
Apr 20 12:14:26.592423 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587158    2580 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 20 12:14:26.592423 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587161    2580 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 20 12:14:26.592423 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587164    2580 flags.go:64] FLAG: --storage-driver-user="root"
Apr 20 12:14:26.592423 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587167    2580 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 20 12:14:26.592423 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587170    2580 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 20 12:14:26.592423 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587173    2580 flags.go:64] FLAG: --system-cgroups=""
Apr 20 12:14:26.592423 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587176    2580 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 20 12:14:26.592423 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587182    2580 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 20 12:14:26.592423 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587184    2580 flags.go:64] FLAG: --tls-cert-file=""
Apr 20 12:14:26.592423 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587187    2580 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 20 12:14:26.592423 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587191    2580 flags.go:64] FLAG: --tls-min-version=""
Apr 20 12:14:26.592423 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587194    2580 flags.go:64] FLAG: --tls-private-key-file=""
Apr 20 12:14:26.592423 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587198    2580 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 20 12:14:26.592423 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587201    2580 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 20 12:14:26.592423 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587204    2580 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 20 12:14:26.592423 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587207    2580 flags.go:64] FLAG: --v="2"
Apr 20 12:14:26.592423 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587211    2580 flags.go:64] FLAG: --version="false"
Apr 20 12:14:26.592423 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587215    2580 flags.go:64] FLAG: --vmodule=""
Apr 20 12:14:26.592423 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587220    2580 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 20 12:14:26.592423 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.587223    2580 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 20 12:14:26.592423 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587328    2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 12:14:26.592423 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587332    2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 12:14:26.592423 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587335    2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 12:14:26.593013 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587338    2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 12:14:26.593013 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587340    2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 12:14:26.593013 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587344    2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 12:14:26.593013 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587347    2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 12:14:26.593013 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587350    2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 12:14:26.593013 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587352    2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 12:14:26.593013 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587356    2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 12:14:26.593013 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587359    2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 12:14:26.593013 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587363    2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 12:14:26.593013 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587366    2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 12:14:26.593013 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587369    2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 12:14:26.593013 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587371    2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 12:14:26.593013 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587374    2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 12:14:26.593013 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587377    2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 12:14:26.593013 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587380    2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 12:14:26.593013 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587383    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 12:14:26.593013 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587385    2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 12:14:26.593013 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587388    2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 12:14:26.593013 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587390    2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 12:14:26.593519 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587405    2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 12:14:26.593519 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587408    2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 12:14:26.593519 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587411    2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 12:14:26.593519 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587413    2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 12:14:26.593519 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587416    2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 12:14:26.593519 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587420    2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 12:14:26.593519 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587424 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 12:14:26.593519 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587427 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 12:14:26.593519 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587431 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 12:14:26.593519 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587434 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 12:14:26.593519 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587437 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 12:14:26.593519 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587440 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 12:14:26.593519 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587442 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 12:14:26.593519 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587445 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 12:14:26.593519 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587448 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 12:14:26.593519 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587452 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 12:14:26.593519 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587455 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 12:14:26.593519 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587459 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 12:14:26.593519 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587461 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 12:14:26.594087 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587464 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 12:14:26.594087 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587468 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 12:14:26.594087 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587470 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 12:14:26.594087 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587473 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 12:14:26.594087 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587476 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 12:14:26.594087 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587479 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 12:14:26.594087 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587482 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 12:14:26.594087 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587485 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 12:14:26.594087 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587488 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 12:14:26.594087 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587490 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 12:14:26.594087 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587493 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 12:14:26.594087 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587495 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 12:14:26.594087 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587498 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 12:14:26.594087 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587501 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 12:14:26.594087 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587503 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 12:14:26.594087 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587506 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 12:14:26.594087 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587508 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 12:14:26.594087 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587511 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 12:14:26.594087 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587513 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 12:14:26.594087 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587517 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 12:14:26.594657 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587521 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 12:14:26.594657 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587523 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 12:14:26.594657 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587526 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 12:14:26.594657 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587528 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 12:14:26.594657 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587531 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 12:14:26.594657 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587533 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 12:14:26.594657 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587536 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 12:14:26.594657 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587538 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 12:14:26.594657 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587542 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 12:14:26.594657 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587545 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 12:14:26.594657 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587547 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 12:14:26.594657 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587550 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 12:14:26.594657 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587553 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 12:14:26.594657 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587556 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 12:14:26.594657 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587559 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 12:14:26.594657 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587561 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 12:14:26.594657 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587564 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 12:14:26.594657 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587567 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 12:14:26.594657 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587570 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 12:14:26.594657 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587572 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 12:14:26.595198 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587575 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 12:14:26.595198 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587578 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 12:14:26.595198 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587580 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 12:14:26.595198 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587583 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 12:14:26.595198 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.587586 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 12:14:26.595198 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.588347 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 12:14:26.595198 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.595024 2580 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 12:14:26.595198 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.595043 2580 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 12:14:26.595198 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595095 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 12:14:26.595198 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595100 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 12:14:26.595198 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595103 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 12:14:26.595198 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595106 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 12:14:26.595198 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595111 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 12:14:26.595198 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595115 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 12:14:26.595198 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595118 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 12:14:26.595198 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595122 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 12:14:26.595667 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595124 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 12:14:26.595667 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595127 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 12:14:26.595667 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595130 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 12:14:26.595667 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595133 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 12:14:26.595667 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595135 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 12:14:26.595667 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595138 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 12:14:26.595667 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595141 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 12:14:26.595667 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595143 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 12:14:26.595667 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595146 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 12:14:26.595667 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595148 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 12:14:26.595667 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595151 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 12:14:26.595667 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595155 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 12:14:26.595667 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595159 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 12:14:26.595667 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595162 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 12:14:26.595667 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595165 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 12:14:26.595667 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595168 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 12:14:26.595667 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595170 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 12:14:26.595667 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595173 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 12:14:26.595667 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595175 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 12:14:26.595667 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595178 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 12:14:26.596229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595180 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 12:14:26.596229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595183 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 12:14:26.596229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595188 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 12:14:26.596229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595190 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 12:14:26.596229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595193 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 12:14:26.596229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595196 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 12:14:26.596229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595198 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 12:14:26.596229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595201 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 12:14:26.596229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595204 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 12:14:26.596229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595208 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 12:14:26.596229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595210 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 12:14:26.596229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595213 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 12:14:26.596229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595216 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 12:14:26.596229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595219 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 12:14:26.596229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595221 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 12:14:26.596229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595224 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 12:14:26.596229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595226 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 12:14:26.596229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595229 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 12:14:26.596229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595231 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 12:14:26.596229 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595234 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 12:14:26.597095 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595237 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 12:14:26.597095 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595239 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 12:14:26.597095 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595242 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 12:14:26.597095 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595244 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 12:14:26.597095 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595247 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 12:14:26.597095 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595249 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 12:14:26.597095 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595253 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 12:14:26.597095 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595255 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 12:14:26.597095 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595258 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 12:14:26.597095 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595260 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 12:14:26.597095 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595263 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 12:14:26.597095 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595265 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 12:14:26.597095 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595268 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 12:14:26.597095 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595271 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 12:14:26.597095 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595273 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 12:14:26.597095 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595277 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 12:14:26.597095 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595279 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 12:14:26.597095 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595282 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 12:14:26.597095 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595285 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 12:14:26.597095 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595287 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 12:14:26.597703 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595290 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 12:14:26.597703 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595293 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 12:14:26.597703 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595295 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 12:14:26.597703 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595298 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 12:14:26.597703 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595300 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 12:14:26.597703 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595303 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 12:14:26.597703 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595306 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 12:14:26.597703 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595308 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 12:14:26.597703 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595311 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 12:14:26.597703 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595313 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 12:14:26.597703 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595316 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 12:14:26.597703 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595318 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 12:14:26.597703 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595321 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 12:14:26.597703 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595323 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 12:14:26.597703 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595326 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 12:14:26.597703 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595328 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 12:14:26.597703 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595331 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 12:14:26.597703 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595333 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 12:14:26.598151 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.595339 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 12:14:26.598151 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595458 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 12:14:26.598151 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595464 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 12:14:26.598151 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595469 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 12:14:26.598151 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595473 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 12:14:26.598151 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595476 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 12:14:26.598151 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595479 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 12:14:26.598151 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595482 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 12:14:26.598151 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595485 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 12:14:26.598151 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595488 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 12:14:26.598151 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595491 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 12:14:26.598151 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595494 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 12:14:26.598151 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595497 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 12:14:26.598151 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595499 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 12:14:26.598151 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595502 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 12:14:26.598602 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595505 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 12:14:26.598602 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595507 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 12:14:26.598602 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595510 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 12:14:26.598602 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595513 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 12:14:26.598602 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595515 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 12:14:26.598602 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595518 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 12:14:26.598602 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595520 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 12:14:26.598602 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595523 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 12:14:26.598602 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595525 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 12:14:26.598602 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595528 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 12:14:26.598602 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595530 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 12:14:26.598602 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595533 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 12:14:26.598602 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595535 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 12:14:26.598602 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595538 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 12:14:26.598602 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595540 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 12:14:26.598602 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595543 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 12:14:26.598602 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595545 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 12:14:26.598602 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595548 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 12:14:26.598602 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595551 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 12:14:26.599057 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595600 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 12:14:26.599057 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595654 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 12:14:26.599057 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595659 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 12:14:26.599057 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595663 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 12:14:26.599057 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595668 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 12:14:26.599057 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595673 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 12:14:26.599057 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595679 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 12:14:26.599057 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595683 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 12:14:26.599057 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595688 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 12:14:26.599057 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595693 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 12:14:26.599057 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595698 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 12:14:26.599057 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595703 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 12:14:26.599057 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595708 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 12:14:26.599057 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595712 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 12:14:26.599057 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.595721 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 12:14:26.599057 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.596454 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 12:14:26.599057 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.596460 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 12:14:26.599057 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.596466 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 12:14:26.599057 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.596472 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 12:14:26.599057 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.596478 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 12:14:26.599580 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.596483 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 12:14:26.599580 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.596488 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 12:14:26.599580 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.596494 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 12:14:26.599580 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.596498 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 12:14:26.599580 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.596504 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 12:14:26.599580 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.596508 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 12:14:26.599580 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.596519 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 12:14:26.599580 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.596523 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 12:14:26.599580 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.596527 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 12:14:26.599580 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.596532 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 12:14:26.599580 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.596536 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 12:14:26.599580 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.596540 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 12:14:26.599580 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.596544 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 12:14:26.599580 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.596548 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 12:14:26.599580 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.596552 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 12:14:26.599580 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.596557 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 12:14:26.599580 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.596561 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 12:14:26.599580 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.596565 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 12:14:26.599580 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.596569 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 12:14:26.599580 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.596577 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 12:14:26.600087 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.596581 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 12:14:26.600087 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.596585 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 12:14:26.600087 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.596589 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 12:14:26.600087 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.596594 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 12:14:26.600087 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.596598 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 12:14:26.600087 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.596602 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 12:14:26.600087 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.596606 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 12:14:26.600087 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.596612 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 12:14:26.600087 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.596617 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 12:14:26.600087 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.596622 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 12:14:26.600087 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.596627 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 12:14:26.600087 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.596631 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 12:14:26.600087 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:26.596640 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 12:14:26.600087 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.596649 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 12:14:26.600087 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.598173 2580 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 12:14:26.600482 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.600105 2580 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 12:14:26.601080 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.601067 2580 server.go:1019] "Starting client certificate rotation"
Apr 20 12:14:26.601177 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.601165 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 12:14:26.601211 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.601203 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 12:14:26.626782 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.626756 2580 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 12:14:26.631018 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.630991 2580 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 12:14:26.645266 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.645238 2580 log.go:25] "Validated CRI v1 runtime API"
Apr 20 12:14:26.650747 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.650727 2580 log.go:25] "Validated CRI v1 image API"
Apr 20 12:14:26.652863 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.652841 2580 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 12:14:26.657013 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.656983 2580 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 a94acf25-1993-40d0-a38b-856438dc4a76:/dev/nvme0n1p3 c787c446-48ef-43dd-8ba8-21b444f32a96:/dev/nvme0n1p4]
Apr 20 12:14:26.657122 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.657009 2580 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 12:14:26.657585 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.657554 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 12:14:26.664150 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.664010 2580 manager.go:217] Machine: {Timestamp:2026-04-20 12:14:26.662307251 +0000 UTC m=+0.382321235 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3104692 MemoryCapacity:33164476416 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec241402fc78ca71bdd32d06e83e4886 SystemUUID:ec241402-fc78-ca71-bdd3-2d06e83e4886 BootID:c986758d-2d18-4a72-bad8-740a3665b63a Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582238208 Type:vfs Inodes:4048398 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582238208 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:c7:e0:bf:c9:ef Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:c7:e0:bf:c9:ef Speed:0 Mtu:9001} {Name:ovs-system MacAddress:da:0a:82:87:97:74 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164476416 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 12:14:26.664639 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.664627 2580 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 12:14:26.664744 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.664731 2580 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 12:14:26.665741 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.665705 2580 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 12:14:26.665888 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.665745 2580 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-91.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 12:14:26.665930 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.665898 2580 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 12:14:26.665930 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.665908 2580 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 12:14:26.665930 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.665921 2580 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 12:14:26.666621 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.666609 2580 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 12:14:26.667959 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.667947 2580 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 12:14:26.668074 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.668065 2580 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 20 12:14:26.670652 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.670639 2580 kubelet.go:491] "Attempting to sync node with API server"
Apr 20 12:14:26.670695 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.670657 2580 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 20 12:14:26.670695 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.670674 2580 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 20 12:14:26.670695 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.670684 2580 kubelet.go:397] "Adding apiserver pod source"
Apr 20 12:14:26.670695 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.670694 2580 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 20 12:14:26.671953 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.671925 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 12:14:26.671953 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.671948 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 12:14:26.675377 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.675353 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-ph7ch"
Apr 20 12:14:26.675778 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.675758 2580 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 20 12:14:26.677098 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.677083 2580 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 20 12:14:26.678906 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.678891 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 20 12:14:26.678966 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.678916 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 20 12:14:26.678966 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.678926 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 20 12:14:26.678966 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.678934 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 20 12:14:26.678966 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.678943 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 20 12:14:26.678966 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.678949 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 20 12:14:26.678966 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.678955 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 20 12:14:26.678966 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.678961 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 20 12:14:26.678966 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.678969 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 20 12:14:26.679172 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.678975 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 20 12:14:26.679172 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.678984 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 20 12:14:26.679172 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.678994 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 20 12:14:26.680644 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.680626 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 20 12:14:26.680644 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.680640 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 20 12:14:26.682567 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.682544 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-ph7ch"
Apr 20 12:14:26.683204 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:26.683178 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-137-91.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 20 12:14:26.683249 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:26.683178 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 20 12:14:26.684763 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.684750 2580 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 20 12:14:26.684817 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.684790 2580 server.go:1295] "Started kubelet"
Apr 20 12:14:26.685554 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.685510 2580 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 20 12:14:26.685547 ip-10-0-137-91 systemd[1]: Started Kubernetes Kubelet.
Apr 20 12:14:26.685717 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.685576 2580 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 20 12:14:26.685717 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.685527 2580 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 20 12:14:26.686655 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.686638 2580 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 20 12:14:26.687376 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.687356 2580 server.go:317] "Adding debug handlers to kubelet server"
Apr 20 12:14:26.693051 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.693030 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 20 12:14:26.693560 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.693544 2580 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 20 12:14:26.693654 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:26.693621 2580 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 20 12:14:26.694170 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.694153 2580 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 20 12:14:26.694170 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.694173 2580 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 20 12:14:26.694304 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.694160 2580 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 20 12:14:26.694345 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.694304 2580 reconstruct.go:97] "Volume reconstruction finished"
Apr 20 12:14:26.694345 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.694312 2580 reconciler.go:26] "Reconciler: start to sync state"
Apr 20 12:14:26.694451 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:26.694351 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-91.ec2.internal\" not found"
Apr 20 12:14:26.694531 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.694517 2580 factory.go:153] Registering CRI-O factory
Apr 20 12:14:26.694580 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.694566 2580 factory.go:223] Registration of the crio container factory successfully
Apr 20 12:14:26.694641 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.694631 2580 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 20 12:14:26.694641 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.694641 2580 factory.go:55] Registering systemd factory
Apr 20 12:14:26.694721 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.694648 2580 factory.go:223] Registration of the systemd container factory successfully
Apr 20 12:14:26.694721 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.694663 2580 factory.go:103] Registering Raw factory
Apr 20 12:14:26.694721 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.694674 2580 manager.go:1196] Started watching for new ooms in manager
Apr 20 12:14:26.695808 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.695788 2580 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 12:14:26.695880 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.695850 2580 manager.go:319] Starting recovery of all containers
Apr 20 12:14:26.697039 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.697016 2580 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-137-91.ec2.internal" not found
Apr 20 12:14:26.699228 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:26.699204 2580 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-137-91.ec2.internal\" not found" node="ip-10-0-137-91.ec2.internal"
Apr 20 12:14:26.707466 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.707283 2580 manager.go:324] Recovery completed
Apr 20 12:14:26.708934 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:26.708859 2580 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 20 12:14:26.711844 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.711831 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 12:14:26.712640 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.712625 2580 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-137-91.ec2.internal" not found
Apr 20 12:14:26.714210 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.714193 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-91.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 12:14:26.714265 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.714224 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-91.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 12:14:26.714265 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.714237 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-91.ec2.internal" event="NodeHasSufficientPID"
Apr 20 12:14:26.714815 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.714797 2580 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 20 12:14:26.714815 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.714815 2580 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 20 12:14:26.714948 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.714836 2580 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 12:14:26.717100 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.717085 2580 policy_none.go:49] "None policy: Start"
Apr 20 12:14:26.717171 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.717108 2580 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 20 12:14:26.717171 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.717121 2580 state_mem.go:35] "Initializing new in-memory state store"
Apr 20 12:14:26.764912 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.761121 2580 manager.go:341] "Starting Device Plugin manager"
Apr 20 12:14:26.764912 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:26.761155 2580 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 20 12:14:26.764912 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.761166 2580 server.go:85] "Starting device plugin registration server"
Apr 20 12:14:26.764912 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.761485 2580 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 20 12:14:26.764912 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.761500 2580 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 20 12:14:26.764912 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.761591 2580 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 20 12:14:26.764912 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.761689 2580 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 20 12:14:26.764912 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.761704 2580 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 20 12:14:26.764912 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:26.762343 2580 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 20 12:14:26.764912 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:26.762382 2580 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-91.ec2.internal\" not found"
Apr 20 12:14:26.769213 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.769195 2580 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-137-91.ec2.internal" not found
Apr 20 12:14:26.840657 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.840559 2580 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 20 12:14:26.841779 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.841752 2580 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 20 12:14:26.841779 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.841782 2580 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 20 12:14:26.841938 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.841804 2580 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 20 12:14:26.841938 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.841811 2580 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 20 12:14:26.841938 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:26.841847 2580 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 20 12:14:26.845074 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.845054 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 12:14:26.861929 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.861903 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 12:14:26.862872 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.862854 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-91.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 12:14:26.862943 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.862887 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-91.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 12:14:26.862943 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.862897 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-91.ec2.internal" event="NodeHasSufficientPID"
Apr 20 12:14:26.862943 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.862926 2580 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-91.ec2.internal"
Apr 20 12:14:26.871311 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.871292 2580 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-91.ec2.internal"
Apr 20 12:14:26.871366 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:26.871319 2580 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-91.ec2.internal\": node \"ip-10-0-137-91.ec2.internal\" not found"
Apr 20 12:14:26.888236 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:26.888204 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-91.ec2.internal\" not found"
Apr 20 12:14:26.942404 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.942362 2580 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-91.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-91.ec2.internal"]
Apr 20 12:14:26.942543 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.942467 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 12:14:26.943450 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.943430 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-91.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 12:14:26.943450 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.943460 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-91.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 12:14:26.943584 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.943474 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-91.ec2.internal" event="NodeHasSufficientPID"
Apr 20 12:14:26.944884 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.944869 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 12:14:26.945040 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.945024 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-91.ec2.internal"
Apr 20 12:14:26.945079 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.945060 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 12:14:26.945650 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.945633 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-91.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 12:14:26.945732 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.945664 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-91.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 12:14:26.945732 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.945668 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-91.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 12:14:26.945732 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.945679 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-91.ec2.internal" event="NodeHasSufficientPID"
Apr 20 12:14:26.945732 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.945694 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-91.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 12:14:26.945732 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.945704 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-91.ec2.internal" event="NodeHasSufficientPID"
Apr 20 12:14:26.946823 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.946811 2580 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-91.ec2.internal" Apr 20 12:14:26.946882 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.946835 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 12:14:26.947511 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.947494 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-91.ec2.internal" event="NodeHasSufficientMemory" Apr 20 12:14:26.947594 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.947523 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-91.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 12:14:26.947594 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.947536 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-91.ec2.internal" event="NodeHasSufficientPID" Apr 20 12:14:26.967310 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:26.967287 2580 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-91.ec2.internal\" not found" node="ip-10-0-137-91.ec2.internal" Apr 20 12:14:26.971767 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:26.971751 2580 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-91.ec2.internal\" not found" node="ip-10-0-137-91.ec2.internal" Apr 20 12:14:26.988964 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:26.988940 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-91.ec2.internal\" not found" Apr 20 12:14:26.996328 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.996304 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/98a7b4133bfe59f130f091a3ed1a068f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-91.ec2.internal\" (UID: \"98a7b4133bfe59f130f091a3ed1a068f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-91.ec2.internal" Apr 20 12:14:26.996391 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.996335 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4e4cd406deda512383d472584c7956de-config\") pod \"kube-apiserver-proxy-ip-10-0-137-91.ec2.internal\" (UID: \"4e4cd406deda512383d472584c7956de\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-91.ec2.internal" Apr 20 12:14:26.996446 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:26.996417 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/98a7b4133bfe59f130f091a3ed1a068f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-91.ec2.internal\" (UID: \"98a7b4133bfe59f130f091a3ed1a068f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-91.ec2.internal" Apr 20 12:14:27.089416 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:27.089363 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-91.ec2.internal\" not found" Apr 20 12:14:27.096824 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.096763 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/98a7b4133bfe59f130f091a3ed1a068f-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-137-91.ec2.internal\" (UID: \"98a7b4133bfe59f130f091a3ed1a068f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-91.ec2.internal" Apr 20 12:14:27.096824 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.096722 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/98a7b4133bfe59f130f091a3ed1a068f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-91.ec2.internal\" (UID: \"98a7b4133bfe59f130f091a3ed1a068f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-91.ec2.internal" Apr 20 12:14:27.096824 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.096822 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/98a7b4133bfe59f130f091a3ed1a068f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-91.ec2.internal\" (UID: \"98a7b4133bfe59f130f091a3ed1a068f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-91.ec2.internal" Apr 20 12:14:27.096963 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.096840 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4e4cd406deda512383d472584c7956de-config\") pod \"kube-apiserver-proxy-ip-10-0-137-91.ec2.internal\" (UID: \"4e4cd406deda512383d472584c7956de\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-91.ec2.internal" Apr 20 12:14:27.096963 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.096866 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4e4cd406deda512383d472584c7956de-config\") pod \"kube-apiserver-proxy-ip-10-0-137-91.ec2.internal\" (UID: \"4e4cd406deda512383d472584c7956de\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-91.ec2.internal" Apr 20 12:14:27.096963 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.096889 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/98a7b4133bfe59f130f091a3ed1a068f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-91.ec2.internal\" (UID: \"98a7b4133bfe59f130f091a3ed1a068f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-91.ec2.internal" Apr 20 12:14:27.190228 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:27.190182 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-91.ec2.internal\" not found" Apr 20 12:14:27.270687 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.270652 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-91.ec2.internal" Apr 20 12:14:27.274212 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.274195 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-91.ec2.internal" Apr 20 12:14:27.290958 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:27.290929 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-91.ec2.internal\" not found" Apr 20 12:14:27.391625 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:27.391530 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-91.ec2.internal\" not found" Apr 20 12:14:27.492087 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:27.492048 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-91.ec2.internal\" not found" Apr 20 12:14:27.568387 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.568356 2580 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 12:14:27.594103 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.594078 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-91.ec2.internal" Apr 20 12:14:27.601072 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.601053 2580 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 20 12:14:27.601193 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.601177 2580 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 12:14:27.601228 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.601209 2580 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 12:14:27.601262 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.601231 2580 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 12:14:27.601262 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:27.601238 2580 kubelet.go:3342] "Failed creating a mirror pod" err="Post \"https://abfff2348aca8429495cdf20e3c66e7e-9e72cbad1064ad01.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/openshift-machine-config-operator/pods\": read tcp 10.0.137.91:51098->3.233.12.183:6443: use of closed network connection" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-91.ec2.internal" Apr 20 12:14:27.601323 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.601267 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-91.ec2.internal" Apr 20 12:14:27.615472 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.615446 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 12:14:27.671883 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.671803 2580 apiserver.go:52] "Watching apiserver" Apr 20 12:14:27.682063 ip-10-0-137-91 kubenswrapper[2580]: I0420 
12:14:27.682030 2580 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 12:14:27.682438 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.682414 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-cz8d9","openshift-ovn-kubernetes/ovnkube-node-dwphh","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-686bc","openshift-cluster-node-tuning-operator/tuned-csctt","openshift-dns/node-resolver-zwm76","openshift-image-registry/node-ca-p26xh","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-91.ec2.internal","openshift-multus/multus-26xn7","openshift-multus/multus-additional-cni-plugins-j99mh","openshift-network-operator/iptables-alerter-cf8lw","kube-system/konnectivity-agent-8hww4","kube-system/kube-apiserver-proxy-ip-10-0-137-91.ec2.internal","openshift-multus/network-metrics-daemon-nm452"] Apr 20 12:14:27.683903 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.683881 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cz8d9" Apr 20 12:14:27.684016 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.683897 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 12:09:26 +0000 UTC" deadline="2028-01-04 22:33:09.860997602 +0000 UTC" Apr 20 12:14:27.684016 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.683918 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14986h18m42.177082122s" Apr 20 12:14:27.684016 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:27.683971 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cz8d9" podUID="deac4f19-5105-40df-bd7a-9d7c576cd705" Apr 20 12:14:27.685163 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.685143 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-j99mh" Apr 20 12:14:27.686286 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.686264 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-686bc" Apr 20 12:14:27.687415 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.687381 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 12:14:27.687521 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.687416 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 12:14:27.687521 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.687503 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-csctt" Apr 20 12:14:27.687521 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.687514 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-nngnt\"" Apr 20 12:14:27.687671 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.687381 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 12:14:27.687671 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.687611 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 12:14:27.687776 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.687391 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 12:14:27.688114 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.688094 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-sxnmk\"" Apr 20 12:14:27.688318 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.688304 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 12:14:27.688637 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.688622 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 12:14:27.688788 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.688770 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-p26xh" Apr 20 12:14:27.688924 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.688904 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 12:14:27.689552 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.689534 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-qsd9s\"" Apr 20 12:14:27.689644 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.689571 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 12:14:27.689644 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.689628 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 12:14:27.690079 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.690064 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-zwm76" Apr 20 12:14:27.690596 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.690581 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-rh5tl\"" Apr 20 12:14:27.690677 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.690604 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 12:14:27.690870 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.690855 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 12:14:27.690944 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.690929 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 12:14:27.691297 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.691283 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.691965 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.691950 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 12:14:27.692039 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.692025 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 12:14:27.692080 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.692035 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-w7txh\"" Apr 20 12:14:27.693065 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.693045 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.693150 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.693089 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 20 12:14:27.693202 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.693154 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 12:14:27.693260 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.693217 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-gqd4g\"" Apr 20 12:14:27.694468 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.694452 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-cf8lw" Apr 20 12:14:27.695091 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.695070 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 12:14:27.695173 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.695110 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 12:14:27.696073 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.696054 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-8hww4" Apr 20 12:14:27.696184 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.696124 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 12:14:27.696184 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.696153 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 12:14:27.696348 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.696205 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-rjm22\"" Apr 20 12:14:27.696348 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.696124 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 12:14:27.696523 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.696379 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 12:14:27.696523 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.696455 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 12:14:27.696803 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.696785 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-9mpdr\"" Apr 20 12:14:27.697074 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.697056 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 12:14:27.697173 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.697159 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 12:14:27.697568 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.697551 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nm452" Apr 20 12:14:27.697653 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:27.697604 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nm452" podUID="430e704c-5d70-4df6-baaa-2296216f1239" Apr 20 12:14:27.698126 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.698099 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-zpmrc\"" Apr 20 12:14:27.698296 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.698278 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 12:14:27.698296 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.698291 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 12:14:27.700486 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.700210 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-log-socket\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.700486 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.700260 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2da468d5-2794-41ba-8344-4246be8732d7-ovn-node-metrics-cert\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.700486 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.700290 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/261e5e12-8ebc-4a49-9b69-be511d818e12-host-slash\") pod \"iptables-alerter-cf8lw\" (UID: \"261e5e12-8ebc-4a49-9b69-be511d818e12\") " pod="openshift-network-operator/iptables-alerter-cf8lw" Apr 20 12:14:27.700486 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.700324 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f8dc897f-a714-4520-8393-707949cd3be7-etc-modprobe-d\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt" Apr 20 12:14:27.700486 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.700387 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2d6874b2-c453-4084-b388-f029a2a9cb5f-socket-dir\") pod \"aws-ebs-csi-driver-node-686bc\" (UID: \"2d6874b2-c453-4084-b388-f029a2a9cb5f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-686bc" Apr 20 12:14:27.700486 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.700448 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0279eea6-b7aa-4e72-bec0-5aa87266cc8b-os-release\") pod \"multus-additional-cni-plugins-j99mh\" (UID: \"0279eea6-b7aa-4e72-bec0-5aa87266cc8b\") " pod="openshift-multus/multus-additional-cni-plugins-j99mh" Apr 20 12:14:27.700848 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.700497 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-host-var-lib-cni-bin\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.700848 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.700576 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-host-cni-netd\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.700848 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.700621 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-host-kubelet\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.700848 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.700731 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-run-openvswitch\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.700848 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.700767 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-run-ovn\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.700848 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.700835 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0279eea6-b7aa-4e72-bec0-5aa87266cc8b-system-cni-dir\") pod \"multus-additional-cni-plugins-j99mh\" (UID: \"0279eea6-b7aa-4e72-bec0-5aa87266cc8b\") " pod="openshift-multus/multus-additional-cni-plugins-j99mh" Apr 20 12:14:27.701113 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.700865 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f8dc897f-a714-4520-8393-707949cd3be7-sys\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt" Apr 20 12:14:27.701113 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.700919 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1ab58f2d-5007-4797-8a83-489889f35e06-serviceca\") pod \"node-ca-p26xh\" (UID: \"1ab58f2d-5007-4797-8a83-489889f35e06\") " pod="openshift-image-registry/node-ca-p26xh" Apr 20 12:14:27.701113 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.700950 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-node-log\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.701248 ip-10-0-137-91 
kubenswrapper[2580]: I0420 12:14:27.701195 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2d6874b2-c453-4084-b388-f029a2a9cb5f-device-dir\") pod \"aws-ebs-csi-driver-node-686bc\" (UID: \"2d6874b2-c453-4084-b388-f029a2a9cb5f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-686bc" Apr 20 12:14:27.701298 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.701230 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0279eea6-b7aa-4e72-bec0-5aa87266cc8b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-j99mh\" (UID: \"0279eea6-b7aa-4e72-bec0-5aa87266cc8b\") " pod="openshift-multus/multus-additional-cni-plugins-j99mh" Apr 20 12:14:27.701298 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.701277 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f8dc897f-a714-4520-8393-707949cd3be7-etc-sysconfig\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt" Apr 20 12:14:27.701387 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.701305 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f8dc897f-a714-4520-8393-707949cd3be7-host\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt" Apr 20 12:14:27.701387 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.701334 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-etc-kubernetes\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.701387 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.701359 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2da468d5-2794-41ba-8344-4246be8732d7-ovnkube-config\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.701551 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.701386 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f8dc897f-a714-4520-8393-707949cd3be7-run\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt" Apr 20 12:14:27.701551 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.701435 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-cnibin\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.701634 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.701574 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-host-var-lib-cni-multus\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.701727 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.701707 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl8mb\" (UniqueName: \"kubernetes.io/projected/0279eea6-b7aa-4e72-bec0-5aa87266cc8b-kube-api-access-xl8mb\") pod \"multus-additional-cni-plugins-j99mh\" (UID: \"0279eea6-b7aa-4e72-bec0-5aa87266cc8b\") " pod="openshift-multus/multus-additional-cni-plugins-j99mh" Apr 20 12:14:27.701859 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.701822 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f8dc897f-a714-4520-8393-707949cd3be7-etc-tuned\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt" Apr 20 12:14:27.702015 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.701992 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z884s\" (UniqueName: \"kubernetes.io/projected/f8dc897f-a714-4520-8393-707949cd3be7-kube-api-access-z884s\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt" Apr 20 12:14:27.702118 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.702043 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e3f51f42-77bf-412b-970c-03006a2ef077-cni-binary-copy\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.702118 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.702084 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-host-run-netns\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.702216 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.702163 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-run-systemd\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.702267 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.702204 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.702329 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.702295 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0279eea6-b7aa-4e72-bec0-5aa87266cc8b-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-j99mh\" (UID: \"0279eea6-b7aa-4e72-bec0-5aa87266cc8b\") " pod="openshift-multus/multus-additional-cni-plugins-j99mh" Apr 20 12:14:27.702414 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.702343 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f8dc897f-a714-4520-8393-707949cd3be7-etc-sysctl-conf\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt" Apr 20 12:14:27.702414 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.702372 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-host-run-ovn-kubernetes\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.702521 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.702415 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2da468d5-2794-41ba-8344-4246be8732d7-ovnkube-script-lib\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.702521 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.702440 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjqlk\" (UniqueName: \"kubernetes.io/projected/deac4f19-5105-40df-bd7a-9d7c576cd705-kube-api-access-qjqlk\") pod \"network-check-target-cz8d9\" (UID: \"deac4f19-5105-40df-bd7a-9d7c576cd705\") " pod="openshift-network-diagnostics/network-check-target-cz8d9" Apr 20 12:14:27.702521 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.702492 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0279eea6-b7aa-4e72-bec0-5aa87266cc8b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-j99mh\" (UID: \"0279eea6-b7aa-4e72-bec0-5aa87266cc8b\") " pod="openshift-multus/multus-additional-cni-plugins-j99mh" Apr 20 12:14:27.702655 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.702538 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/261e5e12-8ebc-4a49-9b69-be511d818e12-iptables-alerter-script\") pod \"iptables-alerter-cf8lw\" (UID: \"261e5e12-8ebc-4a49-9b69-be511d818e12\") " pod="openshift-network-operator/iptables-alerter-cf8lw" Apr 20 12:14:27.702655 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.702571 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-os-release\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.702655 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.702613 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-multus-conf-dir\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " 
pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.702655 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.702647 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e3f51f42-77bf-412b-970c-03006a2ef077-multus-daemon-config\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.702825 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.702671 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-systemd-units\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.702825 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.702705 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2da468d5-2794-41ba-8344-4246be8732d7-env-overrides\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.702825 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.702727 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c4pw\" (UniqueName: \"kubernetes.io/projected/261e5e12-8ebc-4a49-9b69-be511d818e12-kube-api-access-7c4pw\") pod \"iptables-alerter-cf8lw\" (UID: \"261e5e12-8ebc-4a49-9b69-be511d818e12\") " pod="openshift-network-operator/iptables-alerter-cf8lw" Apr 20 12:14:27.702825 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.702752 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2d6874b2-c453-4084-b388-f029a2a9cb5f-sys-fs\") pod \"aws-ebs-csi-driver-node-686bc\" (UID: \"2d6874b2-c453-4084-b388-f029a2a9cb5f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-686bc" Apr 20 12:14:27.702825 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.702770 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0279eea6-b7aa-4e72-bec0-5aa87266cc8b-cnibin\") pod \"multus-additional-cni-plugins-j99mh\" (UID: \"0279eea6-b7aa-4e72-bec0-5aa87266cc8b\") " pod="openshift-multus/multus-additional-cni-plugins-j99mh" Apr 20 12:14:27.702825 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.702815 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-system-cni-dir\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.703099 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.702856 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-var-lib-openvswitch\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.703099 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.702900 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-etc-openvswitch\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.703099 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.702972 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6zjj\" (UniqueName: \"kubernetes.io/projected/2da468d5-2794-41ba-8344-4246be8732d7-kube-api-access-m6zjj\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.703332 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.703313 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88r9l\" (UniqueName: \"kubernetes.io/projected/2d6874b2-c453-4084-b388-f029a2a9cb5f-kube-api-access-88r9l\") pod \"aws-ebs-csi-driver-node-686bc\" (UID: \"2d6874b2-c453-4084-b388-f029a2a9cb5f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-686bc" Apr 20 12:14:27.703385 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.703345 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f8dc897f-a714-4520-8393-707949cd3be7-etc-systemd\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt" Apr 20 12:14:27.703385 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.703369 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4htd\" (UniqueName: \"kubernetes.io/projected/e3f51f42-77bf-412b-970c-03006a2ef077-kube-api-access-j4htd\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.703515 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.703415 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-host-cni-bin\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.703515 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.703432 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f8dc897f-a714-4520-8393-707949cd3be7-etc-sysctl-d\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt" Apr 20 12:14:27.703515 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.703446 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w6lp\" (UniqueName: \"kubernetes.io/projected/900f999a-6b3b-4648-b015-7ca045ba8dcd-kube-api-access-5w6lp\") pod \"node-resolver-zwm76\" (UID: \"900f999a-6b3b-4648-b015-7ca045ba8dcd\") " pod="openshift-dns/node-resolver-zwm76" Apr 20 12:14:27.703515 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.703461 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/900f999a-6b3b-4648-b015-7ca045ba8dcd-hosts-file\") pod \"node-resolver-zwm76\" (UID: \"900f999a-6b3b-4648-b015-7ca045ba8dcd\") " pod="openshift-dns/node-resolver-zwm76" Apr 20 12:14:27.703515 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.703484 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0279eea6-b7aa-4e72-bec0-5aa87266cc8b-cni-binary-copy\") pod \"multus-additional-cni-plugins-j99mh\" (UID: \"0279eea6-b7aa-4e72-bec0-5aa87266cc8b\") " pod="openshift-multus/multus-additional-cni-plugins-j99mh" Apr 20 12:14:27.703734 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.703520 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-multus-cni-dir\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.703734 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.703543 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-multus-socket-dir-parent\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.703734 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.703562 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-host-run-multus-certs\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.703734 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.703607 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-host-run-netns\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.703734 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.703631 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2d6874b2-c453-4084-b388-f029a2a9cb5f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-686bc\" (UID: \"2d6874b2-c453-4084-b388-f029a2a9cb5f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-686bc" Apr 20 12:14:27.703734 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.703674 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/900f999a-6b3b-4648-b015-7ca045ba8dcd-tmp-dir\") pod \"node-resolver-zwm76\" (UID: \"900f999a-6b3b-4648-b015-7ca045ba8dcd\") " pod="openshift-dns/node-resolver-zwm76" Apr 20 12:14:27.703734 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.703704 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f8dc897f-a714-4520-8393-707949cd3be7-lib-modules\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt" Apr 20 
12:14:27.703734 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.703726 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f8dc897f-a714-4520-8393-707949cd3be7-var-lib-kubelet\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt" Apr 20 12:14:27.704112 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.703745 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1ab58f2d-5007-4797-8a83-489889f35e06-host\") pod \"node-ca-p26xh\" (UID: \"1ab58f2d-5007-4797-8a83-489889f35e06\") " pod="openshift-image-registry/node-ca-p26xh" Apr 20 12:14:27.704112 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.703767 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjqsn\" (UniqueName: \"kubernetes.io/projected/1ab58f2d-5007-4797-8a83-489889f35e06-kube-api-access-bjqsn\") pod \"node-ca-p26xh\" (UID: \"1ab58f2d-5007-4797-8a83-489889f35e06\") " pod="openshift-image-registry/node-ca-p26xh" Apr 20 12:14:27.704112 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.703787 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-host-var-lib-kubelet\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.704112 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.703821 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-hostroot\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.704112 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.703839 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/2d6874b2-c453-4084-b388-f029a2a9cb5f-etc-selinux\") pod \"aws-ebs-csi-driver-node-686bc\" (UID: \"2d6874b2-c453-4084-b388-f029a2a9cb5f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-686bc" Apr 20 12:14:27.704112 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.703869 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-host-slash\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.704112 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.703895 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2d6874b2-c453-4084-b388-f029a2a9cb5f-registration-dir\") pod \"aws-ebs-csi-driver-node-686bc\" (UID: \"2d6874b2-c453-4084-b388-f029a2a9cb5f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-686bc" Apr 20 12:14:27.704112 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.703910 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/f8dc897f-a714-4520-8393-707949cd3be7-etc-kubernetes\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt" Apr 20 12:14:27.704112 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.703923 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f8dc897f-a714-4520-8393-707949cd3be7-tmp\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt" Apr 20 12:14:27.704112 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.703937 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-host-run-k8s-cni-cncf-io\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.704610 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.704246 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 12:14:27.726603 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.726578 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-hnqpv" Apr 20 12:14:27.732186 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.732156 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-hnqpv" Apr 20 12:14:27.795030 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.795003 2580 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 12:14:27.804086 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.804055 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjqlk\" (UniqueName: \"kubernetes.io/projected/deac4f19-5105-40df-bd7a-9d7c576cd705-kube-api-access-qjqlk\") pod \"network-check-target-cz8d9\" (UID: \"deac4f19-5105-40df-bd7a-9d7c576cd705\") " pod="openshift-network-diagnostics/network-check-target-cz8d9" Apr 20 12:14:27.804086 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.804087 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0279eea6-b7aa-4e72-bec0-5aa87266cc8b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-j99mh\" (UID: \"0279eea6-b7aa-4e72-bec0-5aa87266cc8b\") " pod="openshift-multus/multus-additional-cni-plugins-j99mh" Apr 20 12:14:27.804279 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.804109 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/261e5e12-8ebc-4a49-9b69-be511d818e12-iptables-alerter-script\") pod \"iptables-alerter-cf8lw\" (UID: \"261e5e12-8ebc-4a49-9b69-be511d818e12\") " pod="openshift-network-operator/iptables-alerter-cf8lw" Apr 20 12:14:27.804279 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.804137 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/021e270f-fe7a-402c-a482-41d496fec5fb-agent-certs\") pod \"konnectivity-agent-8hww4\" (UID: \"021e270f-fe7a-402c-a482-41d496fec5fb\") " 
pod="kube-system/konnectivity-agent-8hww4" Apr 20 12:14:27.804279 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.804165 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-os-release\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.804279 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.804191 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-multus-conf-dir\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.804279 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.804215 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e3f51f42-77bf-412b-970c-03006a2ef077-multus-daemon-config\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.804279 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.804264 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-systemd-units\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.804576 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.804296 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0279eea6-b7aa-4e72-bec0-5aa87266cc8b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-j99mh\" (UID: \"0279eea6-b7aa-4e72-bec0-5aa87266cc8b\") " pod="openshift-multus/multus-additional-cni-plugins-j99mh" Apr 20 12:14:27.804576 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.804306 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2da468d5-2794-41ba-8344-4246be8732d7-env-overrides\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.804576 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.804332 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7c4pw\" (UniqueName: \"kubernetes.io/projected/261e5e12-8ebc-4a49-9b69-be511d818e12-kube-api-access-7c4pw\") pod \"iptables-alerter-cf8lw\" (UID: \"261e5e12-8ebc-4a49-9b69-be511d818e12\") " pod="openshift-network-operator/iptables-alerter-cf8lw" Apr 20 12:14:27.804576 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.804360 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2d6874b2-c453-4084-b388-f029a2a9cb5f-sys-fs\") pod \"aws-ebs-csi-driver-node-686bc\" (UID: \"2d6874b2-c453-4084-b388-f029a2a9cb5f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-686bc" Apr 20 12:14:27.804576 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.804359 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-systemd-units\") pod \"ovnkube-node-dwphh\" (UID: 
\"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.804576 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.804380 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-os-release\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.804576 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.804451 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2d6874b2-c453-4084-b388-f029a2a9cb5f-sys-fs\") pod \"aws-ebs-csi-driver-node-686bc\" (UID: \"2d6874b2-c453-4084-b388-f029a2a9cb5f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-686bc" Apr 20 12:14:27.804576 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.804460 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-multus-conf-dir\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.804576 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.804484 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0279eea6-b7aa-4e72-bec0-5aa87266cc8b-cnibin\") pod \"multus-additional-cni-plugins-j99mh\" (UID: \"0279eea6-b7aa-4e72-bec0-5aa87266cc8b\") " pod="openshift-multus/multus-additional-cni-plugins-j99mh" Apr 20 12:14:27.804576 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.804511 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-system-cni-dir\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.804576 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.804539 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-var-lib-openvswitch\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.804576 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.804565 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-etc-openvswitch\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.805097 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.804593 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6zjj\" (UniqueName: \"kubernetes.io/projected/2da468d5-2794-41ba-8344-4246be8732d7-kube-api-access-m6zjj\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.805097 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.804610 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-system-cni-dir\") pod \"multus-26xn7\" 
(UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.805097 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.804621 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-88r9l\" (UniqueName: \"kubernetes.io/projected/2d6874b2-c453-4084-b388-f029a2a9cb5f-kube-api-access-88r9l\") pod \"aws-ebs-csi-driver-node-686bc\" (UID: \"2d6874b2-c453-4084-b388-f029a2a9cb5f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-686bc" Apr 20 12:14:27.805097 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.804610 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0279eea6-b7aa-4e72-bec0-5aa87266cc8b-cnibin\") pod \"multus-additional-cni-plugins-j99mh\" (UID: \"0279eea6-b7aa-4e72-bec0-5aa87266cc8b\") " pod="openshift-multus/multus-additional-cni-plugins-j99mh" Apr 20 12:14:27.805097 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.804652 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f8dc897f-a714-4520-8393-707949cd3be7-etc-systemd\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt" Apr 20 12:14:27.805097 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.804661 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-etc-openvswitch\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.805097 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.804732 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f8dc897f-a714-4520-8393-707949cd3be7-etc-systemd\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt" Apr 20 12:14:27.805097 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.804717 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4htd\" (UniqueName: \"kubernetes.io/projected/e3f51f42-77bf-412b-970c-03006a2ef077-kube-api-access-j4htd\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.805097 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.804792 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/261e5e12-8ebc-4a49-9b69-be511d818e12-iptables-alerter-script\") pod \"iptables-alerter-cf8lw\" (UID: \"261e5e12-8ebc-4a49-9b69-be511d818e12\") " pod="openshift-network-operator/iptables-alerter-cf8lw" Apr 20 12:14:27.805097 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.804833 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e3f51f42-77bf-412b-970c-03006a2ef077-multus-daemon-config\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.805097 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.804812 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-var-lib-openvswitch\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.805097 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.804849 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2da468d5-2794-41ba-8344-4246be8732d7-env-overrides\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.805097 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.804886 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-host-cni-bin\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.805097 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.805030 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-host-cni-bin\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.805097 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.805063 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f8dc897f-a714-4520-8393-707949cd3be7-etc-sysctl-d\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt" Apr 20 12:14:27.805907 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.805138 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5w6lp\" (UniqueName: \"kubernetes.io/projected/900f999a-6b3b-4648-b015-7ca045ba8dcd-kube-api-access-5w6lp\") pod \"node-resolver-zwm76\" (UID: \"900f999a-6b3b-4648-b015-7ca045ba8dcd\") " pod="openshift-dns/node-resolver-zwm76" Apr 20 12:14:27.805907 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.805166 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/900f999a-6b3b-4648-b015-7ca045ba8dcd-hosts-file\") pod \"node-resolver-zwm76\" (UID: \"900f999a-6b3b-4648-b015-7ca045ba8dcd\") " pod="openshift-dns/node-resolver-zwm76" Apr 20 12:14:27.805907 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.805202 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f8dc897f-a714-4520-8393-707949cd3be7-etc-sysctl-d\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt" Apr 20 12:14:27.805907 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.805210 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0279eea6-b7aa-4e72-bec0-5aa87266cc8b-cni-binary-copy\") pod \"multus-additional-cni-plugins-j99mh\" (UID: \"0279eea6-b7aa-4e72-bec0-5aa87266cc8b\") " pod="openshift-multus/multus-additional-cni-plugins-j99mh" Apr 20 12:14:27.805907 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.805256 2580 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-multus-cni-dir\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.805907 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.805274 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/900f999a-6b3b-4648-b015-7ca045ba8dcd-hosts-file\") pod \"node-resolver-zwm76\" (UID: \"900f999a-6b3b-4648-b015-7ca045ba8dcd\") " pod="openshift-dns/node-resolver-zwm76" Apr 20 12:14:27.805907 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.805286 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-multus-socket-dir-parent\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.805907 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.805338 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-multus-socket-dir-parent\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.805907 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.805337 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-host-run-multus-certs\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.805907 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.805367 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-host-run-netns\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.805907 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.805382 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-multus-cni-dir\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.805907 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.805391 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2d6874b2-c453-4084-b388-f029a2a9cb5f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-686bc\" (UID: \"2d6874b2-c453-4084-b388-f029a2a9cb5f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-686bc" Apr 20 12:14:27.805907 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.805477 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-host-run-netns\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.805907 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.805440 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-host-run-multus-certs\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.805907 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.805444 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2d6874b2-c453-4084-b388-f029a2a9cb5f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-686bc\" (UID: \"2d6874b2-c453-4084-b388-f029a2a9cb5f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-686bc" Apr 20 12:14:27.805907 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.805570 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/900f999a-6b3b-4648-b015-7ca045ba8dcd-tmp-dir\") pod \"node-resolver-zwm76\" (UID: \"900f999a-6b3b-4648-b015-7ca045ba8dcd\") " pod="openshift-dns/node-resolver-zwm76" Apr 20 12:14:27.805907 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.805613 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f8dc897f-a714-4520-8393-707949cd3be7-lib-modules\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt" Apr 20 12:14:27.805907 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.805639 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f8dc897f-a714-4520-8393-707949cd3be7-var-lib-kubelet\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt" Apr 20 12:14:27.806725 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.805666 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1ab58f2d-5007-4797-8a83-489889f35e06-host\") pod \"node-ca-p26xh\" (UID: \"1ab58f2d-5007-4797-8a83-489889f35e06\") " pod="openshift-image-registry/node-ca-p26xh" Apr 20 12:14:27.806725 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.805735 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f8dc897f-a714-4520-8393-707949cd3be7-var-lib-kubelet\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt" Apr 20 12:14:27.806725 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.805757 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0279eea6-b7aa-4e72-bec0-5aa87266cc8b-cni-binary-copy\") pod \"multus-additional-cni-plugins-j99mh\" (UID: \"0279eea6-b7aa-4e72-bec0-5aa87266cc8b\") " pod="openshift-multus/multus-additional-cni-plugins-j99mh" Apr 20 12:14:27.806725 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.805768 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjqsn\" (UniqueName: \"kubernetes.io/projected/1ab58f2d-5007-4797-8a83-489889f35e06-kube-api-access-bjqsn\") pod \"node-ca-p26xh\" (UID: \"1ab58f2d-5007-4797-8a83-489889f35e06\") " pod="openshift-image-registry/node-ca-p26xh" Apr 20 12:14:27.806725 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.805792 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" 
(UniqueName: \"kubernetes.io/host-path/f8dc897f-a714-4520-8393-707949cd3be7-lib-modules\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt" Apr 20 12:14:27.806725 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.805827 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-host-var-lib-kubelet\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.806725 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.805840 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1ab58f2d-5007-4797-8a83-489889f35e06-host\") pod \"node-ca-p26xh\" (UID: \"1ab58f2d-5007-4797-8a83-489889f35e06\") " pod="openshift-image-registry/node-ca-p26xh" Apr 20 12:14:27.806725 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.805858 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-hostroot\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.806725 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.805865 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/900f999a-6b3b-4648-b015-7ca045ba8dcd-tmp-dir\") pod \"node-resolver-zwm76\" (UID: \"900f999a-6b3b-4648-b015-7ca045ba8dcd\") " pod="openshift-dns/node-resolver-zwm76" Apr 20 12:14:27.806725 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.805882 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-host-var-lib-kubelet\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.806725 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.805884 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/2d6874b2-c453-4084-b388-f029a2a9cb5f-etc-selinux\") pod \"aws-ebs-csi-driver-node-686bc\" (UID: \"2d6874b2-c453-4084-b388-f029a2a9cb5f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-686bc" Apr 20 12:14:27.806725 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.805925 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-hostroot\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.806725 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.805921 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-host-slash\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.806725 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.805958 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/2d6874b2-c453-4084-b388-f029a2a9cb5f-registration-dir\") pod \"aws-ebs-csi-driver-node-686bc\" (UID: \"2d6874b2-c453-4084-b388-f029a2a9cb5f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-686bc" Apr 20 12:14:27.806725 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806007 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/2d6874b2-c453-4084-b388-f029a2a9cb5f-etc-selinux\") pod \"aws-ebs-csi-driver-node-686bc\" (UID: \"2d6874b2-c453-4084-b388-f029a2a9cb5f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-686bc" Apr 20 12:14:27.806725 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806010 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f8dc897f-a714-4520-8393-707949cd3be7-etc-kubernetes\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt" Apr 20 12:14:27.806725 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806012 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-host-slash\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.806725 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806051 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2d6874b2-c453-4084-b388-f029a2a9cb5f-registration-dir\") pod \"aws-ebs-csi-driver-node-686bc\" (UID: \"2d6874b2-c453-4084-b388-f029a2a9cb5f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-686bc" Apr 20 12:14:27.807504 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806064 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f8dc897f-a714-4520-8393-707949cd3be7-tmp\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt" Apr 20 12:14:27.807504 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806090 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-host-run-k8s-cni-cncf-io\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.807504 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806115 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f8dc897f-a714-4520-8393-707949cd3be7-etc-kubernetes\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt" Apr 20 12:14:27.807504 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806128 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-log-socket\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.807504 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806152 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2da468d5-2794-41ba-8344-4246be8732d7-ovn-node-metrics-cert\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.807504 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806193 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-log-socket\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.807504 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806188 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-host-run-k8s-cni-cncf-io\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.807504 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806283 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/261e5e12-8ebc-4a49-9b69-be511d818e12-host-slash\") pod \"iptables-alerter-cf8lw\" (UID: \"261e5e12-8ebc-4a49-9b69-be511d818e12\") " pod="openshift-network-operator/iptables-alerter-cf8lw" Apr 20 12:14:27.807504 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806326 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f8dc897f-a714-4520-8393-707949cd3be7-etc-modprobe-d\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt" Apr 20 12:14:27.807504 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806330 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/261e5e12-8ebc-4a49-9b69-be511d818e12-host-slash\") pod \"iptables-alerter-cf8lw\" (UID: \"261e5e12-8ebc-4a49-9b69-be511d818e12\") " pod="openshift-network-operator/iptables-alerter-cf8lw" Apr 20 12:14:27.807504 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806356 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2d6874b2-c453-4084-b388-f029a2a9cb5f-socket-dir\") pod \"aws-ebs-csi-driver-node-686bc\" (UID: \"2d6874b2-c453-4084-b388-f029a2a9cb5f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-686bc" Apr 20 12:14:27.807504 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806358 2580 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 12:14:27.807504 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806386 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0279eea6-b7aa-4e72-bec0-5aa87266cc8b-os-release\") pod \"multus-additional-cni-plugins-j99mh\" (UID: \"0279eea6-b7aa-4e72-bec0-5aa87266cc8b\") " pod="openshift-multus/multus-additional-cni-plugins-j99mh" Apr 20 12:14:27.807504 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806426 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-host-var-lib-cni-bin\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.807504 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806446 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f8dc897f-a714-4520-8393-707949cd3be7-etc-modprobe-d\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt" Apr 20 12:14:27.807504 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806462 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-host-cni-netd\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.807504 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806489 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-host-kubelet\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.807504 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806503 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2d6874b2-c453-4084-b388-f029a2a9cb5f-socket-dir\") pod \"aws-ebs-csi-driver-node-686bc\" (UID: \"2d6874b2-c453-4084-b388-f029a2a9cb5f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-686bc" Apr 20 12:14:27.808305 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806513 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-run-openvswitch\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.808305 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806507 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0279eea6-b7aa-4e72-bec0-5aa87266cc8b-os-release\") pod \"multus-additional-cni-plugins-j99mh\" (UID: \"0279eea6-b7aa-4e72-bec0-5aa87266cc8b\") " pod="openshift-multus/multus-additional-cni-plugins-j99mh" Apr 20 12:14:27.808305 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806542 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-host-cni-netd\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.808305 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806542 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-run-ovn\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.808305 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806566 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-host-var-lib-cni-bin\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.808305 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806580 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-run-ovn\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.808305 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806602 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-host-kubelet\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.808305 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806622 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-run-openvswitch\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.808305 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806637 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0279eea6-b7aa-4e72-bec0-5aa87266cc8b-system-cni-dir\") pod \"multus-additional-cni-plugins-j99mh\" (UID: \"0279eea6-b7aa-4e72-bec0-5aa87266cc8b\") " pod="openshift-multus/multus-additional-cni-plugins-j99mh" Apr 20 12:14:27.808305 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806663 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f8dc897f-a714-4520-8393-707949cd3be7-sys\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt" Apr 20 12:14:27.808305 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806685 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1ab58f2d-5007-4797-8a83-489889f35e06-serviceca\") pod \"node-ca-p26xh\" (UID: \"1ab58f2d-5007-4797-8a83-489889f35e06\") " pod="openshift-image-registry/node-ca-p26xh" Apr 20 12:14:27.808305 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806709 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/0279eea6-b7aa-4e72-bec0-5aa87266cc8b-system-cni-dir\") pod \"multus-additional-cni-plugins-j99mh\" (UID: \"0279eea6-b7aa-4e72-bec0-5aa87266cc8b\") " pod="openshift-multus/multus-additional-cni-plugins-j99mh" Apr 20 12:14:27.808305 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806730 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f8dc897f-a714-4520-8393-707949cd3be7-sys\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt" Apr 20 12:14:27.808305 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806761 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-node-log\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.808305 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806785 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2d6874b2-c453-4084-b388-f029a2a9cb5f-device-dir\") pod \"aws-ebs-csi-driver-node-686bc\" (UID: \"2d6874b2-c453-4084-b388-f029a2a9cb5f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-686bc" Apr 20 12:14:27.808305 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806812 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0279eea6-b7aa-4e72-bec0-5aa87266cc8b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-j99mh\" (UID: \"0279eea6-b7aa-4e72-bec0-5aa87266cc8b\") " pod="openshift-multus/multus-additional-cni-plugins-j99mh" Apr 20 12:14:27.808305 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806831 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-node-log\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.809124 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806843 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f8dc897f-a714-4520-8393-707949cd3be7-etc-sysconfig\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt" Apr 20 12:14:27.809124 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806889 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2d6874b2-c453-4084-b388-f029a2a9cb5f-device-dir\") pod \"aws-ebs-csi-driver-node-686bc\" (UID: \"2d6874b2-c453-4084-b388-f029a2a9cb5f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-686bc" Apr 20 12:14:27.809124 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806868 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f8dc897f-a714-4520-8393-707949cd3be7-host\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt" Apr 20 12:14:27.809124 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806924 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-etc-kubernetes\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.809124 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806934 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f8dc897f-a714-4520-8393-707949cd3be7-host\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt" Apr 20 12:14:27.809124 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806947 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2da468d5-2794-41ba-8344-4246be8732d7-ovnkube-config\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.809124 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.806985 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2cpf\" (UniqueName: \"kubernetes.io/projected/430e704c-5d70-4df6-baaa-2296216f1239-kube-api-access-p2cpf\") pod \"network-metrics-daemon-nm452\" (UID: \"430e704c-5d70-4df6-baaa-2296216f1239\") " pod="openshift-multus/network-metrics-daemon-nm452" Apr 20 12:14:27.809124 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.807016 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f8dc897f-a714-4520-8393-707949cd3be7-run\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt" Apr 20 12:14:27.809124 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.807075 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1ab58f2d-5007-4797-8a83-489889f35e06-serviceca\") pod \"node-ca-p26xh\" (UID: \"1ab58f2d-5007-4797-8a83-489889f35e06\") " pod="openshift-image-registry/node-ca-p26xh" Apr 20 12:14:27.809124 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.807073 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-cnibin\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.809124 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.807128 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-host-var-lib-cni-multus\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.809124 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.807137 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f8dc897f-a714-4520-8393-707949cd3be7-etc-sysconfig\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt" Apr 20 12:14:27.809124 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.807156 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/430e704c-5d70-4df6-baaa-2296216f1239-metrics-certs\") pod \"network-metrics-daemon-nm452\" (UID: \"430e704c-5d70-4df6-baaa-2296216f1239\") " pod="openshift-multus/network-metrics-daemon-nm452" Apr 20 12:14:27.809124 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.807218 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f8dc897f-a714-4520-8393-707949cd3be7-run\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt" Apr 20 12:14:27.809124 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.807305 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-cnibin\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.809124 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.807342 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-host-var-lib-cni-multus\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.809124 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.807375 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3f51f42-77bf-412b-970c-03006a2ef077-etc-kubernetes\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7" Apr 20 12:14:27.809124 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.807374 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2da468d5-2794-41ba-8344-4246be8732d7-ovnkube-config\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:27.809910 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.807423 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xl8mb\" (UniqueName: \"kubernetes.io/projected/0279eea6-b7aa-4e72-bec0-5aa87266cc8b-kube-api-access-xl8mb\") pod \"multus-additional-cni-plugins-j99mh\" (UID: \"0279eea6-b7aa-4e72-bec0-5aa87266cc8b\") " pod="openshift-multus/multus-additional-cni-plugins-j99mh" Apr 20 12:14:27.809910 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.807512 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f8dc897f-a714-4520-8393-707949cd3be7-etc-tuned\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt" Apr 20 12:14:27.809910 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.807541 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z884s\" (UniqueName: \"kubernetes.io/projected/f8dc897f-a714-4520-8393-707949cd3be7-kube-api-access-z884s\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt" Apr 20 12:14:27.809910 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.807567 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e3f51f42-77bf-412b-970c-03006a2ef077-cni-binary-copy\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7"
Apr 20 12:14:27.809910 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.807651 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-host-run-netns\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh"
Apr 20 12:14:27.809910 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.807688 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-run-systemd\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh"
Apr 20 12:14:27.809910 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.807714 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh"
Apr 20 12:14:27.809910 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.807754 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0279eea6-b7aa-4e72-bec0-5aa87266cc8b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-j99mh\" (UID: \"0279eea6-b7aa-4e72-bec0-5aa87266cc8b\") " pod="openshift-multus/multus-additional-cni-plugins-j99mh"
Apr 20 12:14:27.809910 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.807788 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/021e270f-fe7a-402c-a482-41d496fec5fb-konnectivity-ca\") pod \"konnectivity-agent-8hww4\" (UID: \"021e270f-fe7a-402c-a482-41d496fec5fb\") " pod="kube-system/konnectivity-agent-8hww4"
Apr 20 12:14:27.809910 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.807841 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f8dc897f-a714-4520-8393-707949cd3be7-etc-sysctl-conf\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt"
Apr 20 12:14:27.809910 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.807889 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-host-run-ovn-kubernetes\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh"
Apr 20 12:14:27.809910 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.807916 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2da468d5-2794-41ba-8344-4246be8732d7-ovnkube-script-lib\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh"
Apr 20 12:14:27.809910 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.808014 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh"
Apr 20 12:14:27.809910 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.808094 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0279eea6-b7aa-4e72-bec0-5aa87266cc8b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-j99mh\" (UID: \"0279eea6-b7aa-4e72-bec0-5aa87266cc8b\") " pod="openshift-multus/multus-additional-cni-plugins-j99mh"
Apr 20 12:14:27.809910 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.808157 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-host-run-netns\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh"
Apr 20 12:14:27.809910 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.808340 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-run-systemd\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh"
Apr 20 12:14:27.809910 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.808423 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2da468d5-2794-41ba-8344-4246be8732d7-host-run-ovn-kubernetes\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh"
Apr 20 12:14:27.810533 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.808432 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f8dc897f-a714-4520-8393-707949cd3be7-etc-sysctl-conf\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt"
Apr 20 12:14:27.810533 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.808728 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2da468d5-2794-41ba-8344-4246be8732d7-ovnkube-script-lib\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh"
Apr 20 12:14:27.810533 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.808731 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e3f51f42-77bf-412b-970c-03006a2ef077-cni-binary-copy\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7"
Apr 20 12:14:27.810533 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.808876 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0279eea6-b7aa-4e72-bec0-5aa87266cc8b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-j99mh\" (UID: \"0279eea6-b7aa-4e72-bec0-5aa87266cc8b\") " pod="openshift-multus/multus-additional-cni-plugins-j99mh"
Apr 20 12:14:27.810533 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.809922 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f8dc897f-a714-4520-8393-707949cd3be7-tmp\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt"
Apr 20 12:14:27.810533 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.810105 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2da468d5-2794-41ba-8344-4246be8732d7-ovn-node-metrics-cert\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh"
Apr 20 12:14:27.810533 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.810215 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f8dc897f-a714-4520-8393-707949cd3be7-etc-tuned\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt"
Apr 20 12:14:27.810844 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:27.810647 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 12:14:27.810844 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:27.810668 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 12:14:27.810844 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:27.810680 2580 projected.go:194] Error preparing data for projected volume kube-api-access-qjqlk for pod openshift-network-diagnostics/network-check-target-cz8d9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 12:14:27.810844 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:27.810815 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/deac4f19-5105-40df-bd7a-9d7c576cd705-kube-api-access-qjqlk podName:deac4f19-5105-40df-bd7a-9d7c576cd705 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:28.310779475 +0000 UTC m=+2.030793462 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-qjqlk" (UniqueName: "kubernetes.io/projected/deac4f19-5105-40df-bd7a-9d7c576cd705-kube-api-access-qjqlk") pod "network-check-target-cz8d9" (UID: "deac4f19-5105-40df-bd7a-9d7c576cd705") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 12:14:27.813036 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.813014 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c4pw\" (UniqueName: \"kubernetes.io/projected/261e5e12-8ebc-4a49-9b69-be511d818e12-kube-api-access-7c4pw\") pod \"iptables-alerter-cf8lw\" (UID: \"261e5e12-8ebc-4a49-9b69-be511d818e12\") " pod="openshift-network-operator/iptables-alerter-cf8lw"
Apr 20 12:14:27.813981 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.813964 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-88r9l\" (UniqueName: \"kubernetes.io/projected/2d6874b2-c453-4084-b388-f029a2a9cb5f-kube-api-access-88r9l\") pod \"aws-ebs-csi-driver-node-686bc\" (UID: \"2d6874b2-c453-4084-b388-f029a2a9cb5f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-686bc"
Apr 20 12:14:27.816583 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.815872 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6zjj\" (UniqueName: \"kubernetes.io/projected/2da468d5-2794-41ba-8344-4246be8732d7-kube-api-access-m6zjj\") pod \"ovnkube-node-dwphh\" (UID: \"2da468d5-2794-41ba-8344-4246be8732d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwphh"
Apr 20 12:14:27.817017 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.816973 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w6lp\" (UniqueName: \"kubernetes.io/projected/900f999a-6b3b-4648-b015-7ca045ba8dcd-kube-api-access-5w6lp\") pod \"node-resolver-zwm76\" (UID: \"900f999a-6b3b-4648-b015-7ca045ba8dcd\") " pod="openshift-dns/node-resolver-zwm76"
Apr 20 12:14:27.818659 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:27.817884 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e4cd406deda512383d472584c7956de.slice/crio-15f943615a29e2a636a399596ffdd9ceae9a3de62907462870b14a82cfb7bc75 WatchSource:0}: Error finding container 15f943615a29e2a636a399596ffdd9ceae9a3de62907462870b14a82cfb7bc75: Status 404 returned error can't find the container with id 15f943615a29e2a636a399596ffdd9ceae9a3de62907462870b14a82cfb7bc75
Apr 20 12:14:27.818659 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.817962 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4htd\" (UniqueName: \"kubernetes.io/projected/e3f51f42-77bf-412b-970c-03006a2ef077-kube-api-access-j4htd\") pod \"multus-26xn7\" (UID: \"e3f51f42-77bf-412b-970c-03006a2ef077\") " pod="openshift-multus/multus-26xn7"
Apr 20 12:14:27.819979 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.819957 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjqsn\" (UniqueName: \"kubernetes.io/projected/1ab58f2d-5007-4797-8a83-489889f35e06-kube-api-access-bjqsn\") pod \"node-ca-p26xh\" (UID: \"1ab58f2d-5007-4797-8a83-489889f35e06\") " pod="openshift-image-registry/node-ca-p26xh"
Apr 20 12:14:27.823223 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.823208 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 12:14:27.826238 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.826218 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl8mb\" (UniqueName: \"kubernetes.io/projected/0279eea6-b7aa-4e72-bec0-5aa87266cc8b-kube-api-access-xl8mb\") pod \"multus-additional-cni-plugins-j99mh\" (UID: \"0279eea6-b7aa-4e72-bec0-5aa87266cc8b\") " pod="openshift-multus/multus-additional-cni-plugins-j99mh"
Apr 20 12:14:27.826560 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.826544 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z884s\" (UniqueName: \"kubernetes.io/projected/f8dc897f-a714-4520-8393-707949cd3be7-kube-api-access-z884s\") pod \"tuned-csctt\" (UID: \"f8dc897f-a714-4520-8393-707949cd3be7\") " pod="openshift-cluster-node-tuning-operator/tuned-csctt"
Apr 20 12:14:27.844513 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.844458 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-91.ec2.internal" event={"ID":"4e4cd406deda512383d472584c7956de","Type":"ContainerStarted","Data":"15f943615a29e2a636a399596ffdd9ceae9a3de62907462870b14a82cfb7bc75"}
Apr 20 12:14:27.845276 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.845251 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-91.ec2.internal" event={"ID":"98a7b4133bfe59f130f091a3ed1a068f","Type":"ContainerStarted","Data":"34b3eebed94079c66715616a14e5cbcfc77fc8f4209bb22cadacb1568853c2e5"}
Apr 20 12:14:27.908328 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.908294 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p2cpf\" (UniqueName: \"kubernetes.io/projected/430e704c-5d70-4df6-baaa-2296216f1239-kube-api-access-p2cpf\") pod \"network-metrics-daemon-nm452\" (UID: \"430e704c-5d70-4df6-baaa-2296216f1239\") " pod="openshift-multus/network-metrics-daemon-nm452"
Apr 20 12:14:27.908328 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.908330 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/430e704c-5d70-4df6-baaa-2296216f1239-metrics-certs\") pod \"network-metrics-daemon-nm452\" (UID: \"430e704c-5d70-4df6-baaa-2296216f1239\") " pod="openshift-multus/network-metrics-daemon-nm452"
Apr 20 12:14:27.908580 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.908362 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/021e270f-fe7a-402c-a482-41d496fec5fb-konnectivity-ca\") pod \"konnectivity-agent-8hww4\" (UID: \"021e270f-fe7a-402c-a482-41d496fec5fb\") " pod="kube-system/konnectivity-agent-8hww4"
Apr 20 12:14:27.908580 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.908432 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/021e270f-fe7a-402c-a482-41d496fec5fb-agent-certs\") pod \"konnectivity-agent-8hww4\" (UID: \"021e270f-fe7a-402c-a482-41d496fec5fb\") " pod="kube-system/konnectivity-agent-8hww4"
Apr 20 12:14:27.908580 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:27.908463 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 12:14:27.908580 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:27.908539 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/430e704c-5d70-4df6-baaa-2296216f1239-metrics-certs podName:430e704c-5d70-4df6-baaa-2296216f1239 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:28.408515775 +0000 UTC m=+2.128529764 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/430e704c-5d70-4df6-baaa-2296216f1239-metrics-certs") pod "network-metrics-daemon-nm452" (UID: "430e704c-5d70-4df6-baaa-2296216f1239") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 12:14:27.908927 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.908909 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/021e270f-fe7a-402c-a482-41d496fec5fb-konnectivity-ca\") pod \"konnectivity-agent-8hww4\" (UID: \"021e270f-fe7a-402c-a482-41d496fec5fb\") " pod="kube-system/konnectivity-agent-8hww4"
Apr 20 12:14:27.910845 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.910820 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/021e270f-fe7a-402c-a482-41d496fec5fb-agent-certs\") pod \"konnectivity-agent-8hww4\" (UID: \"021e270f-fe7a-402c-a482-41d496fec5fb\") " pod="kube-system/konnectivity-agent-8hww4"
Apr 20 12:14:27.916990 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:27.916966 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2cpf\" (UniqueName: \"kubernetes.io/projected/430e704c-5d70-4df6-baaa-2296216f1239-kube-api-access-p2cpf\") pod \"network-metrics-daemon-nm452\" (UID: \"430e704c-5d70-4df6-baaa-2296216f1239\") " pod="openshift-multus/network-metrics-daemon-nm452"
Apr 20 12:14:28.008143 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:28.008055 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-j99mh"
Apr 20 12:14:28.014203 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:28.014178 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0279eea6_b7aa_4e72_bec0_5aa87266cc8b.slice/crio-8dfea336564d5e7c9e31cdb19d1fdfd7efcd84de17b4a35bf13e28163d5ea2d4 WatchSource:0}: Error finding container 8dfea336564d5e7c9e31cdb19d1fdfd7efcd84de17b4a35bf13e28163d5ea2d4: Status 404 returned error can't find the container with id 8dfea336564d5e7c9e31cdb19d1fdfd7efcd84de17b4a35bf13e28163d5ea2d4
Apr 20 12:14:28.025788 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:28.025762 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-686bc"
Apr 20 12:14:28.032081 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:28.032055 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d6874b2_c453_4084_b388_f029a2a9cb5f.slice/crio-10450e6896e10fa46af7fbb3c137e0a517450f1bbb95e324dd8c54bb7e2321fa WatchSource:0}: Error finding container 10450e6896e10fa46af7fbb3c137e0a517450f1bbb95e324dd8c54bb7e2321fa: Status 404 returned error can't find the container with id 10450e6896e10fa46af7fbb3c137e0a517450f1bbb95e324dd8c54bb7e2321fa
Apr 20 12:14:28.042773 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:28.042752 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-csctt"
Apr 20 12:14:28.046450 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:28.046426 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-p26xh"
Apr 20 12:14:28.048812 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:28.048782 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8dc897f_a714_4520_8393_707949cd3be7.slice/crio-12c90cade8b2dd9dd146f239a98b3caade5c96da1a5869d408cdb6919537a419 WatchSource:0}: Error finding container 12c90cade8b2dd9dd146f239a98b3caade5c96da1a5869d408cdb6919537a419: Status 404 returned error can't find the container with id 12c90cade8b2dd9dd146f239a98b3caade5c96da1a5869d408cdb6919537a419
Apr 20 12:14:28.053974 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:28.053942 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ab58f2d_5007_4797_8a83_489889f35e06.slice/crio-b4cd24902567f937ab8d38fb5d8dd7928a4a74434edc66750449789ac3ef7a5b WatchSource:0}: Error finding container b4cd24902567f937ab8d38fb5d8dd7928a4a74434edc66750449789ac3ef7a5b: Status 404 returned error can't find the container with id b4cd24902567f937ab8d38fb5d8dd7928a4a74434edc66750449789ac3ef7a5b
Apr 20 12:14:28.059718 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:28.059698 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zwm76"
Apr 20 12:14:28.065594 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:28.065569 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod900f999a_6b3b_4648_b015_7ca045ba8dcd.slice/crio-1fb50104f2428f2d243fc62aca3f292aca9c588e2bcd0d2838c4389ae86e50ae WatchSource:0}: Error finding container 1fb50104f2428f2d243fc62aca3f292aca9c588e2bcd0d2838c4389ae86e50ae: Status 404 returned error can't find the container with id 1fb50104f2428f2d243fc62aca3f292aca9c588e2bcd0d2838c4389ae86e50ae
Apr 20 12:14:28.066483 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:28.066461 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-26xn7"
Apr 20 12:14:28.073408 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:28.073373 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3f51f42_77bf_412b_970c_03006a2ef077.slice/crio-dded8cd8f253eb97888f5b858bb0b0d8f9696bd43676dd3369326cabf455b85d WatchSource:0}: Error finding container dded8cd8f253eb97888f5b858bb0b0d8f9696bd43676dd3369326cabf455b85d: Status 404 returned error can't find the container with id dded8cd8f253eb97888f5b858bb0b0d8f9696bd43676dd3369326cabf455b85d
Apr 20 12:14:28.080145 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:28.080122 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dwphh"
Apr 20 12:14:28.086011 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:28.085985 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2da468d5_2794_41ba_8344_4246be8732d7.slice/crio-333987b0ac553ab9a07f8f8e44823aa5c650319b1e29ddfedd3425e1e2d5707b WatchSource:0}: Error finding container 333987b0ac553ab9a07f8f8e44823aa5c650319b1e29ddfedd3425e1e2d5707b: Status 404 returned error can't find the container with id 333987b0ac553ab9a07f8f8e44823aa5c650319b1e29ddfedd3425e1e2d5707b
Apr 20 12:14:28.090739 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:28.090719 2580 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 12:14:28.099696 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:28.099669 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-cf8lw"
Apr 20 12:14:28.106651 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:28.106627 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod261e5e12_8ebc_4a49_9b69_be511d818e12.slice/crio-60958a6a1271bb5826fb16c47cd217dae4d52add4a6fb98e49494778ca672686 WatchSource:0}: Error finding container 60958a6a1271bb5826fb16c47cd217dae4d52add4a6fb98e49494778ca672686: Status 404 returned error can't find the container with id 60958a6a1271bb5826fb16c47cd217dae4d52add4a6fb98e49494778ca672686
Apr 20 12:14:28.109619 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:28.109545 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-8hww4"
Apr 20 12:14:28.117767 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:14:28.117735 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod021e270f_fe7a_402c_a482_41d496fec5fb.slice/crio-ac4153a605f004df23399c7c5d5a32f9e2074dc53873d38082c9c38919a5e6e4 WatchSource:0}: Error finding container ac4153a605f004df23399c7c5d5a32f9e2074dc53873d38082c9c38919a5e6e4: Status 404 returned error can't find the container with id ac4153a605f004df23399c7c5d5a32f9e2074dc53873d38082c9c38919a5e6e4
Apr 20 12:14:28.311575 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:28.311485 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjqlk\" (UniqueName: \"kubernetes.io/projected/deac4f19-5105-40df-bd7a-9d7c576cd705-kube-api-access-qjqlk\") pod \"network-check-target-cz8d9\" (UID: \"deac4f19-5105-40df-bd7a-9d7c576cd705\") " pod="openshift-network-diagnostics/network-check-target-cz8d9"
Apr 20 12:14:28.311745 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:28.311670 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 12:14:28.311745 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:28.311692 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 12:14:28.311745 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:28.311705 2580 projected.go:194] Error preparing data for projected volume kube-api-access-qjqlk for pod openshift-network-diagnostics/network-check-target-cz8d9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 12:14:28.311943 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:28.311771 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/deac4f19-5105-40df-bd7a-9d7c576cd705-kube-api-access-qjqlk podName:deac4f19-5105-40df-bd7a-9d7c576cd705 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:29.311750821 +0000 UTC m=+3.031764795 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-qjqlk" (UniqueName: "kubernetes.io/projected/deac4f19-5105-40df-bd7a-9d7c576cd705-kube-api-access-qjqlk") pod "network-check-target-cz8d9" (UID: "deac4f19-5105-40df-bd7a-9d7c576cd705") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 12:14:28.412608 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:28.412576 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/430e704c-5d70-4df6-baaa-2296216f1239-metrics-certs\") pod \"network-metrics-daemon-nm452\" (UID: \"430e704c-5d70-4df6-baaa-2296216f1239\") " pod="openshift-multus/network-metrics-daemon-nm452"
Apr 20 12:14:28.412819 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:28.412795 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 12:14:28.412886 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:28.412877 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/430e704c-5d70-4df6-baaa-2296216f1239-metrics-certs podName:430e704c-5d70-4df6-baaa-2296216f1239 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:29.412850171 +0000 UTC m=+3.132864154 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/430e704c-5d70-4df6-baaa-2296216f1239-metrics-certs") pod "network-metrics-daemon-nm452" (UID: "430e704c-5d70-4df6-baaa-2296216f1239") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 12:14:28.733789 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:28.733632 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 12:09:27 +0000 UTC" deadline="2027-10-13 04:19:54.816539683 +0000 UTC"
Apr 20 12:14:28.733789 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:28.733671 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12976h5m26.082873443s"
Apr 20 12:14:28.843436 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:28.843382 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nm452"
Apr 20 12:14:28.843619 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:28.843558 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nm452" podUID="430e704c-5d70-4df6-baaa-2296216f1239"
Apr 20 12:14:28.882334 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:28.882295 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-26xn7" event={"ID":"e3f51f42-77bf-412b-970c-03006a2ef077","Type":"ContainerStarted","Data":"dded8cd8f253eb97888f5b858bb0b0d8f9696bd43676dd3369326cabf455b85d"}
Apr 20 12:14:28.891963 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:28.891917 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-686bc" event={"ID":"2d6874b2-c453-4084-b388-f029a2a9cb5f","Type":"ContainerStarted","Data":"10450e6896e10fa46af7fbb3c137e0a517450f1bbb95e324dd8c54bb7e2321fa"}
Apr 20 12:14:28.909129 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:28.909070 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" event={"ID":"2da468d5-2794-41ba-8344-4246be8732d7","Type":"ContainerStarted","Data":"333987b0ac553ab9a07f8f8e44823aa5c650319b1e29ddfedd3425e1e2d5707b"}
Apr 20 12:14:28.933332 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:28.933099 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 12:14:28.950466 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:28.947292 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zwm76" event={"ID":"900f999a-6b3b-4648-b015-7ca045ba8dcd","Type":"ContainerStarted","Data":"1fb50104f2428f2d243fc62aca3f292aca9c588e2bcd0d2838c4389ae86e50ae"}
Apr 20 12:14:28.974601 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:28.974558 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-p26xh" event={"ID":"1ab58f2d-5007-4797-8a83-489889f35e06","Type":"ContainerStarted","Data":"b4cd24902567f937ab8d38fb5d8dd7928a4a74434edc66750449789ac3ef7a5b"}
Apr 20 12:14:28.988147 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:28.988065 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-csctt" event={"ID":"f8dc897f-a714-4520-8393-707949cd3be7","Type":"ContainerStarted","Data":"12c90cade8b2dd9dd146f239a98b3caade5c96da1a5869d408cdb6919537a419"}
Apr 20 12:14:28.993909 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:28.993872 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j99mh" event={"ID":"0279eea6-b7aa-4e72-bec0-5aa87266cc8b","Type":"ContainerStarted","Data":"8dfea336564d5e7c9e31cdb19d1fdfd7efcd84de17b4a35bf13e28163d5ea2d4"}
Apr 20 12:14:28.996235 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:28.996198 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-8hww4" event={"ID":"021e270f-fe7a-402c-a482-41d496fec5fb","Type":"ContainerStarted","Data":"ac4153a605f004df23399c7c5d5a32f9e2074dc53873d38082c9c38919a5e6e4"}
Apr 20 12:14:29.008488 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:29.008438 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-cf8lw" event={"ID":"261e5e12-8ebc-4a49-9b69-be511d818e12","Type":"ContainerStarted","Data":"60958a6a1271bb5826fb16c47cd217dae4d52add4a6fb98e49494778ca672686"}
Apr 20 12:14:29.201096 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:29.200836 2580 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 12:14:29.324055 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:29.323971 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjqlk\" (UniqueName: \"kubernetes.io/projected/deac4f19-5105-40df-bd7a-9d7c576cd705-kube-api-access-qjqlk\") pod \"network-check-target-cz8d9\" (UID: \"deac4f19-5105-40df-bd7a-9d7c576cd705\") " pod="openshift-network-diagnostics/network-check-target-cz8d9"
Apr 20 12:14:29.324213 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:29.324127 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 12:14:29.324213 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:29.324149 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 12:14:29.324213 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:29.324160 2580 projected.go:194] Error preparing data for projected volume kube-api-access-qjqlk for pod openshift-network-diagnostics/network-check-target-cz8d9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 12:14:29.324375 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:29.324217 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/deac4f19-5105-40df-bd7a-9d7c576cd705-kube-api-access-qjqlk podName:deac4f19-5105-40df-bd7a-9d7c576cd705 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:31.324198535 +0000 UTC m=+5.044212518 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-qjqlk" (UniqueName: "kubernetes.io/projected/deac4f19-5105-40df-bd7a-9d7c576cd705-kube-api-access-qjqlk") pod "network-check-target-cz8d9" (UID: "deac4f19-5105-40df-bd7a-9d7c576cd705") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 12:14:29.425035 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:29.424946 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/430e704c-5d70-4df6-baaa-2296216f1239-metrics-certs\") pod \"network-metrics-daemon-nm452\" (UID: \"430e704c-5d70-4df6-baaa-2296216f1239\") " pod="openshift-multus/network-metrics-daemon-nm452"
Apr 20 12:14:29.425223 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:29.425120 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 12:14:29.425223 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:29.425180 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/430e704c-5d70-4df6-baaa-2296216f1239-metrics-certs podName:430e704c-5d70-4df6-baaa-2296216f1239 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:31.425161586 +0000 UTC m=+5.145175555 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/430e704c-5d70-4df6-baaa-2296216f1239-metrics-certs") pod "network-metrics-daemon-nm452" (UID: "430e704c-5d70-4df6-baaa-2296216f1239") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 12:14:29.734113 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:29.734014 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 12:09:27 +0000 UTC" deadline="2028-01-21 14:37:22.071954722 +0000 UTC"
Apr 20 12:14:29.734113 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:29.734053 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15386h22m52.33790613s"
Apr 20 12:14:29.842760 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:29.842732 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cz8d9"
Apr 20 12:14:29.842948 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:29.842848 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cz8d9" podUID="deac4f19-5105-40df-bd7a-9d7c576cd705"
Apr 20 12:14:30.525851 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:30.525819 2580 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 12:14:30.843079 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:30.842990 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nm452"
Apr 20 12:14:30.843549 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:30.843140 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nm452" podUID="430e704c-5d70-4df6-baaa-2296216f1239"
Apr 20 12:14:31.344140 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:31.343969 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjqlk\" (UniqueName: \"kubernetes.io/projected/deac4f19-5105-40df-bd7a-9d7c576cd705-kube-api-access-qjqlk\") pod \"network-check-target-cz8d9\" (UID: \"deac4f19-5105-40df-bd7a-9d7c576cd705\") " pod="openshift-network-diagnostics/network-check-target-cz8d9"
Apr 20 12:14:31.344329 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:31.344172 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 12:14:31.344329 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:31.344193 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 12:14:31.344329 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:31.344206 2580 projected.go:194] Error preparing data for projected volume kube-api-access-qjqlk for pod openshift-network-diagnostics/network-check-target-cz8d9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 12:14:31.344329 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:31.344267 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/deac4f19-5105-40df-bd7a-9d7c576cd705-kube-api-access-qjqlk podName:deac4f19-5105-40df-bd7a-9d7c576cd705 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:35.344249327 +0000 UTC m=+9.064263311 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-qjqlk" (UniqueName: "kubernetes.io/projected/deac4f19-5105-40df-bd7a-9d7c576cd705-kube-api-access-qjqlk") pod "network-check-target-cz8d9" (UID: "deac4f19-5105-40df-bd7a-9d7c576cd705") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 12:14:31.444843 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:31.444746 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/430e704c-5d70-4df6-baaa-2296216f1239-metrics-certs\") pod \"network-metrics-daemon-nm452\" (UID: \"430e704c-5d70-4df6-baaa-2296216f1239\") " pod="openshift-multus/network-metrics-daemon-nm452"
Apr 20 12:14:31.445029 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:31.444951 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 12:14:31.445029 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:31.445025 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/430e704c-5d70-4df6-baaa-2296216f1239-metrics-certs podName:430e704c-5d70-4df6-baaa-2296216f1239 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:35.445003481 +0000 UTC m=+9.165017455 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/430e704c-5d70-4df6-baaa-2296216f1239-metrics-certs") pod "network-metrics-daemon-nm452" (UID: "430e704c-5d70-4df6-baaa-2296216f1239") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 12:14:31.842923 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:31.842837 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cz8d9"
Apr 20 12:14:31.843175 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:31.842969 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cz8d9" podUID="deac4f19-5105-40df-bd7a-9d7c576cd705"
Apr 20 12:14:32.843356 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:32.842824 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nm452"
Apr 20 12:14:32.843356 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:32.842970 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nm452" podUID="430e704c-5d70-4df6-baaa-2296216f1239"
Apr 20 12:14:33.842511 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:33.842475 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cz8d9"
Apr 20 12:14:33.842684 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:33.842626 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cz8d9" podUID="deac4f19-5105-40df-bd7a-9d7c576cd705"
Apr 20 12:14:34.843058 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:34.843024 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nm452"
Apr 20 12:14:34.843543 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:34.843165 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nm452" podUID="430e704c-5d70-4df6-baaa-2296216f1239"
Apr 20 12:14:35.375943 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:35.375882 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjqlk\" (UniqueName: \"kubernetes.io/projected/deac4f19-5105-40df-bd7a-9d7c576cd705-kube-api-access-qjqlk\") pod \"network-check-target-cz8d9\" (UID: \"deac4f19-5105-40df-bd7a-9d7c576cd705\") " pod="openshift-network-diagnostics/network-check-target-cz8d9"
Apr 20 12:14:35.376141 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:35.376076 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 12:14:35.376141 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:35.376105 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 12:14:35.376141 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:35.376119 2580 projected.go:194] Error preparing data for projected volume kube-api-access-qjqlk for pod openshift-network-diagnostics/network-check-target-cz8d9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 12:14:35.376295 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:35.376186 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/deac4f19-5105-40df-bd7a-9d7c576cd705-kube-api-access-qjqlk podName:deac4f19-5105-40df-bd7a-9d7c576cd705 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:43.37616674 +0000 UTC m=+17.096180726 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-qjqlk" (UniqueName: "kubernetes.io/projected/deac4f19-5105-40df-bd7a-9d7c576cd705-kube-api-access-qjqlk") pod "network-check-target-cz8d9" (UID: "deac4f19-5105-40df-bd7a-9d7c576cd705") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 12:14:35.476741 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:35.476632 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/430e704c-5d70-4df6-baaa-2296216f1239-metrics-certs\") pod \"network-metrics-daemon-nm452\" (UID: \"430e704c-5d70-4df6-baaa-2296216f1239\") " pod="openshift-multus/network-metrics-daemon-nm452"
Apr 20 12:14:35.476932 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:35.476798 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 12:14:35.476932 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:35.476875 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/430e704c-5d70-4df6-baaa-2296216f1239-metrics-certs podName:430e704c-5d70-4df6-baaa-2296216f1239 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:43.476854155 +0000 UTC m=+17.196868128 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/430e704c-5d70-4df6-baaa-2296216f1239-metrics-certs") pod "network-metrics-daemon-nm452" (UID: "430e704c-5d70-4df6-baaa-2296216f1239") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 12:14:35.843054 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:35.842867 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cz8d9"
Apr 20 12:14:35.843054 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:35.843005 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cz8d9" podUID="deac4f19-5105-40df-bd7a-9d7c576cd705"
Apr 20 12:14:36.843375 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:36.843339 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nm452"
Apr 20 12:14:36.843858 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:36.843490 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nm452" podUID="430e704c-5d70-4df6-baaa-2296216f1239"
Apr 20 12:14:37.842775 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:37.842747 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cz8d9"
Apr 20 12:14:37.842963 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:37.842850 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cz8d9" podUID="deac4f19-5105-40df-bd7a-9d7c576cd705"
Apr 20 12:14:38.842514 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:38.842479 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nm452"
Apr 20 12:14:38.842928 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:38.842608 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nm452" podUID="430e704c-5d70-4df6-baaa-2296216f1239"
Apr 20 12:14:39.842367 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:39.842317 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cz8d9"
Apr 20 12:14:39.842572 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:39.842463 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cz8d9" podUID="deac4f19-5105-40df-bd7a-9d7c576cd705"
Apr 20 12:14:40.844940 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:40.844908 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nm452"
Apr 20 12:14:40.845439 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:40.845040 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nm452" podUID="430e704c-5d70-4df6-baaa-2296216f1239"
Apr 20 12:14:41.842993 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:41.842957 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cz8d9"
Apr 20 12:14:41.843184 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:41.843088 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cz8d9" podUID="deac4f19-5105-40df-bd7a-9d7c576cd705"
Apr 20 12:14:42.842330 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:42.842292 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nm452"
Apr 20 12:14:42.842819 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:42.842446 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nm452" podUID="430e704c-5d70-4df6-baaa-2296216f1239"
Apr 20 12:14:43.429161 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:43.428965 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjqlk\" (UniqueName: \"kubernetes.io/projected/deac4f19-5105-40df-bd7a-9d7c576cd705-kube-api-access-qjqlk\") pod \"network-check-target-cz8d9\" (UID: \"deac4f19-5105-40df-bd7a-9d7c576cd705\") " pod="openshift-network-diagnostics/network-check-target-cz8d9"
Apr 20 12:14:43.429161 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:43.429152 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 12:14:43.429412 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:43.429176 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 12:14:43.429412 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:43.429190 2580 projected.go:194] Error preparing data for projected volume kube-api-access-qjqlk for pod openshift-network-diagnostics/network-check-target-cz8d9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 12:14:43.429412 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:43.429255 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/deac4f19-5105-40df-bd7a-9d7c576cd705-kube-api-access-qjqlk podName:deac4f19-5105-40df-bd7a-9d7c576cd705 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:59.429235125 +0000 UTC m=+33.149249104 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-qjqlk" (UniqueName: "kubernetes.io/projected/deac4f19-5105-40df-bd7a-9d7c576cd705-kube-api-access-qjqlk") pod "network-check-target-cz8d9" (UID: "deac4f19-5105-40df-bd7a-9d7c576cd705") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 12:14:43.529578 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:43.529544 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/430e704c-5d70-4df6-baaa-2296216f1239-metrics-certs\") pod \"network-metrics-daemon-nm452\" (UID: \"430e704c-5d70-4df6-baaa-2296216f1239\") " pod="openshift-multus/network-metrics-daemon-nm452"
Apr 20 12:14:43.529748 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:43.529683 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 12:14:43.529748 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:43.529746 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/430e704c-5d70-4df6-baaa-2296216f1239-metrics-certs podName:430e704c-5d70-4df6-baaa-2296216f1239 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:59.529727024 +0000 UTC m=+33.249741007 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/430e704c-5d70-4df6-baaa-2296216f1239-metrics-certs") pod "network-metrics-daemon-nm452" (UID: "430e704c-5d70-4df6-baaa-2296216f1239") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 12:14:43.842203 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:43.842161 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cz8d9"
Apr 20 12:14:43.842423 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:43.842289 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cz8d9" podUID="deac4f19-5105-40df-bd7a-9d7c576cd705"
Apr 20 12:14:44.842825 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:44.842785 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nm452"
Apr 20 12:14:44.843263 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:44.842915 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nm452" podUID="430e704c-5d70-4df6-baaa-2296216f1239"
Apr 20 12:14:45.843185 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:45.843152 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cz8d9"
Apr 20 12:14:45.843736 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:45.843292 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cz8d9" podUID="deac4f19-5105-40df-bd7a-9d7c576cd705"
Apr 20 12:14:46.843167 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:46.842988 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nm452"
Apr 20 12:14:46.843278 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:46.843258 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nm452" podUID="430e704c-5d70-4df6-baaa-2296216f1239"
Apr 20 12:14:47.050023 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:47.049986 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-csctt" event={"ID":"f8dc897f-a714-4520-8393-707949cd3be7","Type":"ContainerStarted","Data":"22eb1bf9aca18a0be063c227e47a8b6f269e25c3d1c14b543a43808d37f5d481"}
Apr 20 12:14:47.053275 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:47.052776 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-26xn7" event={"ID":"e3f51f42-77bf-412b-970c-03006a2ef077","Type":"ContainerStarted","Data":"64eac6da117ce58667d7064b3d8c40429e9d9220ab6d34b7ca9e1a114895c111"}
Apr 20 12:14:47.060813 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:47.060770 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-91.ec2.internal" event={"ID":"4e4cd406deda512383d472584c7956de","Type":"ContainerStarted","Data":"c5b59b82f4bf9179689a8f0e4c044787141b4045607cf250cc87640be3cf0745"}
Apr 20 12:14:47.063593 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:47.063559 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" event={"ID":"2da468d5-2794-41ba-8344-4246be8732d7","Type":"ContainerStarted","Data":"b2c113fb88c65ecd4d1350a92ea6184bb05dc66d31b33955c9a76a065f3ba532"}
Apr 20 12:14:47.063691 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:47.063601 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" event={"ID":"2da468d5-2794-41ba-8344-4246be8732d7","Type":"ContainerStarted","Data":"3a6b7902f72410946986b30468a78febab9d7abaf1c1d1ab85468c0c4fa19513"}
Apr 20 12:14:47.063691 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:47.063628 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" event={"ID":"2da468d5-2794-41ba-8344-4246be8732d7","Type":"ContainerStarted","Data":"da025433474bc858676f4a92bd57941b6c29b1d923b527d880d32a971a8c768e"}
Apr 20 12:14:47.063691 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:47.063645 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" event={"ID":"2da468d5-2794-41ba-8344-4246be8732d7","Type":"ContainerStarted","Data":"a094b33011b1802ae67e9b9de09e6a33d4ed45caf045d84efb4099196bbf799d"}
Apr 20 12:14:47.067452 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:47.067311 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-csctt" podStartSLOduration=3.050995995 podStartE2EDuration="21.067297046s" podCreationTimestamp="2026-04-20 12:14:26 +0000 UTC" firstStartedPulling="2026-04-20 12:14:28.050727605 +0000 UTC m=+1.770741578" lastFinishedPulling="2026-04-20 12:14:46.067028653 +0000 UTC m=+19.787042629" observedRunningTime="2026-04-20 12:14:47.06722487 +0000 UTC m=+20.787238878" watchObservedRunningTime="2026-04-20 12:14:47.067297046 +0000 UTC m=+20.787311037"
Apr 20 12:14:47.100208 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:47.100154 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-26xn7" podStartSLOduration=2.099849959 podStartE2EDuration="20.100134941s" podCreationTimestamp="2026-04-20 12:14:27 +0000 UTC" firstStartedPulling="2026-04-20 12:14:28.074859867 +0000 UTC m=+1.794873836" lastFinishedPulling="2026-04-20 12:14:46.075144643 +0000 UTC m=+19.795158818" observedRunningTime="2026-04-20 12:14:47.086587397 +0000 UTC m=+20.806601382" watchObservedRunningTime="2026-04-20 12:14:47.100134941 +0000 UTC m=+20.820148933"
Apr 20 12:14:47.100573 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:47.100540 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-91.ec2.internal" podStartSLOduration=20.10052851 podStartE2EDuration="20.10052851s" podCreationTimestamp="2026-04-20 12:14:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 12:14:47.100058916 +0000 UTC m=+20.820072908" watchObservedRunningTime="2026-04-20 12:14:47.10052851 +0000 UTC m=+20.820542501"
Apr 20 12:14:47.842929 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:47.842733 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cz8d9"
Apr 20 12:14:47.843094 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:47.843028 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cz8d9" podUID="deac4f19-5105-40df-bd7a-9d7c576cd705"
Apr 20 12:14:48.068832 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:48.068777 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" event={"ID":"2da468d5-2794-41ba-8344-4246be8732d7","Type":"ContainerStarted","Data":"b1b8aeff9ece3c4c6e196072ceeeeed47a778fafda8fe528bc63962e1b2263e4"}
Apr 20 12:14:48.068832 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:48.068836 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" event={"ID":"2da468d5-2794-41ba-8344-4246be8732d7","Type":"ContainerStarted","Data":"d20dffe76e785f1c30a696c828e7661504a1148223b21534f82ea3dc5739e747"}
Apr 20 12:14:48.070277 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:48.070249 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zwm76" event={"ID":"900f999a-6b3b-4648-b015-7ca045ba8dcd","Type":"ContainerStarted","Data":"134a7e83bd0905bd11301118abcd6e92c65bc5976c77794bb33b3afa0c8af3f3"}
Apr 20 12:14:48.071699 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:48.071673 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-p26xh" event={"ID":"1ab58f2d-5007-4797-8a83-489889f35e06","Type":"ContainerStarted","Data":"a0c398c7ae6d96239271375c230372c8cac7222d60b1d285e75528898cb6717c"}
Apr 20 12:14:48.073167 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:48.073146 2580 generic.go:358] "Generic (PLEG): container finished" podID="0279eea6-b7aa-4e72-bec0-5aa87266cc8b" containerID="bab3543d18d9329a467f6df3cf13c46396b0ad40c19332e8cc95cb9586fc9cec" exitCode=0
Apr 20 12:14:48.073269 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:48.073216 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j99mh" event={"ID":"0279eea6-b7aa-4e72-bec0-5aa87266cc8b","Type":"ContainerDied","Data":"bab3543d18d9329a467f6df3cf13c46396b0ad40c19332e8cc95cb9586fc9cec"}
Apr 20 12:14:48.075047 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:48.075027 2580 generic.go:358] "Generic (PLEG): container finished" podID="98a7b4133bfe59f130f091a3ed1a068f" containerID="db31d40d45b5fba2b07b81f8bf867edea65247ccb2fd3fa1a35acb48d9da7b82" exitCode=0
Apr 20 12:14:48.075135 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:48.075098 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-91.ec2.internal" event={"ID":"98a7b4133bfe59f130f091a3ed1a068f","Type":"ContainerDied","Data":"db31d40d45b5fba2b07b81f8bf867edea65247ccb2fd3fa1a35acb48d9da7b82"}
Apr 20 12:14:48.076584 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:48.076542 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-8hww4" event={"ID":"021e270f-fe7a-402c-a482-41d496fec5fb","Type":"ContainerStarted","Data":"3b491feabd332bc8a3f59a761375b3a23398f4a6ab6c8c543578a38c51bd9cb4"}
Apr 20 12:14:48.077916 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:48.077892 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-cf8lw" event={"ID":"261e5e12-8ebc-4a49-9b69-be511d818e12","Type":"ContainerStarted","Data":"34dc5afffe43b88a4c55f5a3510907d2ace708ebd4004da5911c6f54fe67b2db"}
Apr 20 12:14:48.080816 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:48.080368 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-686bc" event={"ID":"2d6874b2-c453-4084-b388-f029a2a9cb5f","Type":"ContainerStarted","Data":"eb0d908957a2f9bea685372c36ab19b3284e2ae111b105371e7921cc6d4411f5"}
Apr 20 12:14:48.085305 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:48.085094 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-zwm76" podStartSLOduration=3.087711751 podStartE2EDuration="21.0850793s" podCreationTimestamp="2026-04-20 12:14:27 +0000 UTC" firstStartedPulling="2026-04-20 12:14:28.06714923 +0000 UTC m=+1.787163199" lastFinishedPulling="2026-04-20 12:14:46.064516764 +0000 UTC m=+19.784530748" observedRunningTime="2026-04-20 12:14:48.084837004 +0000 UTC m=+21.804850994" watchObservedRunningTime="2026-04-20 12:14:48.0850793 +0000 UTC m=+21.805093292"
Apr 20 12:14:48.111529 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:48.111464 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-cf8lw" podStartSLOduration=3.148031793 podStartE2EDuration="21.111446758s" podCreationTimestamp="2026-04-20 12:14:27 +0000 UTC" firstStartedPulling="2026-04-20 12:14:28.108197018 +0000 UTC m=+1.828210988" lastFinishedPulling="2026-04-20 12:14:46.071611985 +0000 UTC m=+19.791625953" observedRunningTime="2026-04-20 12:14:48.111371594 +0000 UTC m=+21.831385585" watchObservedRunningTime="2026-04-20 12:14:48.111446758 +0000 UTC m=+21.831460749"
Apr 20 12:14:48.148694 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:48.148647 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-p26xh" podStartSLOduration=3.187618635 podStartE2EDuration="21.148634391s" podCreationTimestamp="2026-04-20 12:14:27 +0000 UTC" firstStartedPulling="2026-04-20 12:14:28.05543401 +0000 UTC m=+1.775447979" lastFinishedPulling="2026-04-20 12:14:46.016449763 +0000 UTC m=+19.736463735" observedRunningTime="2026-04-20 12:14:48.148630039 +0000 UTC m=+21.868644021" watchObservedRunningTime="2026-04-20 12:14:48.148634391 +0000 UTC m=+21.868648382"
Apr 20 12:14:48.162138 ip-10-0-137-91
kubenswrapper[2580]: I0420 12:14:48.162095 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-8hww4" podStartSLOduration=3.264957873 podStartE2EDuration="21.162080441s" podCreationTimestamp="2026-04-20 12:14:27 +0000 UTC" firstStartedPulling="2026-04-20 12:14:28.119326782 +0000 UTC m=+1.839340750" lastFinishedPulling="2026-04-20 12:14:46.016449342 +0000 UTC m=+19.736463318" observedRunningTime="2026-04-20 12:14:48.161846409 +0000 UTC m=+21.881860400" watchObservedRunningTime="2026-04-20 12:14:48.162080441 +0000 UTC m=+21.882094432" Apr 20 12:14:48.431202 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:48.431177 2580 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 12:14:48.774208 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:48.774081 2580 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T12:14:48.431197557Z","UUID":"3e8f1806-7a6c-4be9-a3db-bf0b20a60d8a","Handler":null,"Name":"","Endpoint":""} Apr 20 12:14:48.776073 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:48.776043 2580 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 12:14:48.776073 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:48.776077 2580 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 12:14:48.842639 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:48.842610 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nm452" Apr 20 12:14:48.842830 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:48.842735 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nm452" podUID="430e704c-5d70-4df6-baaa-2296216f1239" Apr 20 12:14:49.084054 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:49.083960 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-686bc" event={"ID":"2d6874b2-c453-4084-b388-f029a2a9cb5f","Type":"ContainerStarted","Data":"0b5381954d08b915a60ccc03e6e3cc42bf9ca9001aa05bc69420840237f8f4ac"} Apr 20 12:14:49.085840 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:49.085780 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-91.ec2.internal" event={"ID":"98a7b4133bfe59f130f091a3ed1a068f","Type":"ContainerStarted","Data":"92ccedee27243d03a1571e7b51321fa8c1056e6280b99e6302a02eac51997f54"} Apr 20 12:14:49.103655 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:49.102129 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-91.ec2.internal" podStartSLOduration=22.102112867 podStartE2EDuration="22.102112867s" podCreationTimestamp="2026-04-20 12:14:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 12:14:49.101418906 +0000 UTC m=+22.821432891" watchObservedRunningTime="2026-04-20 12:14:49.102112867 +0000 UTC m=+22.822126864" Apr 20 12:14:49.157972 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:49.157945 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zwm76_900f999a-6b3b-4648-b015-7ca045ba8dcd/dns-node-resolver/0.log" Apr 20 12:14:49.741985 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:49.741955 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-p26xh_1ab58f2d-5007-4797-8a83-489889f35e06/node-ca/0.log" Apr 20 12:14:49.842431 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:49.842382 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cz8d9" Apr 20 12:14:49.842598 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:49.842501 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cz8d9" podUID="deac4f19-5105-40df-bd7a-9d7c576cd705" Apr 20 12:14:50.090450 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:50.090413 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" event={"ID":"2da468d5-2794-41ba-8344-4246be8732d7","Type":"ContainerStarted","Data":"389a26da9137f826cea7395b0a7b2e4aeefda3cff51ee57891ca87cc67b9414d"} Apr 20 12:14:50.092314 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:50.092284 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-686bc" event={"ID":"2d6874b2-c453-4084-b388-f029a2a9cb5f","Type":"ContainerStarted","Data":"62f5f25ee9cf8977cd67c48514b990e4d1f81b4722d32eb3e69729074a651d6b"} Apr 20 12:14:50.111953 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:50.111904 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-686bc" podStartSLOduration=2.995498134 podStartE2EDuration="24.111887839s" podCreationTimestamp="2026-04-20 12:14:26 +0000 UTC" firstStartedPulling="2026-04-20 12:14:28.033706436 +0000 UTC m=+1.753720405" lastFinishedPulling="2026-04-20 12:14:49.150096137 +0000 UTC m=+22.870110110" observedRunningTime="2026-04-20 12:14:50.111542361 +0000 UTC m=+23.831556364" watchObservedRunningTime="2026-04-20 12:14:50.111887839 +0000 UTC m=+23.831901824" Apr 20 12:14:50.538107 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:50.538078 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-8hww4" Apr 20 12:14:50.538667 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:50.538643 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-8hww4" Apr 20 12:14:50.842760 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:50.842669 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nm452" Apr 20 12:14:50.842929 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:50.842808 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nm452" podUID="430e704c-5d70-4df6-baaa-2296216f1239" Apr 20 12:14:51.094650 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:51.094564 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-8hww4" Apr 20 12:14:51.095168 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:51.095150 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-8hww4" Apr 20 12:14:51.842737 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:51.842492 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cz8d9" Apr 20 12:14:51.843097 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:51.842841 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cz8d9" podUID="deac4f19-5105-40df-bd7a-9d7c576cd705" Apr 20 12:14:52.101989 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:52.101893 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" event={"ID":"2da468d5-2794-41ba-8344-4246be8732d7","Type":"ContainerStarted","Data":"23e12e3dc144c3364ba5bffaec35dd8ff1e635324aba6c91527c1d4685808af6"} Apr 20 12:14:52.128493 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:52.128439 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" podStartSLOduration=6.4776240640000005 podStartE2EDuration="25.128424244s" podCreationTimestamp="2026-04-20 12:14:27 +0000 UTC" firstStartedPulling="2026-04-20 12:14:28.087628757 +0000 UTC m=+1.807642726" lastFinishedPulling="2026-04-20 12:14:46.738428933 +0000 UTC m=+20.458442906" observedRunningTime="2026-04-20 12:14:52.127769713 +0000 UTC m=+25.847783704" watchObservedRunningTime="2026-04-20 12:14:52.128424244 +0000 UTC m=+25.848438235" Apr 20 12:14:52.842332 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:52.842285 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nm452" Apr 20 12:14:52.842543 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:52.842455 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nm452" podUID="430e704c-5d70-4df6-baaa-2296216f1239" Apr 20 12:14:53.104574 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:53.104492 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:53.104574 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:53.104530 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:53.104574 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:53.104543 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:53.120136 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:53.120112 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:53.120332 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:53.120313 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dwphh" Apr 20 12:14:53.842413 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:53.842357 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cz8d9" Apr 20 12:14:53.842573 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:53.842481 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cz8d9" podUID="deac4f19-5105-40df-bd7a-9d7c576cd705" Apr 20 12:14:54.842203 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:54.842166 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nm452" Apr 20 12:14:54.842708 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:54.842271 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nm452" podUID="430e704c-5d70-4df6-baaa-2296216f1239" Apr 20 12:14:55.109801 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:55.109713 2580 generic.go:358] "Generic (PLEG): container finished" podID="0279eea6-b7aa-4e72-bec0-5aa87266cc8b" containerID="1cee405cd6bd2c2bdcacba34e2f1d91aa65d5d0477cac1d5699011bac4b0f388" exitCode=0 Apr 20 12:14:55.109801 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:55.109789 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j99mh" event={"ID":"0279eea6-b7aa-4e72-bec0-5aa87266cc8b","Type":"ContainerDied","Data":"1cee405cd6bd2c2bdcacba34e2f1d91aa65d5d0477cac1d5699011bac4b0f388"} Apr 20 12:14:55.842141 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:55.842114 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cz8d9" Apr 20 12:14:55.842277 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:55.842236 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cz8d9" podUID="deac4f19-5105-40df-bd7a-9d7c576cd705" Apr 20 12:14:56.112910 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:56.112878 2580 generic.go:358] "Generic (PLEG): container finished" podID="0279eea6-b7aa-4e72-bec0-5aa87266cc8b" containerID="53165681e5214e9070e716a240c6030b0ffcdf838500402a488a00d59431691c" exitCode=0 Apr 20 12:14:56.112910 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:56.112915 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j99mh" event={"ID":"0279eea6-b7aa-4e72-bec0-5aa87266cc8b","Type":"ContainerDied","Data":"53165681e5214e9070e716a240c6030b0ffcdf838500402a488a00d59431691c"} Apr 20 12:14:56.843728 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:56.843550 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nm452" Apr 20 12:14:56.844073 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:56.843796 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nm452" podUID="430e704c-5d70-4df6-baaa-2296216f1239" Apr 20 12:14:57.117007 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:57.116972 2580 generic.go:358] "Generic (PLEG): container finished" podID="0279eea6-b7aa-4e72-bec0-5aa87266cc8b" containerID="edd79a7565d3b1b666e2100be0357fe4480a7ae281c604b8bcb607bcbe11c975" exitCode=0 Apr 20 12:14:57.117191 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:57.117013 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j99mh" event={"ID":"0279eea6-b7aa-4e72-bec0-5aa87266cc8b","Type":"ContainerDied","Data":"edd79a7565d3b1b666e2100be0357fe4480a7ae281c604b8bcb607bcbe11c975"} Apr 20 12:14:57.842064 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:57.842027 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cz8d9" Apr 20 12:14:57.842248 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:57.842154 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cz8d9" podUID="deac4f19-5105-40df-bd7a-9d7c576cd705" Apr 20 12:14:58.842765 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:58.842727 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nm452" Apr 20 12:14:58.843172 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:58.842851 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nm452" podUID="430e704c-5d70-4df6-baaa-2296216f1239" Apr 20 12:14:59.443255 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:59.443205 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjqlk\" (UniqueName: \"kubernetes.io/projected/deac4f19-5105-40df-bd7a-9d7c576cd705-kube-api-access-qjqlk\") pod \"network-check-target-cz8d9\" (UID: \"deac4f19-5105-40df-bd7a-9d7c576cd705\") " pod="openshift-network-diagnostics/network-check-target-cz8d9" Apr 20 12:14:59.443467 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:59.443416 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 12:14:59.443467 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:59.443445 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 12:14:59.443467 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:59.443459 2580 projected.go:194] Error preparing data for projected volume kube-api-access-qjqlk for pod openshift-network-diagnostics/network-check-target-cz8d9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 12:14:59.443607 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:59.443525 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/deac4f19-5105-40df-bd7a-9d7c576cd705-kube-api-access-qjqlk podName:deac4f19-5105-40df-bd7a-9d7c576cd705 nodeName:}" failed. No retries permitted until 2026-04-20 12:15:31.443506033 +0000 UTC m=+65.163520021 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-qjqlk" (UniqueName: "kubernetes.io/projected/deac4f19-5105-40df-bd7a-9d7c576cd705-kube-api-access-qjqlk") pod "network-check-target-cz8d9" (UID: "deac4f19-5105-40df-bd7a-9d7c576cd705") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 12:14:59.544355 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:59.544311 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/430e704c-5d70-4df6-baaa-2296216f1239-metrics-certs\") pod \"network-metrics-daemon-nm452\" (UID: \"430e704c-5d70-4df6-baaa-2296216f1239\") " pod="openshift-multus/network-metrics-daemon-nm452" Apr 20 12:14:59.544561 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:59.544482 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 12:14:59.544561 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:59.544552 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/430e704c-5d70-4df6-baaa-2296216f1239-metrics-certs podName:430e704c-5d70-4df6-baaa-2296216f1239 nodeName:}" failed. No retries permitted until 2026-04-20 12:15:31.544536022 +0000 UTC m=+65.264549990 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/430e704c-5d70-4df6-baaa-2296216f1239-metrics-certs") pod "network-metrics-daemon-nm452" (UID: "430e704c-5d70-4df6-baaa-2296216f1239") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 12:14:59.842692 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:14:59.842652 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cz8d9" Apr 20 12:14:59.842880 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:14:59.842791 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cz8d9" podUID="deac4f19-5105-40df-bd7a-9d7c576cd705" Apr 20 12:15:00.843047 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:00.843014 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nm452" Apr 20 12:15:00.843459 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:15:00.843128 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nm452" podUID="430e704c-5d70-4df6-baaa-2296216f1239" Apr 20 12:15:01.842778 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:01.842743 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cz8d9" Apr 20 12:15:01.842994 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:15:01.842866 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cz8d9" podUID="deac4f19-5105-40df-bd7a-9d7c576cd705" Apr 20 12:15:02.842590 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:02.842547 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nm452" Apr 20 12:15:02.843045 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:15:02.842721 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nm452" podUID="430e704c-5d70-4df6-baaa-2296216f1239" Apr 20 12:15:03.842419 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:03.842361 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cz8d9" Apr 20 12:15:03.842582 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:15:03.842493 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cz8d9" podUID="deac4f19-5105-40df-bd7a-9d7c576cd705" Apr 20 12:15:04.132279 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:04.132241 2580 generic.go:358] "Generic (PLEG): container finished" podID="0279eea6-b7aa-4e72-bec0-5aa87266cc8b" containerID="dc29d42c91cb70483a489176ebe77b5c35b3e0ff139c00e416cc556fa1e46d48" exitCode=0 Apr 20 12:15:04.132684 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:04.132306 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j99mh" event={"ID":"0279eea6-b7aa-4e72-bec0-5aa87266cc8b","Type":"ContainerDied","Data":"dc29d42c91cb70483a489176ebe77b5c35b3e0ff139c00e416cc556fa1e46d48"} Apr 20 12:15:04.843114 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:04.843078 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nm452" Apr 20 12:15:04.843290 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:15:04.843189 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nm452" podUID="430e704c-5d70-4df6-baaa-2296216f1239" Apr 20 12:15:05.137025 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:05.136936 2580 generic.go:358] "Generic (PLEG): container finished" podID="0279eea6-b7aa-4e72-bec0-5aa87266cc8b" containerID="5867c6d9f034afcc6cabcabfadce8d95a922ff73a12d20b2cbda32b38ac9ac02" exitCode=0 Apr 20 12:15:05.137025 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:05.136972 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j99mh" event={"ID":"0279eea6-b7aa-4e72-bec0-5aa87266cc8b","Type":"ContainerDied","Data":"5867c6d9f034afcc6cabcabfadce8d95a922ff73a12d20b2cbda32b38ac9ac02"} Apr 20 12:15:05.843039 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:05.842997 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cz8d9" Apr 20 12:15:05.843217 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:15:05.843103 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cz8d9" podUID="deac4f19-5105-40df-bd7a-9d7c576cd705" Apr 20 12:15:06.141908 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:06.141824 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j99mh" event={"ID":"0279eea6-b7aa-4e72-bec0-5aa87266cc8b","Type":"ContainerStarted","Data":"6898aa296b78ee5f819a9c2c33b6a621b535b4c08121fdfc6d99c95c32a155b5"} Apr 20 12:15:06.166193 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:06.166135 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-j99mh" podStartSLOduration=4.707932264 podStartE2EDuration="40.166120964s" podCreationTimestamp="2026-04-20 12:14:26 +0000 UTC" firstStartedPulling="2026-04-20 12:14:28.015638037 +0000 UTC m=+1.735652005" lastFinishedPulling="2026-04-20 12:15:03.473826736 +0000 UTC m=+37.193840705" observedRunningTime="2026-04-20 12:15:06.16463965 +0000 UTC m=+39.884653641" watchObservedRunningTime="2026-04-20 12:15:06.166120964 +0000 UTC m=+39.886134955" Apr 20 12:15:06.843229 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:06.843191 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nm452" Apr 20 12:15:06.843454 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:15:06.843288 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nm452" podUID="430e704c-5d70-4df6-baaa-2296216f1239" Apr 20 12:15:07.843019 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:07.842975 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cz8d9" Apr 20 12:15:07.843591 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:15:07.843093 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cz8d9" podUID="deac4f19-5105-40df-bd7a-9d7c576cd705" Apr 20 12:15:08.842333 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:08.842296 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nm452" Apr 20 12:15:08.842541 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:15:08.842442 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nm452" podUID="430e704c-5d70-4df6-baaa-2296216f1239" Apr 20 12:15:09.842444 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:09.842389 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cz8d9" Apr 20 12:15:09.842831 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:15:09.842510 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cz8d9" podUID="deac4f19-5105-40df-bd7a-9d7c576cd705" Apr 20 12:15:10.842212 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:10.842173 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nm452" Apr 20 12:15:10.842388 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:15:10.842298 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nm452" podUID="430e704c-5d70-4df6-baaa-2296216f1239" Apr 20 12:15:11.842557 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:11.842522 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cz8d9" Apr 20 12:15:11.842918 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:15:11.842626 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cz8d9" podUID="deac4f19-5105-40df-bd7a-9d7c576cd705" Apr 20 12:15:12.309798 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:12.309605 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-cz8d9"] Apr 20 12:15:12.309972 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:12.309888 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cz8d9" Apr 20 12:15:12.310097 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:15:12.310001 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cz8d9" podUID="deac4f19-5105-40df-bd7a-9d7c576cd705" Apr 20 12:15:12.312120 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:12.312089 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nm452"] Apr 20 12:15:12.312261 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:12.312187 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nm452" Apr 20 12:15:12.312323 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:15:12.312283 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nm452" podUID="430e704c-5d70-4df6-baaa-2296216f1239" Apr 20 12:15:13.842850 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:13.842805 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nm452" Apr 20 12:15:13.843269 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:13.842805 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cz8d9" Apr 20 12:15:13.843269 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:15:13.842951 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nm452" podUID="430e704c-5d70-4df6-baaa-2296216f1239" Apr 20 12:15:13.843269 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:15:13.842985 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cz8d9" podUID="deac4f19-5105-40df-bd7a-9d7c576cd705" Apr 20 12:15:15.841997 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:15.841960 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cz8d9" Apr 20 12:15:15.842380 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:15.841989 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nm452" Apr 20 12:15:15.842380 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:15:15.842061 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cz8d9" podUID="deac4f19-5105-40df-bd7a-9d7c576cd705" Apr 20 12:15:15.842380 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:15:15.842182 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nm452" podUID="430e704c-5d70-4df6-baaa-2296216f1239" Apr 20 12:15:16.563416 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.563368 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-91.ec2.internal" event="NodeReady" Apr 20 12:15:16.563576 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.563498 2580 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 12:15:16.611135 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.611099 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-kffw5"] Apr 20 12:15:16.637148 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.637114 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-mlr8x"] Apr 20 12:15:16.637305 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.637281 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-kffw5" Apr 20 12:15:16.640044 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.640019 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 12:15:16.640166 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.640116 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-8c8ph\"" Apr 20 12:15:16.640166 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.640122 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 12:15:16.653244 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.653214 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mlr8x"] Apr 20 12:15:16.653244 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.653244 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-kffw5"] Apr 20 12:15:16.653429 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.653256 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-kxx9j"] Apr 20 12:15:16.653429 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.653374 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mlr8x" Apr 20 12:15:16.655979 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.655942 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-g4vct\"" Apr 20 12:15:16.655979 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.655944 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 12:15:16.655979 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.655980 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 12:15:16.656212 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.655980 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 12:15:16.676207 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.676173 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-kxx9j"] Apr 20 12:15:16.676371 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.676313 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-kxx9j" Apr 20 12:15:16.678531 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.678494 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 12:15:16.678531 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.678521 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 12:15:16.678727 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.678590 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 12:15:16.678727 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.678499 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 12:15:16.678852 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.678838 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-cqzj8\"" Apr 20 12:15:16.763885 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.763852 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/05263fd7-7f8e-489b-bd5a-837b7e49f5bb-crio-socket\") pod \"insights-runtime-extractor-kxx9j\" (UID: \"05263fd7-7f8e-489b-bd5a-837b7e49f5bb\") " pod="openshift-insights/insights-runtime-extractor-kxx9j" Apr 20 12:15:16.763885 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.763890 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/05263fd7-7f8e-489b-bd5a-837b7e49f5bb-data-volume\") pod \"insights-runtime-extractor-kxx9j\" (UID: \"05263fd7-7f8e-489b-bd5a-837b7e49f5bb\") " pod="openshift-insights/insights-runtime-extractor-kxx9j" Apr 20 12:15:16.764126 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.763910 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6tlg\" (UniqueName: 
\"kubernetes.io/projected/0b7a32b8-6e66-44fe-b5c2-72348a1935b3-kube-api-access-t6tlg\") pod \"dns-default-kffw5\" (UID: \"0b7a32b8-6e66-44fe-b5c2-72348a1935b3\") " pod="openshift-dns/dns-default-kffw5" Apr 20 12:15:16.764126 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.763974 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhk2j\" (UniqueName: \"kubernetes.io/projected/24ad10ec-2768-4402-b312-c7462cdbf063-kube-api-access-xhk2j\") pod \"ingress-canary-mlr8x\" (UID: \"24ad10ec-2768-4402-b312-c7462cdbf063\") " pod="openshift-ingress-canary/ingress-canary-mlr8x" Apr 20 12:15:16.764126 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.764014 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b7a32b8-6e66-44fe-b5c2-72348a1935b3-metrics-tls\") pod \"dns-default-kffw5\" (UID: \"0b7a32b8-6e66-44fe-b5c2-72348a1935b3\") " pod="openshift-dns/dns-default-kffw5" Apr 20 12:15:16.764126 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.764060 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nrpw\" (UniqueName: \"kubernetes.io/projected/05263fd7-7f8e-489b-bd5a-837b7e49f5bb-kube-api-access-7nrpw\") pod \"insights-runtime-extractor-kxx9j\" (UID: \"05263fd7-7f8e-489b-bd5a-837b7e49f5bb\") " pod="openshift-insights/insights-runtime-extractor-kxx9j" Apr 20 12:15:16.764126 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.764105 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/05263fd7-7f8e-489b-bd5a-837b7e49f5bb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kxx9j\" (UID: \"05263fd7-7f8e-489b-bd5a-837b7e49f5bb\") " pod="openshift-insights/insights-runtime-extractor-kxx9j" Apr 20 12:15:16.764126 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.764122 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0b7a32b8-6e66-44fe-b5c2-72348a1935b3-tmp-dir\") pod \"dns-default-kffw5\" (UID: \"0b7a32b8-6e66-44fe-b5c2-72348a1935b3\") " pod="openshift-dns/dns-default-kffw5" Apr 20 12:15:16.764368 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.764170 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/05263fd7-7f8e-489b-bd5a-837b7e49f5bb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kxx9j\" (UID: \"05263fd7-7f8e-489b-bd5a-837b7e49f5bb\") " pod="openshift-insights/insights-runtime-extractor-kxx9j" Apr 20 12:15:16.764368 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.764202 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24ad10ec-2768-4402-b312-c7462cdbf063-cert\") pod \"ingress-canary-mlr8x\" (UID: \"24ad10ec-2768-4402-b312-c7462cdbf063\") " pod="openshift-ingress-canary/ingress-canary-mlr8x" Apr 20 12:15:16.764368 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.764282 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b7a32b8-6e66-44fe-b5c2-72348a1935b3-config-volume\") pod \"dns-default-kffw5\" (UID: 
\"0b7a32b8-6e66-44fe-b5c2-72348a1935b3\") " pod="openshift-dns/dns-default-kffw5" Apr 20 12:15:16.865472 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.865359 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7nrpw\" (UniqueName: \"kubernetes.io/projected/05263fd7-7f8e-489b-bd5a-837b7e49f5bb-kube-api-access-7nrpw\") pod \"insights-runtime-extractor-kxx9j\" (UID: \"05263fd7-7f8e-489b-bd5a-837b7e49f5bb\") " pod="openshift-insights/insights-runtime-extractor-kxx9j" Apr 20 12:15:16.865472 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.865430 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/05263fd7-7f8e-489b-bd5a-837b7e49f5bb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kxx9j\" (UID: \"05263fd7-7f8e-489b-bd5a-837b7e49f5bb\") " pod="openshift-insights/insights-runtime-extractor-kxx9j" Apr 20 12:15:16.865472 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.865449 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0b7a32b8-6e66-44fe-b5c2-72348a1935b3-tmp-dir\") pod \"dns-default-kffw5\" (UID: \"0b7a32b8-6e66-44fe-b5c2-72348a1935b3\") " pod="openshift-dns/dns-default-kffw5" Apr 20 12:15:16.866283 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.865486 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/05263fd7-7f8e-489b-bd5a-837b7e49f5bb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kxx9j\" (UID: \"05263fd7-7f8e-489b-bd5a-837b7e49f5bb\") " pod="openshift-insights/insights-runtime-extractor-kxx9j" Apr 20 12:15:16.866283 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.865502 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24ad10ec-2768-4402-b312-c7462cdbf063-cert\") pod \"ingress-canary-mlr8x\" (UID: \"24ad10ec-2768-4402-b312-c7462cdbf063\") " pod="openshift-ingress-canary/ingress-canary-mlr8x" Apr 20 12:15:16.866283 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.865524 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b7a32b8-6e66-44fe-b5c2-72348a1935b3-config-volume\") pod \"dns-default-kffw5\" (UID: \"0b7a32b8-6e66-44fe-b5c2-72348a1935b3\") " pod="openshift-dns/dns-default-kffw5" Apr 20 12:15:16.866283 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.865659 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/05263fd7-7f8e-489b-bd5a-837b7e49f5bb-crio-socket\") pod \"insights-runtime-extractor-kxx9j\" (UID: \"05263fd7-7f8e-489b-bd5a-837b7e49f5bb\") " pod="openshift-insights/insights-runtime-extractor-kxx9j" Apr 20 12:15:16.866283 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.865691 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/05263fd7-7f8e-489b-bd5a-837b7e49f5bb-data-volume\") pod \"insights-runtime-extractor-kxx9j\" (UID: \"05263fd7-7f8e-489b-bd5a-837b7e49f5bb\") " pod="openshift-insights/insights-runtime-extractor-kxx9j" Apr 20 12:15:16.866283 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.865714 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-t6tlg\" (UniqueName: \"kubernetes.io/projected/0b7a32b8-6e66-44fe-b5c2-72348a1935b3-kube-api-access-t6tlg\") pod \"dns-default-kffw5\" (UID: \"0b7a32b8-6e66-44fe-b5c2-72348a1935b3\") " pod="openshift-dns/dns-default-kffw5" Apr 20 12:15:16.866283 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.865738 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xhk2j\" (UniqueName: \"kubernetes.io/projected/24ad10ec-2768-4402-b312-c7462cdbf063-kube-api-access-xhk2j\") pod \"ingress-canary-mlr8x\" (UID: \"24ad10ec-2768-4402-b312-c7462cdbf063\") " pod="openshift-ingress-canary/ingress-canary-mlr8x" Apr 20 12:15:16.866283 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.865918 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0b7a32b8-6e66-44fe-b5c2-72348a1935b3-tmp-dir\") pod \"dns-default-kffw5\" (UID: \"0b7a32b8-6e66-44fe-b5c2-72348a1935b3\") " pod="openshift-dns/dns-default-kffw5" Apr 20 12:15:16.866283 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.866030 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/05263fd7-7f8e-489b-bd5a-837b7e49f5bb-data-volume\") pod \"insights-runtime-extractor-kxx9j\" (UID: \"05263fd7-7f8e-489b-bd5a-837b7e49f5bb\") " pod="openshift-insights/insights-runtime-extractor-kxx9j" Apr 20 12:15:16.866283 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.866052 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/05263fd7-7f8e-489b-bd5a-837b7e49f5bb-crio-socket\") pod \"insights-runtime-extractor-kxx9j\" (UID: \"05263fd7-7f8e-489b-bd5a-837b7e49f5bb\") " pod="openshift-insights/insights-runtime-extractor-kxx9j" Apr 20 12:15:16.866283 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.866078 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b7a32b8-6e66-44fe-b5c2-72348a1935b3-metrics-tls\") pod \"dns-default-kffw5\" (UID: \"0b7a32b8-6e66-44fe-b5c2-72348a1935b3\") " pod="openshift-dns/dns-default-kffw5" Apr 20 12:15:16.866283 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.866209 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b7a32b8-6e66-44fe-b5c2-72348a1935b3-config-volume\") pod \"dns-default-kffw5\" (UID: \"0b7a32b8-6e66-44fe-b5c2-72348a1935b3\") " pod="openshift-dns/dns-default-kffw5" Apr 20 12:15:16.869717 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.869682 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/05263fd7-7f8e-489b-bd5a-837b7e49f5bb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kxx9j\" (UID: \"05263fd7-7f8e-489b-bd5a-837b7e49f5bb\") " pod="openshift-insights/insights-runtime-extractor-kxx9j" Apr 20 12:15:16.869841 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.869717 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b7a32b8-6e66-44fe-b5c2-72348a1935b3-metrics-tls\") pod \"dns-default-kffw5\" (UID: \"0b7a32b8-6e66-44fe-b5c2-72348a1935b3\") " pod="openshift-dns/dns-default-kffw5" Apr 20 12:15:16.869841 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.869721 2580 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24ad10ec-2768-4402-b312-c7462cdbf063-cert\") pod \"ingress-canary-mlr8x\" (UID: \"24ad10ec-2768-4402-b312-c7462cdbf063\") " pod="openshift-ingress-canary/ingress-canary-mlr8x" Apr 20 12:15:16.873331 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.873307 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhk2j\" (UniqueName: \"kubernetes.io/projected/24ad10ec-2768-4402-b312-c7462cdbf063-kube-api-access-xhk2j\") pod \"ingress-canary-mlr8x\" (UID: \"24ad10ec-2768-4402-b312-c7462cdbf063\") " pod="openshift-ingress-canary/ingress-canary-mlr8x" Apr 20 12:15:16.873473 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.873383 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nrpw\" (UniqueName: \"kubernetes.io/projected/05263fd7-7f8e-489b-bd5a-837b7e49f5bb-kube-api-access-7nrpw\") pod \"insights-runtime-extractor-kxx9j\" (UID: \"05263fd7-7f8e-489b-bd5a-837b7e49f5bb\") " pod="openshift-insights/insights-runtime-extractor-kxx9j" Apr 20 12:15:16.873532 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.873477 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6tlg\" (UniqueName: \"kubernetes.io/projected/0b7a32b8-6e66-44fe-b5c2-72348a1935b3-kube-api-access-t6tlg\") pod \"dns-default-kffw5\" (UID: \"0b7a32b8-6e66-44fe-b5c2-72348a1935b3\") " pod="openshift-dns/dns-default-kffw5" Apr 20 12:15:16.877186 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.877163 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/05263fd7-7f8e-489b-bd5a-837b7e49f5bb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kxx9j\" (UID: \"05263fd7-7f8e-489b-bd5a-837b7e49f5bb\") " pod="openshift-insights/insights-runtime-extractor-kxx9j" Apr 20 12:15:16.949051 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.949013 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-kffw5" Apr 20 12:15:16.961972 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.961946 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mlr8x" Apr 20 12:15:16.985315 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:16.985283 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-kxx9j" Apr 20 12:15:17.149639 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:17.149245 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mlr8x"] Apr 20 12:15:17.151026 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:17.150999 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-kffw5"] Apr 20 12:15:17.151220 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:17.151198 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-kxx9j"] Apr 20 12:15:17.155146 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:15:17.155114 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24ad10ec_2768_4402_b312_c7462cdbf063.slice/crio-a5afaa94c96e9b7ca20e94b92d2e377a57d861f8eb3437437604c9c2e3113dc0 WatchSource:0}: Error finding container a5afaa94c96e9b7ca20e94b92d2e377a57d861f8eb3437437604c9c2e3113dc0: Status 404 returned error can't find the container with id a5afaa94c96e9b7ca20e94b92d2e377a57d861f8eb3437437604c9c2e3113dc0 Apr 20 12:15:17.155456 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:15:17.155437 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b7a32b8_6e66_44fe_b5c2_72348a1935b3.slice/crio-2c6fbfda69a3e5f176833f245f8acddc32bc5224367a36962871ce45994428f5 WatchSource:0}: Error finding container 2c6fbfda69a3e5f176833f245f8acddc32bc5224367a36962871ce45994428f5: Status 404 returned error can't find the container with id 2c6fbfda69a3e5f176833f245f8acddc32bc5224367a36962871ce45994428f5 Apr 20 12:15:17.156117 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:15:17.156097 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05263fd7_7f8e_489b_bd5a_837b7e49f5bb.slice/crio-f19cf567922c169441991548dcf70261ddacc09658e76f1ab60de9c055b262b0 WatchSource:0}: Error finding container f19cf567922c169441991548dcf70261ddacc09658e76f1ab60de9c055b262b0: Status 404 returned error can't find the container with id f19cf567922c169441991548dcf70261ddacc09658e76f1ab60de9c055b262b0 Apr 20 12:15:17.162412 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:17.162366 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mlr8x" event={"ID":"24ad10ec-2768-4402-b312-c7462cdbf063","Type":"ContainerStarted","Data":"a5afaa94c96e9b7ca20e94b92d2e377a57d861f8eb3437437604c9c2e3113dc0"} Apr 20 12:15:17.163411 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:17.163375 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kffw5" event={"ID":"0b7a32b8-6e66-44fe-b5c2-72348a1935b3","Type":"ContainerStarted","Data":"2c6fbfda69a3e5f176833f245f8acddc32bc5224367a36962871ce45994428f5"} Apr 20 12:15:17.164287 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:17.164266 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kxx9j" event={"ID":"05263fd7-7f8e-489b-bd5a-837b7e49f5bb","Type":"ContainerStarted","Data":"f19cf567922c169441991548dcf70261ddacc09658e76f1ab60de9c055b262b0"} Apr 20 12:15:17.842523 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:17.842480 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nm452" Apr 20 12:15:17.842736 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:17.842717 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cz8d9" Apr 20 12:15:17.845530 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:17.845506 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 12:15:17.846597 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:17.846569 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-p76zv\"" Apr 20 12:15:17.846724 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:17.846610 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kwdzh\"" Apr 20 12:15:17.846724 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:17.846646 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 12:15:17.847105 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:17.846886 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 12:15:18.167417 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:18.167316 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kxx9j" event={"ID":"05263fd7-7f8e-489b-bd5a-837b7e49f5bb","Type":"ContainerStarted","Data":"b610c6916c1085c78dcb3fcaf0a5a2b97562e83bca9035581edfac5de7c1fb1f"} Apr 20 12:15:20.173027 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:20.172792 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kffw5" event={"ID":"0b7a32b8-6e66-44fe-b5c2-72348a1935b3","Type":"ContainerStarted","Data":"ba8621c9748dbc1c3cd3f59cee9bb96254ba3a55d61672bd231e126233ff5d4c"} Apr 20 12:15:20.173578 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:20.173047 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-kffw5" Apr 20 12:15:20.173578 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:20.173064 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kffw5" event={"ID":"0b7a32b8-6e66-44fe-b5c2-72348a1935b3","Type":"ContainerStarted","Data":"544b447d30497a9a1fc3a11cde07c2b558f1d511ddf43231e76bd5bfac2a8b29"} Apr 20 12:15:20.174669 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:20.174638 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kxx9j" event={"ID":"05263fd7-7f8e-489b-bd5a-837b7e49f5bb","Type":"ContainerStarted","Data":"94c30c13adac8e2ce81336eb28068deccde50a5846dedb38723178098557cb00"} Apr 20 12:15:20.176012 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:20.175984 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mlr8x" event={"ID":"24ad10ec-2768-4402-b312-c7462cdbf063","Type":"ContainerStarted","Data":"cfa66ebac4f4c7818930b670aba4e1a76b8ba545fa42cb7bcfa7ef86ecaa3e3e"} Apr 20 12:15:20.192438 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:20.192362 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-kffw5" podStartSLOduration=2.210484397 podStartE2EDuration="4.192343674s" 
podCreationTimestamp="2026-04-20 12:15:16 +0000 UTC" firstStartedPulling="2026-04-20 12:15:17.157459734 +0000 UTC m=+50.877473704" lastFinishedPulling="2026-04-20 12:15:19.139318998 +0000 UTC m=+52.859332981" observedRunningTime="2026-04-20 12:15:20.191245553 +0000 UTC m=+53.911259566" watchObservedRunningTime="2026-04-20 12:15:20.192343674 +0000 UTC m=+53.912357668" Apr 20 12:15:20.207290 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:20.207232 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-mlr8x" podStartSLOduration=2.221354003 podStartE2EDuration="4.20721535s" podCreationTimestamp="2026-04-20 12:15:16 +0000 UTC" firstStartedPulling="2026-04-20 12:15:17.157080202 +0000 UTC m=+50.877094172" lastFinishedPulling="2026-04-20 12:15:19.142941545 +0000 UTC m=+52.862955519" observedRunningTime="2026-04-20 12:15:20.206467937 +0000 UTC m=+53.926481923" watchObservedRunningTime="2026-04-20 12:15:20.20721535 +0000 UTC m=+53.927229341" Apr 20 12:15:20.889626 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:20.889596 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-2p7tn"] Apr 20 12:15:20.892515 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:20.892498 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2p7tn" Apr 20 12:15:20.894801 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:20.894779 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 12:15:20.894801 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:20.894798 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 20 12:15:20.895042 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:20.895013 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 12:15:20.895141 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:20.895053 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-zklrk\"" Apr 20 12:15:20.895141 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:20.895075 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 20 12:15:20.895635 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:20.895620 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 12:15:20.904971 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:20.904950 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-2p7tn"] Apr 20 12:15:20.929553 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:20.929521 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-4rfp8"] Apr 20 12:15:20.932674 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:20.932579 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-4rfp8" Apr 20 12:15:20.933390 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:20.933371 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-q8rbm"] Apr 20 12:15:20.934732 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:20.934714 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 12:15:20.934947 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:20.934930 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 12:15:20.935228 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:20.935201 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 12:15:20.935416 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:20.935370 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-jsw86\"" Apr 20 12:15:20.936192 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:20.936177 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-q8rbm" Apr 20 12:15:20.938429 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:20.938388 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 20 12:15:20.938541 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:20.938443 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 20 12:15:20.938541 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:20.938515 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-9w94x\"" Apr 20 12:15:20.938748 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:20.938673 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 20 12:15:20.946551 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:20.946533 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-q8rbm"] Apr 20 12:15:20.997084 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:20.997052 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4gwd\" (UniqueName: \"kubernetes.io/projected/3f8898a4-1988-4c88-81d7-7b424aaf64e7-kube-api-access-s4gwd\") pod \"openshift-state-metrics-9d44df66c-2p7tn\" (UID: \"3f8898a4-1988-4c88-81d7-7b424aaf64e7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2p7tn" Apr 20 12:15:20.997084 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:20.997085 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3f8898a4-1988-4c88-81d7-7b424aaf64e7-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-2p7tn\" (UID: \"3f8898a4-1988-4c88-81d7-7b424aaf64e7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2p7tn" Apr 20 12:15:20.997274 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:20.997186 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f8898a4-1988-4c88-81d7-7b424aaf64e7-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-2p7tn\" (UID: \"3f8898a4-1988-4c88-81d7-7b424aaf64e7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2p7tn" Apr 20 12:15:20.997274 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:20.997215 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3f8898a4-1988-4c88-81d7-7b424aaf64e7-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-2p7tn\" (UID: \"3f8898a4-1988-4c88-81d7-7b424aaf64e7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2p7tn" Apr 20 12:15:21.097767 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.097736 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9c983cd8-5455-48d0-a6b5-0f8277cb2ca9-node-exporter-accelerators-collector-config\") pod \"node-exporter-4rfp8\" (UID: \"9c983cd8-5455-48d0-a6b5-0f8277cb2ca9\") " pod="openshift-monitoring/node-exporter-4rfp8" Apr 20 12:15:21.097767 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.097773 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtd56\" (UniqueName: \"kubernetes.io/projected/09b97c75-b838-4128-bb6f-d6a54d1cc11e-kube-api-access-vtd56\") pod \"kube-state-metrics-69db897b98-q8rbm\" (UID: \"09b97c75-b838-4128-bb6f-d6a54d1cc11e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-q8rbm" Apr 20 12:15:21.097967 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.097843 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/09b97c75-b838-4128-bb6f-d6a54d1cc11e-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-q8rbm\" (UID: \"09b97c75-b838-4128-bb6f-d6a54d1cc11e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-q8rbm" Apr 20 12:15:21.097967 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.097867 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/09b97c75-b838-4128-bb6f-d6a54d1cc11e-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-q8rbm\" (UID: \"09b97c75-b838-4128-bb6f-d6a54d1cc11e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-q8rbm" Apr 20 12:15:21.097967 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.097895 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9c983cd8-5455-48d0-a6b5-0f8277cb2ca9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4rfp8\" (UID: \"9c983cd8-5455-48d0-a6b5-0f8277cb2ca9\") " pod="openshift-monitoring/node-exporter-4rfp8" Apr 20 12:15:21.097967 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.097914 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9c983cd8-5455-48d0-a6b5-0f8277cb2ca9-sys\") pod \"node-exporter-4rfp8\" (UID: 
\"9c983cd8-5455-48d0-a6b5-0f8277cb2ca9\") " pod="openshift-monitoring/node-exporter-4rfp8" Apr 20 12:15:21.098092 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.097971 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/09b97c75-b838-4128-bb6f-d6a54d1cc11e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-q8rbm\" (UID: \"09b97c75-b838-4128-bb6f-d6a54d1cc11e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-q8rbm" Apr 20 12:15:21.098092 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.098009 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/09b97c75-b838-4128-bb6f-d6a54d1cc11e-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-q8rbm\" (UID: \"09b97c75-b838-4128-bb6f-d6a54d1cc11e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-q8rbm" Apr 20 12:15:21.098092 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.098030 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9c983cd8-5455-48d0-a6b5-0f8277cb2ca9-node-exporter-textfile\") pod \"node-exporter-4rfp8\" (UID: \"9c983cd8-5455-48d0-a6b5-0f8277cb2ca9\") " pod="openshift-monitoring/node-exporter-4rfp8" Apr 20 12:15:21.098092 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.098055 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f8898a4-1988-4c88-81d7-7b424aaf64e7-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-2p7tn\" (UID: \"3f8898a4-1988-4c88-81d7-7b424aaf64e7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2p7tn" Apr 20 12:15:21.098092 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.098077 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6dmk\" (UniqueName: \"kubernetes.io/projected/9c983cd8-5455-48d0-a6b5-0f8277cb2ca9-kube-api-access-m6dmk\") pod \"node-exporter-4rfp8\" (UID: \"9c983cd8-5455-48d0-a6b5-0f8277cb2ca9\") " pod="openshift-monitoring/node-exporter-4rfp8" Apr 20 12:15:21.098238 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.098102 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9c983cd8-5455-48d0-a6b5-0f8277cb2ca9-metrics-client-ca\") pod \"node-exporter-4rfp8\" (UID: \"9c983cd8-5455-48d0-a6b5-0f8277cb2ca9\") " pod="openshift-monitoring/node-exporter-4rfp8" Apr 20 12:15:21.098238 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.098122 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3f8898a4-1988-4c88-81d7-7b424aaf64e7-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-2p7tn\" (UID: \"3f8898a4-1988-4c88-81d7-7b424aaf64e7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2p7tn" Apr 20 12:15:21.098238 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.098149 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9c983cd8-5455-48d0-a6b5-0f8277cb2ca9-node-exporter-tls\") pod \"node-exporter-4rfp8\" (UID: \"9c983cd8-5455-48d0-a6b5-0f8277cb2ca9\") " pod="openshift-monitoring/node-exporter-4rfp8" Apr 20 12:15:21.098238 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.098165 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/09b97c75-b838-4128-bb6f-d6a54d1cc11e-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-q8rbm\" (UID: \"09b97c75-b838-4128-bb6f-d6a54d1cc11e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-q8rbm" Apr 20 12:15:21.098238 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.098192 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s4gwd\" (UniqueName: \"kubernetes.io/projected/3f8898a4-1988-4c88-81d7-7b424aaf64e7-kube-api-access-s4gwd\") pod \"openshift-state-metrics-9d44df66c-2p7tn\" (UID: \"3f8898a4-1988-4c88-81d7-7b424aaf64e7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2p7tn" Apr 20 12:15:21.098238 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.098231 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9c983cd8-5455-48d0-a6b5-0f8277cb2ca9-root\") pod \"node-exporter-4rfp8\" (UID: \"9c983cd8-5455-48d0-a6b5-0f8277cb2ca9\") " pod="openshift-monitoring/node-exporter-4rfp8" Apr 20 12:15:21.098444 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.098254 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3f8898a4-1988-4c88-81d7-7b424aaf64e7-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-2p7tn\" (UID: \"3f8898a4-1988-4c88-81d7-7b424aaf64e7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2p7tn" Apr 20 12:15:21.098444 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.098295 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9c983cd8-5455-48d0-a6b5-0f8277cb2ca9-node-exporter-wtmp\") pod \"node-exporter-4rfp8\" (UID: \"9c983cd8-5455-48d0-a6b5-0f8277cb2ca9\") " pod="openshift-monitoring/node-exporter-4rfp8" Apr 20 12:15:21.098934 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.098915 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3f8898a4-1988-4c88-81d7-7b424aaf64e7-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-2p7tn\" (UID: \"3f8898a4-1988-4c88-81d7-7b424aaf64e7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2p7tn" Apr 20 12:15:21.102125 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.102102 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f8898a4-1988-4c88-81d7-7b424aaf64e7-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-2p7tn\" (UID: \"3f8898a4-1988-4c88-81d7-7b424aaf64e7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2p7tn" Apr 20 12:15:21.102237 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.102197 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/3f8898a4-1988-4c88-81d7-7b424aaf64e7-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-2p7tn\" (UID: \"3f8898a4-1988-4c88-81d7-7b424aaf64e7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2p7tn" Apr 20 12:15:21.106155 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.106132 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4gwd\" (UniqueName: \"kubernetes.io/projected/3f8898a4-1988-4c88-81d7-7b424aaf64e7-kube-api-access-s4gwd\") pod \"openshift-state-metrics-9d44df66c-2p7tn\" (UID: \"3f8898a4-1988-4c88-81d7-7b424aaf64e7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2p7tn" Apr 20 12:15:21.180246 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.180149 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kxx9j" event={"ID":"05263fd7-7f8e-489b-bd5a-837b7e49f5bb","Type":"ContainerStarted","Data":"065fa2ee5a80d6d6de920a070bdbc92fa3d11a0bedc31016c934015fd4672ecf"} Apr 20 12:15:21.199247 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.199220 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9c983cd8-5455-48d0-a6b5-0f8277cb2ca9-node-exporter-tls\") pod \"node-exporter-4rfp8\" (UID: \"9c983cd8-5455-48d0-a6b5-0f8277cb2ca9\") " pod="openshift-monitoring/node-exporter-4rfp8" Apr 20 12:15:21.199348 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.199256 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/09b97c75-b838-4128-bb6f-d6a54d1cc11e-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-q8rbm\" (UID: \"09b97c75-b838-4128-bb6f-d6a54d1cc11e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-q8rbm" Apr 20 12:15:21.199348 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.199276 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9c983cd8-5455-48d0-a6b5-0f8277cb2ca9-root\") pod \"node-exporter-4rfp8\" (UID: \"9c983cd8-5455-48d0-a6b5-0f8277cb2ca9\") " pod="openshift-monitoring/node-exporter-4rfp8" Apr 20 12:15:21.199348 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.199309 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9c983cd8-5455-48d0-a6b5-0f8277cb2ca9-node-exporter-wtmp\") pod \"node-exporter-4rfp8\" (UID: \"9c983cd8-5455-48d0-a6b5-0f8277cb2ca9\") " pod="openshift-monitoring/node-exporter-4rfp8" Apr 20 12:15:21.199348 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.199333 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9c983cd8-5455-48d0-a6b5-0f8277cb2ca9-node-exporter-accelerators-collector-config\") pod \"node-exporter-4rfp8\" (UID: \"9c983cd8-5455-48d0-a6b5-0f8277cb2ca9\") " pod="openshift-monitoring/node-exporter-4rfp8" Apr 20 12:15:21.199621 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.199359 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vtd56\" (UniqueName: \"kubernetes.io/projected/09b97c75-b838-4128-bb6f-d6a54d1cc11e-kube-api-access-vtd56\") pod \"kube-state-metrics-69db897b98-q8rbm\" (UID: \"09b97c75-b838-4128-bb6f-d6a54d1cc11e\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-q8rbm" Apr 20 12:15:21.199621 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.199383 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9c983cd8-5455-48d0-a6b5-0f8277cb2ca9-root\") pod \"node-exporter-4rfp8\" (UID: \"9c983cd8-5455-48d0-a6b5-0f8277cb2ca9\") " pod="openshift-monitoring/node-exporter-4rfp8" Apr 20 12:15:21.199621 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.199420 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/09b97c75-b838-4128-bb6f-d6a54d1cc11e-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-q8rbm\" (UID: \"09b97c75-b838-4128-bb6f-d6a54d1cc11e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-q8rbm" Apr 20 12:15:21.199621 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.199449 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/09b97c75-b838-4128-bb6f-d6a54d1cc11e-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-q8rbm\" (UID: \"09b97c75-b838-4128-bb6f-d6a54d1cc11e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-q8rbm" Apr 20 12:15:21.199621 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.199478 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9c983cd8-5455-48d0-a6b5-0f8277cb2ca9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4rfp8\" (UID: \"9c983cd8-5455-48d0-a6b5-0f8277cb2ca9\") " pod="openshift-monitoring/node-exporter-4rfp8" Apr 20 12:15:21.199621 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.199516 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9c983cd8-5455-48d0-a6b5-0f8277cb2ca9-sys\") pod \"node-exporter-4rfp8\" (UID: \"9c983cd8-5455-48d0-a6b5-0f8277cb2ca9\") " pod="openshift-monitoring/node-exporter-4rfp8" Apr 20 12:15:21.199621 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.199556 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/09b97c75-b838-4128-bb6f-d6a54d1cc11e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-q8rbm\" (UID: \"09b97c75-b838-4128-bb6f-d6a54d1cc11e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-q8rbm" Apr 20 12:15:21.199621 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.199585 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/09b97c75-b838-4128-bb6f-d6a54d1cc11e-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-q8rbm\" (UID: \"09b97c75-b838-4128-bb6f-d6a54d1cc11e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-q8rbm" Apr 20 12:15:21.199621 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.199611 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9c983cd8-5455-48d0-a6b5-0f8277cb2ca9-node-exporter-textfile\") pod \"node-exporter-4rfp8\" (UID: \"9c983cd8-5455-48d0-a6b5-0f8277cb2ca9\") " pod="openshift-monitoring/node-exporter-4rfp8" Apr 20 
12:15:21.200045 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.199678 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6dmk\" (UniqueName: \"kubernetes.io/projected/9c983cd8-5455-48d0-a6b5-0f8277cb2ca9-kube-api-access-m6dmk\") pod \"node-exporter-4rfp8\" (UID: \"9c983cd8-5455-48d0-a6b5-0f8277cb2ca9\") " pod="openshift-monitoring/node-exporter-4rfp8" Apr 20 12:15:21.200045 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.199704 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9c983cd8-5455-48d0-a6b5-0f8277cb2ca9-metrics-client-ca\") pod \"node-exporter-4rfp8\" (UID: \"9c983cd8-5455-48d0-a6b5-0f8277cb2ca9\") " pod="openshift-monitoring/node-exporter-4rfp8" Apr 20 12:15:21.200045 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.199780 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/09b97c75-b838-4128-bb6f-d6a54d1cc11e-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-q8rbm\" (UID: \"09b97c75-b838-4128-bb6f-d6a54d1cc11e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-q8rbm" Apr 20 12:15:21.200198 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.200117 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9c983cd8-5455-48d0-a6b5-0f8277cb2ca9-node-exporter-accelerators-collector-config\") pod \"node-exporter-4rfp8\" (UID: \"9c983cd8-5455-48d0-a6b5-0f8277cb2ca9\") " pod="openshift-monitoring/node-exporter-4rfp8" Apr 20 12:15:21.200198 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.200146 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/09b97c75-b838-4128-bb6f-d6a54d1cc11e-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-q8rbm\" (UID: \"09b97c75-b838-4128-bb6f-d6a54d1cc11e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-q8rbm" Apr 20 12:15:21.200198 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.200179 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9c983cd8-5455-48d0-a6b5-0f8277cb2ca9-metrics-client-ca\") pod \"node-exporter-4rfp8\" (UID: \"9c983cd8-5455-48d0-a6b5-0f8277cb2ca9\") " pod="openshift-monitoring/node-exporter-4rfp8" Apr 20 12:15:21.200350 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.200239 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9c983cd8-5455-48d0-a6b5-0f8277cb2ca9-sys\") pod \"node-exporter-4rfp8\" (UID: \"9c983cd8-5455-48d0-a6b5-0f8277cb2ca9\") " pod="openshift-monitoring/node-exporter-4rfp8" Apr 20 12:15:21.200350 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.199513 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9c983cd8-5455-48d0-a6b5-0f8277cb2ca9-node-exporter-wtmp\") pod \"node-exporter-4rfp8\" (UID: \"9c983cd8-5455-48d0-a6b5-0f8277cb2ca9\") " pod="openshift-monitoring/node-exporter-4rfp8" Apr 20 12:15:21.200350 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.200322 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/9c983cd8-5455-48d0-a6b5-0f8277cb2ca9-node-exporter-textfile\") pod \"node-exporter-4rfp8\" (UID: \"9c983cd8-5455-48d0-a6b5-0f8277cb2ca9\") " pod="openshift-monitoring/node-exporter-4rfp8" Apr 20 12:15:21.200501 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:15:21.200445 2580 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 20 12:15:21.200551 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:15:21.200507 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09b97c75-b838-4128-bb6f-d6a54d1cc11e-kube-state-metrics-tls podName:09b97c75-b838-4128-bb6f-d6a54d1cc11e nodeName:}" failed. No retries permitted until 2026-04-20 12:15:21.700487783 +0000 UTC m=+55.420501767 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/09b97c75-b838-4128-bb6f-d6a54d1cc11e-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-q8rbm" (UID: "09b97c75-b838-4128-bb6f-d6a54d1cc11e") : secret "kube-state-metrics-tls" not found Apr 20 12:15:21.200819 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.200795 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/09b97c75-b838-4128-bb6f-d6a54d1cc11e-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-q8rbm\" (UID: \"09b97c75-b838-4128-bb6f-d6a54d1cc11e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-q8rbm" Apr 20 12:15:21.201862 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.201837 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9c983cd8-5455-48d0-a6b5-0f8277cb2ca9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4rfp8\" (UID: \"9c983cd8-5455-48d0-a6b5-0f8277cb2ca9\") " pod="openshift-monitoring/node-exporter-4rfp8" Apr 20 12:15:21.202011 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.201976 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-kxx9j" podStartSLOduration=1.989691066 podStartE2EDuration="5.201965627s" podCreationTimestamp="2026-04-20 12:15:16 +0000 UTC" firstStartedPulling="2026-04-20 12:15:17.26902538 +0000 UTC m=+50.989039349" lastFinishedPulling="2026-04-20 12:15:20.481299941 +0000 UTC m=+54.201313910" observedRunningTime="2026-04-20 12:15:21.199321715 +0000 UTC m=+54.919335729" watchObservedRunningTime="2026-04-20 12:15:21.201965627 +0000 UTC m=+54.921979636" Apr 20 12:15:21.202250 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.202228 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9c983cd8-5455-48d0-a6b5-0f8277cb2ca9-node-exporter-tls\") pod \"node-exporter-4rfp8\" (UID: \"9c983cd8-5455-48d0-a6b5-0f8277cb2ca9\") " pod="openshift-monitoring/node-exporter-4rfp8" Apr 20 12:15:21.203341 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.203310 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2p7tn" Apr 20 12:15:21.206930 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.206910 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/09b97c75-b838-4128-bb6f-d6a54d1cc11e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-q8rbm\" (UID: \"09b97c75-b838-4128-bb6f-d6a54d1cc11e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-q8rbm" Apr 20 12:15:21.222319 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.222283 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtd56\" (UniqueName: \"kubernetes.io/projected/09b97c75-b838-4128-bb6f-d6a54d1cc11e-kube-api-access-vtd56\") pod \"kube-state-metrics-69db897b98-q8rbm\" (UID: \"09b97c75-b838-4128-bb6f-d6a54d1cc11e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-q8rbm" Apr 20 12:15:21.224785 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.224764 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6dmk\" (UniqueName: \"kubernetes.io/projected/9c983cd8-5455-48d0-a6b5-0f8277cb2ca9-kube-api-access-m6dmk\") pod \"node-exporter-4rfp8\" (UID: \"9c983cd8-5455-48d0-a6b5-0f8277cb2ca9\") " pod="openshift-monitoring/node-exporter-4rfp8" Apr 20 12:15:21.243012 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.242979 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-4rfp8" Apr 20 12:15:21.250751 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:15:21.250697 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c983cd8_5455_48d0_a6b5_0f8277cb2ca9.slice/crio-4ff6452fa457d580d1762694966d0dd63adca2906838bd9026a4207fa658b0f3 WatchSource:0}: Error finding container 4ff6452fa457d580d1762694966d0dd63adca2906838bd9026a4207fa658b0f3: Status 404 returned error can't find the container with id 4ff6452fa457d580d1762694966d0dd63adca2906838bd9026a4207fa658b0f3 Apr 20 12:15:21.326162 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.326126 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-2p7tn"] Apr 20 12:15:21.329635 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:15:21.329597 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f8898a4_1988_4c88_81d7_7b424aaf64e7.slice/crio-513a89cf60e513d4d01832f406b2dfc873a999288d1684e611403de9d93c5572 WatchSource:0}: Error finding container 513a89cf60e513d4d01832f406b2dfc873a999288d1684e611403de9d93c5572: Status 404 returned error can't find the container with id 513a89cf60e513d4d01832f406b2dfc873a999288d1684e611403de9d93c5572 Apr 20 12:15:21.703887 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.703796 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/09b97c75-b838-4128-bb6f-d6a54d1cc11e-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-q8rbm\" (UID: \"09b97c75-b838-4128-bb6f-d6a54d1cc11e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-q8rbm" Apr 20 12:15:21.706142 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.706120 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/09b97c75-b838-4128-bb6f-d6a54d1cc11e-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-q8rbm\" (UID: \"09b97c75-b838-4128-bb6f-d6a54d1cc11e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-q8rbm" Apr 20 12:15:21.849144 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.849112 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-q8rbm" Apr 20 12:15:21.978735 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.978647 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-q8rbm"] Apr 20 12:15:21.982122 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.982077 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 12:15:21.985963 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.985940 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:15:21.988557 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.988383 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 20 12:15:21.988557 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.988427 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 20 12:15:21.988557 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.988412 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 20 12:15:21.988774 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.988762 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 20 12:15:21.988774 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.988763 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 20 12:15:21.988870 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.988848 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 20 12:15:21.988870 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.988768 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 20 12:15:21.989136 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.989109 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 20 12:15:21.989214 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.989138 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-mhsr8\"" Apr 20 12:15:21.989571 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:21.989382 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 20 12:15:22.002184 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.002154 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 12:15:22.061005 ip-10-0-137-91 kubenswrapper[2580]: W0420 
12:15:22.060966 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09b97c75_b838_4128_bb6f_d6a54d1cc11e.slice/crio-457f4059d7bdbda4afd44d3448727d4c0896890212811b22d5d63e934589aca7 WatchSource:0}: Error finding container 457f4059d7bdbda4afd44d3448727d4c0896890212811b22d5d63e934589aca7: Status 404 returned error can't find the container with id 457f4059d7bdbda4afd44d3448727d4c0896890212811b22d5d63e934589aca7 Apr 20 12:15:22.106366 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.106342 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vg68\" (UniqueName: \"kubernetes.io/projected/22d63887-ce5b-4e34-b1a0-9126e4462f30-kube-api-access-5vg68\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:15:22.106468 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.106380 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/22d63887-ce5b-4e34-b1a0-9126e4462f30-config-out\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:15:22.106468 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.106421 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:15:22.106552 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.106492 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:15:22.106552 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.106528 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-web-config\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:15:22.106625 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.106583 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:15:22.106671 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.106622 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/22d63887-ce5b-4e34-b1a0-9126e4462f30-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0" Apr 
20 12:15:22.106671 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.106660 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:15:22.106768 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.106700 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/22d63887-ce5b-4e34-b1a0-9126e4462f30-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:15:22.106853 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.106817 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22d63887-ce5b-4e34-b1a0-9126e4462f30-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:15:22.106905 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.106864 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/22d63887-ce5b-4e34-b1a0-9126e4462f30-tls-assets\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:15:22.106905 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.106893 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:15:22.106995 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.106941 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-config-volume\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:15:22.185094 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.185058 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4rfp8" event={"ID":"9c983cd8-5455-48d0-a6b5-0f8277cb2ca9","Type":"ContainerStarted","Data":"4ff6452fa457d580d1762694966d0dd63adca2906838bd9026a4207fa658b0f3"} Apr 20 12:15:22.186286 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.186249 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-q8rbm" event={"ID":"09b97c75-b838-4128-bb6f-d6a54d1cc11e","Type":"ContainerStarted","Data":"457f4059d7bdbda4afd44d3448727d4c0896890212811b22d5d63e934589aca7"} Apr 20 12:15:22.188029 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.188003 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2p7tn" 
event={"ID":"3f8898a4-1988-4c88-81d7-7b424aaf64e7","Type":"ContainerStarted","Data":"2c7dd707147dadc5be04f431e91655c2f7bba6152b45407f900d87ce61b0a886"} Apr 20 12:15:22.188101 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.188033 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2p7tn" event={"ID":"3f8898a4-1988-4c88-81d7-7b424aaf64e7","Type":"ContainerStarted","Data":"fbd637697d28ee1d06ad9d13a4ecb8f6047b98918a927e58bfe876d33e5791fc"} Apr 20 12:15:22.188101 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.188046 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2p7tn" event={"ID":"3f8898a4-1988-4c88-81d7-7b424aaf64e7","Type":"ContainerStarted","Data":"513a89cf60e513d4d01832f406b2dfc873a999288d1684e611403de9d93c5572"} Apr 20 12:15:22.207744 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.207715 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:15:22.207885 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.207761 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:15:22.207885 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.207787 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-web-config\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:15:22.207885 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.207824 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:15:22.207885 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.207867 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/22d63887-ce5b-4e34-b1a0-9126e4462f30-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:15:22.208096 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:15:22.207888 2580 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 20 12:15:22.208096 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.207900 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " 
Apr 20 12:15:22.208096 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.207924 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/22d63887-ce5b-4e34-b1a0-9126e4462f30-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 12:15:22.208096 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:15:22.207961 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-secret-alertmanager-main-tls podName:22d63887-ce5b-4e34-b1a0-9126e4462f30 nodeName:}" failed. No retries permitted until 2026-04-20 12:15:22.707938679 +0000 UTC m=+56.427952666 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "22d63887-ce5b-4e34-b1a0-9126e4462f30") : secret "alertmanager-main-tls" not found
Apr 20 12:15:22.208096 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.208031 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22d63887-ce5b-4e34-b1a0-9126e4462f30-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 12:15:22.208096 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.208066 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/22d63887-ce5b-4e34-b1a0-9126e4462f30-tls-assets\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 12:15:22.208096 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.208090 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 12:15:22.208456 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.208130 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-config-volume\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 12:15:22.208456 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.208158 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vg68\" (UniqueName: \"kubernetes.io/projected/22d63887-ce5b-4e34-b1a0-9126e4462f30-kube-api-access-5vg68\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 12:15:22.208456 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.208194 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/22d63887-ce5b-4e34-b1a0-9126e4462f30-config-out\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 12:15:22.208456 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.208310 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/22d63887-ce5b-4e34-b1a0-9126e4462f30-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 12:15:22.208777 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.208758 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/22d63887-ce5b-4e34-b1a0-9126e4462f30-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 12:15:22.209766 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.209739 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22d63887-ce5b-4e34-b1a0-9126e4462f30-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 12:15:22.210905 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.210854 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 12:15:22.211198 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.211150 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 12:15:22.211290 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.211198 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-config-volume\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 12:15:22.211290 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.211202 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-web-config\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 12:15:22.211454 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.211437 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/22d63887-ce5b-4e34-b1a0-9126e4462f30-tls-assets\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 12:15:22.211599 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.211578 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/22d63887-ce5b-4e34-b1a0-9126e4462f30-config-out\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 12:15:22.212081 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.212063 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 12:15:22.213035 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.213012 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 12:15:22.220888 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.220864 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vg68\" (UniqueName: \"kubernetes.io/projected/22d63887-ce5b-4e34-b1a0-9126e4462f30-kube-api-access-5vg68\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 12:15:22.417767 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.417738 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-74b68ddfb9-rwkgj"]
Apr 20 12:15:22.420665 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.420648 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74b68ddfb9-rwkgj"
Apr 20 12:15:22.423153 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.423096 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 20 12:15:22.423153 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.423097 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 20 12:15:22.423153 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.423144 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-jfmjp\""
Apr 20 12:15:22.423608 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.423187 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 20 12:15:22.423608 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.423145 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 20 12:15:22.423608 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.423146 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 20 12:15:22.423608 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.423458 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 20 12:15:22.423608 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.423578 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 20 12:15:22.432326 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.432235 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 20 12:15:22.433877 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.433856 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74b68ddfb9-rwkgj"]
Apr 20 12:15:22.613019 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.612926 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/69db919e-8d10-40f9-b4d7-b37130d3ab45-console-serving-cert\") pod \"console-74b68ddfb9-rwkgj\" (UID: \"69db919e-8d10-40f9-b4d7-b37130d3ab45\") " pod="openshift-console/console-74b68ddfb9-rwkgj"
Apr 20 12:15:22.613188 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.613018 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/69db919e-8d10-40f9-b4d7-b37130d3ab45-console-oauth-config\") pod \"console-74b68ddfb9-rwkgj\" (UID: \"69db919e-8d10-40f9-b4d7-b37130d3ab45\") " pod="openshift-console/console-74b68ddfb9-rwkgj"
Apr 20 12:15:22.613188 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.613056 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9mtz\" (UniqueName: \"kubernetes.io/projected/69db919e-8d10-40f9-b4d7-b37130d3ab45-kube-api-access-b9mtz\") pod \"console-74b68ddfb9-rwkgj\" (UID: \"69db919e-8d10-40f9-b4d7-b37130d3ab45\") " pod="openshift-console/console-74b68ddfb9-rwkgj"
Apr 20 12:15:22.613188 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.613157 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/69db919e-8d10-40f9-b4d7-b37130d3ab45-console-config\") pod \"console-74b68ddfb9-rwkgj\" (UID: \"69db919e-8d10-40f9-b4d7-b37130d3ab45\") " pod="openshift-console/console-74b68ddfb9-rwkgj"
Apr 20 12:15:22.613348 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.613191 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69db919e-8d10-40f9-b4d7-b37130d3ab45-trusted-ca-bundle\") pod \"console-74b68ddfb9-rwkgj\" (UID: \"69db919e-8d10-40f9-b4d7-b37130d3ab45\") " pod="openshift-console/console-74b68ddfb9-rwkgj"
Apr 20 12:15:22.613348 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.613210 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/69db919e-8d10-40f9-b4d7-b37130d3ab45-service-ca\") pod \"console-74b68ddfb9-rwkgj\" (UID: \"69db919e-8d10-40f9-b4d7-b37130d3ab45\") " pod="openshift-console/console-74b68ddfb9-rwkgj"
Apr 20 12:15:22.613348 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.613228 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/69db919e-8d10-40f9-b4d7-b37130d3ab45-oauth-serving-cert\") pod \"console-74b68ddfb9-rwkgj\" (UID: \"69db919e-8d10-40f9-b4d7-b37130d3ab45\") " pod="openshift-console/console-74b68ddfb9-rwkgj"
Apr 20 12:15:22.713940 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.713903 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/69db919e-8d10-40f9-b4d7-b37130d3ab45-console-oauth-config\") pod \"console-74b68ddfb9-rwkgj\" (UID: \"69db919e-8d10-40f9-b4d7-b37130d3ab45\") " pod="openshift-console/console-74b68ddfb9-rwkgj"
Apr 20 12:15:22.714126 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.713958 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b9mtz\" (UniqueName: \"kubernetes.io/projected/69db919e-8d10-40f9-b4d7-b37130d3ab45-kube-api-access-b9mtz\") pod \"console-74b68ddfb9-rwkgj\" (UID: \"69db919e-8d10-40f9-b4d7-b37130d3ab45\") " pod="openshift-console/console-74b68ddfb9-rwkgj"
Apr 20 12:15:22.714126 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.714019 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/69db919e-8d10-40f9-b4d7-b37130d3ab45-console-config\") pod \"console-74b68ddfb9-rwkgj\" (UID: \"69db919e-8d10-40f9-b4d7-b37130d3ab45\") " pod="openshift-console/console-74b68ddfb9-rwkgj"
Apr 20 12:15:22.714126 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.714039 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69db919e-8d10-40f9-b4d7-b37130d3ab45-trusted-ca-bundle\") pod \"console-74b68ddfb9-rwkgj\" (UID: \"69db919e-8d10-40f9-b4d7-b37130d3ab45\") " pod="openshift-console/console-74b68ddfb9-rwkgj"
Apr 20 12:15:22.714126 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.714064 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/69db919e-8d10-40f9-b4d7-b37130d3ab45-service-ca\") pod \"console-74b68ddfb9-rwkgj\" (UID: \"69db919e-8d10-40f9-b4d7-b37130d3ab45\") " pod="openshift-console/console-74b68ddfb9-rwkgj"
Apr 20 12:15:22.714126 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.714091 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/69db919e-8d10-40f9-b4d7-b37130d3ab45-oauth-serving-cert\") pod \"console-74b68ddfb9-rwkgj\" (UID: \"69db919e-8d10-40f9-b4d7-b37130d3ab45\") " pod="openshift-console/console-74b68ddfb9-rwkgj"
Apr 20 12:15:22.714830 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.714144 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/69db919e-8d10-40f9-b4d7-b37130d3ab45-console-serving-cert\") pod \"console-74b68ddfb9-rwkgj\" (UID: \"69db919e-8d10-40f9-b4d7-b37130d3ab45\") " pod="openshift-console/console-74b68ddfb9-rwkgj"
Apr 20 12:15:22.714830 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.714190 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 12:15:22.715284 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.715133 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/69db919e-8d10-40f9-b4d7-b37130d3ab45-console-config\") pod \"console-74b68ddfb9-rwkgj\" (UID: \"69db919e-8d10-40f9-b4d7-b37130d3ab45\") " pod="openshift-console/console-74b68ddfb9-rwkgj"
Apr 20 12:15:22.715284 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.715179 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69db919e-8d10-40f9-b4d7-b37130d3ab45-trusted-ca-bundle\") pod \"console-74b68ddfb9-rwkgj\" (UID: \"69db919e-8d10-40f9-b4d7-b37130d3ab45\") " pod="openshift-console/console-74b68ddfb9-rwkgj"
Apr 20 12:15:22.715284 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.715140 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/69db919e-8d10-40f9-b4d7-b37130d3ab45-service-ca\") pod \"console-74b68ddfb9-rwkgj\" (UID: \"69db919e-8d10-40f9-b4d7-b37130d3ab45\") " pod="openshift-console/console-74b68ddfb9-rwkgj"
Apr 20 12:15:22.715509 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.715451 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/69db919e-8d10-40f9-b4d7-b37130d3ab45-oauth-serving-cert\") pod \"console-74b68ddfb9-rwkgj\" (UID: \"69db919e-8d10-40f9-b4d7-b37130d3ab45\") " pod="openshift-console/console-74b68ddfb9-rwkgj"
Apr 20 12:15:22.716806 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.716781 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/69db919e-8d10-40f9-b4d7-b37130d3ab45-console-oauth-config\") pod \"console-74b68ddfb9-rwkgj\" (UID: \"69db919e-8d10-40f9-b4d7-b37130d3ab45\") " pod="openshift-console/console-74b68ddfb9-rwkgj"
Apr 20 12:15:22.717039 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.717009 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 12:15:22.717163 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.717136 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/69db919e-8d10-40f9-b4d7-b37130d3ab45-console-serving-cert\") pod \"console-74b68ddfb9-rwkgj\" (UID: \"69db919e-8d10-40f9-b4d7-b37130d3ab45\") " pod="openshift-console/console-74b68ddfb9-rwkgj"
Apr 20 12:15:22.721777 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.721752 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9mtz\" (UniqueName: \"kubernetes.io/projected/69db919e-8d10-40f9-b4d7-b37130d3ab45-kube-api-access-b9mtz\") pod \"console-74b68ddfb9-rwkgj\" (UID: \"69db919e-8d10-40f9-b4d7-b37130d3ab45\") " pod="openshift-console/console-74b68ddfb9-rwkgj"
Apr 20 12:15:22.730563 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.730531 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74b68ddfb9-rwkgj"
Apr 20 12:15:22.869165 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.869088 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74b68ddfb9-rwkgj"]
Apr 20 12:15:22.872613 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:15:22.872584 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69db919e_8d10_40f9_b4d7_b37130d3ab45.slice/crio-0b49a58a2a85c0166f33f7b4a9bcbec43efb1fb4f9f8472040c357eb2d428835 WatchSource:0}: Error finding container 0b49a58a2a85c0166f33f7b4a9bcbec43efb1fb4f9f8472040c357eb2d428835: Status 404 returned error can't find the container with id 0b49a58a2a85c0166f33f7b4a9bcbec43efb1fb4f9f8472040c357eb2d428835
Apr 20 12:15:22.898174 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.898142 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 12:15:22.977499 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.977454 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-579895dbf5-cfrwz"]
Apr 20 12:15:22.981413 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.981370 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-579895dbf5-cfrwz"
Apr 20 12:15:22.984412 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.984342 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 20 12:15:22.984412 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.984392 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 20 12:15:22.984626 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.984392 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 20 12:15:22.984767 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.984744 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 20 12:15:22.984892 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.984769 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-fnj2p\""
Apr 20 12:15:22.984892 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.984788 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-6lubh6t2orfcb\""
Apr 20 12:15:22.985070 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.985040 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 20 12:15:22.991749 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:22.991728 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-579895dbf5-cfrwz"]
Apr 20 12:15:23.118052 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:23.118012 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/9748a2da-a0df-48e0-8d2e-bc1ef97f4fed-secret-thanos-querier-tls\") pod \"thanos-querier-579895dbf5-cfrwz\" (UID: \"9748a2da-a0df-48e0-8d2e-bc1ef97f4fed\") " pod="openshift-monitoring/thanos-querier-579895dbf5-cfrwz"
Apr 20 12:15:23.118232 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:23.118064 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9748a2da-a0df-48e0-8d2e-bc1ef97f4fed-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-579895dbf5-cfrwz\" (UID: \"9748a2da-a0df-48e0-8d2e-bc1ef97f4fed\") " pod="openshift-monitoring/thanos-querier-579895dbf5-cfrwz"
Apr 20 12:15:23.118232 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:23.118100 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/9748a2da-a0df-48e0-8d2e-bc1ef97f4fed-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-579895dbf5-cfrwz\" (UID: \"9748a2da-a0df-48e0-8d2e-bc1ef97f4fed\") " pod="openshift-monitoring/thanos-querier-579895dbf5-cfrwz"
Apr 20 12:15:23.118232 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:23.118128 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9748a2da-a0df-48e0-8d2e-bc1ef97f4fed-metrics-client-ca\") pod \"thanos-querier-579895dbf5-cfrwz\" (UID: \"9748a2da-a0df-48e0-8d2e-bc1ef97f4fed\") " pod="openshift-monitoring/thanos-querier-579895dbf5-cfrwz"
Apr 20 12:15:23.118232 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:23.118157 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/9748a2da-a0df-48e0-8d2e-bc1ef97f4fed-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-579895dbf5-cfrwz\" (UID: \"9748a2da-a0df-48e0-8d2e-bc1ef97f4fed\") " pod="openshift-monitoring/thanos-querier-579895dbf5-cfrwz"
Apr 20 12:15:23.118232 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:23.118191 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9748a2da-a0df-48e0-8d2e-bc1ef97f4fed-secret-grpc-tls\") pod \"thanos-querier-579895dbf5-cfrwz\" (UID: \"9748a2da-a0df-48e0-8d2e-bc1ef97f4fed\") " pod="openshift-monitoring/thanos-querier-579895dbf5-cfrwz"
Apr 20 12:15:23.118501 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:23.118293 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t2kw\" (UniqueName: \"kubernetes.io/projected/9748a2da-a0df-48e0-8d2e-bc1ef97f4fed-kube-api-access-6t2kw\") pod \"thanos-querier-579895dbf5-cfrwz\" (UID: \"9748a2da-a0df-48e0-8d2e-bc1ef97f4fed\") " pod="openshift-monitoring/thanos-querier-579895dbf5-cfrwz"
Apr 20 12:15:23.118501 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:23.118388 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9748a2da-a0df-48e0-8d2e-bc1ef97f4fed-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-579895dbf5-cfrwz\" (UID: \"9748a2da-a0df-48e0-8d2e-bc1ef97f4fed\") " pod="openshift-monitoring/thanos-querier-579895dbf5-cfrwz"
Apr 20 12:15:23.193846 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:23.193802 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2p7tn" event={"ID":"3f8898a4-1988-4c88-81d7-7b424aaf64e7","Type":"ContainerStarted","Data":"68a249765186fb2dc95f0b714d62cef32693175af82c582fab10052e2c712430"}
Apr 20 12:15:23.195077 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:23.195044 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74b68ddfb9-rwkgj" event={"ID":"69db919e-8d10-40f9-b4d7-b37130d3ab45","Type":"ContainerStarted","Data":"0b49a58a2a85c0166f33f7b4a9bcbec43efb1fb4f9f8472040c357eb2d428835"}
Apr 20 12:15:23.196494 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:23.196470 2580 generic.go:358] "Generic (PLEG): container finished" podID="9c983cd8-5455-48d0-a6b5-0f8277cb2ca9" containerID="e774304c3ce9c689f4f029a1d87bea3c7bc7ab05749861c032ab6d6a90816ded" exitCode=0
Apr 20 12:15:23.196617 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:23.196517 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4rfp8" event={"ID":"9c983cd8-5455-48d0-a6b5-0f8277cb2ca9","Type":"ContainerDied","Data":"e774304c3ce9c689f4f029a1d87bea3c7bc7ab05749861c032ab6d6a90816ded"}
Apr 20 12:15:23.211522 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:23.211477 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2p7tn" podStartSLOduration=2.226065659 podStartE2EDuration="3.211462858s" podCreationTimestamp="2026-04-20 12:15:20 +0000 UTC" firstStartedPulling="2026-04-20 12:15:21.452987733 +0000 UTC m=+55.173001702" lastFinishedPulling="2026-04-20 12:15:22.438384929 +0000 UTC m=+56.158398901" observedRunningTime="2026-04-20 12:15:23.210266225 +0000 UTC m=+56.930280216" watchObservedRunningTime="2026-04-20 12:15:23.211462858 +0000 UTC m=+56.931476849"
Apr 20 12:15:23.219716 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:23.219687 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/9748a2da-a0df-48e0-8d2e-bc1ef97f4fed-secret-thanos-querier-tls\") pod \"thanos-querier-579895dbf5-cfrwz\" (UID: \"9748a2da-a0df-48e0-8d2e-bc1ef97f4fed\") " pod="openshift-monitoring/thanos-querier-579895dbf5-cfrwz"
Apr 20 12:15:23.219835 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:23.219751 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9748a2da-a0df-48e0-8d2e-bc1ef97f4fed-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-579895dbf5-cfrwz\" (UID: \"9748a2da-a0df-48e0-8d2e-bc1ef97f4fed\") " pod="openshift-monitoring/thanos-querier-579895dbf5-cfrwz"
Apr 20 12:15:23.219835 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:23.219783 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/9748a2da-a0df-48e0-8d2e-bc1ef97f4fed-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-579895dbf5-cfrwz\" (UID: \"9748a2da-a0df-48e0-8d2e-bc1ef97f4fed\") " pod="openshift-monitoring/thanos-querier-579895dbf5-cfrwz"
Apr 20 12:15:23.219835 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:23.219830 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9748a2da-a0df-48e0-8d2e-bc1ef97f4fed-metrics-client-ca\") pod \"thanos-querier-579895dbf5-cfrwz\" (UID: \"9748a2da-a0df-48e0-8d2e-bc1ef97f4fed\") " pod="openshift-monitoring/thanos-querier-579895dbf5-cfrwz"
Apr 20 12:15:23.219980 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:23.219859 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/9748a2da-a0df-48e0-8d2e-bc1ef97f4fed-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-579895dbf5-cfrwz\" (UID: \"9748a2da-a0df-48e0-8d2e-bc1ef97f4fed\") " pod="openshift-monitoring/thanos-querier-579895dbf5-cfrwz"
Apr 20 12:15:23.219980 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:23.219913 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9748a2da-a0df-48e0-8d2e-bc1ef97f4fed-secret-grpc-tls\") pod \"thanos-querier-579895dbf5-cfrwz\" (UID: \"9748a2da-a0df-48e0-8d2e-bc1ef97f4fed\") " pod="openshift-monitoring/thanos-querier-579895dbf5-cfrwz"
Apr 20 12:15:23.220288 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:23.219986 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6t2kw\" (UniqueName: \"kubernetes.io/projected/9748a2da-a0df-48e0-8d2e-bc1ef97f4fed-kube-api-access-6t2kw\") pod \"thanos-querier-579895dbf5-cfrwz\" (UID: \"9748a2da-a0df-48e0-8d2e-bc1ef97f4fed\") " pod="openshift-monitoring/thanos-querier-579895dbf5-cfrwz"
Apr 20 12:15:23.220288 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:23.220066 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9748a2da-a0df-48e0-8d2e-bc1ef97f4fed-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-579895dbf5-cfrwz\" (UID: \"9748a2da-a0df-48e0-8d2e-bc1ef97f4fed\") " pod="openshift-monitoring/thanos-querier-579895dbf5-cfrwz"
Apr 20 12:15:23.222193 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:23.222145 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9748a2da-a0df-48e0-8d2e-bc1ef97f4fed-metrics-client-ca\") pod \"thanos-querier-579895dbf5-cfrwz\" (UID: \"9748a2da-a0df-48e0-8d2e-bc1ef97f4fed\") " pod="openshift-monitoring/thanos-querier-579895dbf5-cfrwz"
Apr 20 12:15:23.225780 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:23.225719 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9748a2da-a0df-48e0-8d2e-bc1ef97f4fed-secret-grpc-tls\") pod \"thanos-querier-579895dbf5-cfrwz\" (UID: \"9748a2da-a0df-48e0-8d2e-bc1ef97f4fed\") " pod="openshift-monitoring/thanos-querier-579895dbf5-cfrwz"
Apr 20 12:15:23.226242 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:23.225868 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/9748a2da-a0df-48e0-8d2e-bc1ef97f4fed-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-579895dbf5-cfrwz\" (UID: \"9748a2da-a0df-48e0-8d2e-bc1ef97f4fed\") " pod="openshift-monitoring/thanos-querier-579895dbf5-cfrwz"
Apr 20 12:15:23.227214 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:23.227192 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-579895dbf5-cfrwz\" (UID: \"9748a2da-a0df-48e0-8d2e-bc1ef97f4fed\") " pod="openshift-monitoring/thanos-querier-579895dbf5-cfrwz"
Apr 20 12:15:23.227612 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:23.227592 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9748a2da-a0df-48e0-8d2e-bc1ef97f4fed-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-579895dbf5-cfrwz\" (UID: \"9748a2da-a0df-48e0-8d2e-bc1ef97f4fed\") " pod="openshift-monitoring/thanos-querier-579895dbf5-cfrwz"
Apr 20 12:15:23.227817 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:23.227798 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/9748a2da-a0df-48e0-8d2e-bc1ef97f4fed-secret-thanos-querier-tls\") pod \"thanos-querier-579895dbf5-cfrwz\" (UID: \"9748a2da-a0df-48e0-8d2e-bc1ef97f4fed\") " pod="openshift-monitoring/thanos-querier-579895dbf5-cfrwz"
Apr 20 12:15:23.228899 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:23.228877 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9748a2da-a0df-48e0-8d2e-bc1ef97f4fed-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-579895dbf5-cfrwz\" (UID: \"9748a2da-a0df-48e0-8d2e-bc1ef97f4fed\") " pod="openshift-monitoring/thanos-querier-579895dbf5-cfrwz"
Apr 20 12:15:23.234505 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:23.234465 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t2kw\" (UniqueName: \"kubernetes.io/projected/9748a2da-a0df-48e0-8d2e-bc1ef97f4fed-kube-api-access-6t2kw\") pod \"thanos-querier-579895dbf5-cfrwz\" (UID: \"9748a2da-a0df-48e0-8d2e-bc1ef97f4fed\") " pod="openshift-monitoring/thanos-querier-579895dbf5-cfrwz"
Apr 20 12:15:23.292777 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:23.292428 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-579895dbf5-cfrwz"
Apr 20 12:15:23.316873 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:23.316474 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 12:15:23.469485 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:23.469439 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-579895dbf5-cfrwz"]
Apr 20 12:15:23.473672 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:15:23.473622 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9748a2da_a0df_48e0_8d2e_bc1ef97f4fed.slice/crio-11ffd42a8feb6c438d2c65ce8a4c561c04bf5d53e36fa033d9320f8b4dd87180 WatchSource:0}: Error finding container 11ffd42a8feb6c438d2c65ce8a4c561c04bf5d53e36fa033d9320f8b4dd87180: Status 404 returned error can't find the container with id 11ffd42a8feb6c438d2c65ce8a4c561c04bf5d53e36fa033d9320f8b4dd87180
Apr 20 12:15:24.203173 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:24.203080 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-q8rbm" event={"ID":"09b97c75-b838-4128-bb6f-d6a54d1cc11e","Type":"ContainerStarted","Data":"628871e88d82ce485a599b6c446be9fe2c7ca9771d4d9f054fd83f8169a38278"}
Apr 20 12:15:24.203173 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:24.203125 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-q8rbm" event={"ID":"09b97c75-b838-4128-bb6f-d6a54d1cc11e","Type":"ContainerStarted","Data":"23971bb4a03a2fcffd27855986d4561c6a3e5211390ad9c62a23c2040d63c334"}
Apr 20 12:15:24.203173 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:24.203143 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-q8rbm" event={"ID":"09b97c75-b838-4128-bb6f-d6a54d1cc11e","Type":"ContainerStarted","Data":"07ad79f572955ba2ec852a66cf351982eaf2f703fe109e986e0b659ecf94a875"}
Apr 20 12:15:24.204921 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:24.204870 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-579895dbf5-cfrwz" event={"ID":"9748a2da-a0df-48e0-8d2e-bc1ef97f4fed","Type":"ContainerStarted","Data":"11ffd42a8feb6c438d2c65ce8a4c561c04bf5d53e36fa033d9320f8b4dd87180"}
Apr 20 12:15:24.206391 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:24.206348 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"22d63887-ce5b-4e34-b1a0-9126e4462f30","Type":"ContainerStarted","Data":"d6d65d1aef5bf8278d17d3ad01cc858b5d09b4df16c23d3ccccaf314f66d325a"}
Apr 20 12:15:24.209893 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:24.209857 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4rfp8" event={"ID":"9c983cd8-5455-48d0-a6b5-0f8277cb2ca9","Type":"ContainerStarted","Data":"6b4d6280e5715e7bbc1d3bdd0d5b14768a2feafe4d97f6b02818246b22048755"}
Apr 20 12:15:24.209893 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:24.209894 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4rfp8" event={"ID":"9c983cd8-5455-48d0-a6b5-0f8277cb2ca9","Type":"ContainerStarted","Data":"268fd380d5d2dc44951d512dff5c63d2dd6d1e7aaae7bcd35fd17a4f55b722cb"}
Apr 20 12:15:24.225638 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:24.225584 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-q8rbm" podStartSLOduration=3.066301118 podStartE2EDuration="4.22556854s" podCreationTimestamp="2026-04-20 12:15:20 +0000 UTC" firstStartedPulling="2026-04-20 12:15:22.063492442 +0000 UTC m=+55.783506410" lastFinishedPulling="2026-04-20 12:15:23.222759859 +0000 UTC m=+56.942773832" observedRunningTime="2026-04-20 12:15:24.225507979 +0000 UTC m=+57.945521970" watchObservedRunningTime="2026-04-20 12:15:24.22556854 +0000 UTC m=+57.945582529"
Apr 20 12:15:24.252950 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:24.252488 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-4rfp8" podStartSLOduration=3.399581123 podStartE2EDuration="4.252465039s" podCreationTimestamp="2026-04-20 12:15:20 +0000 UTC" firstStartedPulling="2026-04-20 12:15:21.252584683 +0000 UTC m=+54.972598652" lastFinishedPulling="2026-04-20 12:15:22.105468584 +0000 UTC m=+55.825482568" observedRunningTime="2026-04-20 12:15:24.251955005 +0000 UTC m=+57.971969000" watchObservedRunningTime="2026-04-20 12:15:24.252465039 +0000 UTC m=+57.972479030"
Apr 20 12:15:25.126144 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.126106 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dwphh"
Apr 20 12:15:25.324249 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.324209 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7b75467d8b-xqm69"]
Apr 20 12:15:25.349687 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.349649 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7b75467d8b-xqm69"]
Apr 20 12:15:25.349854 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.349781 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7b75467d8b-xqm69"
Apr 20 12:15:25.352367 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.352332 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 20 12:15:25.352367 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.352363 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 20 12:15:25.352577 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.352436 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-85t79\""
Apr 20 12:15:25.352577 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.352469 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 20 12:15:25.352868 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.352804 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-48cain8veh00r\""
Apr 20 12:15:25.352967 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.352920 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 20 12:15:25.441080 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.441004 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/6aeb557f-5fc7-4469-90cb-cfdbee9a0458-secret-metrics-server-client-certs\") pod \"metrics-server-7b75467d8b-xqm69\" (UID: \"6aeb557f-5fc7-4469-90cb-cfdbee9a0458\") " pod="openshift-monitoring/metrics-server-7b75467d8b-xqm69"
Apr 20 12:15:25.441080 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.441039 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b78k9\" (UniqueName: \"kubernetes.io/projected/6aeb557f-5fc7-4469-90cb-cfdbee9a0458-kube-api-access-b78k9\") pod \"metrics-server-7b75467d8b-xqm69\" (UID: \"6aeb557f-5fc7-4469-90cb-cfdbee9a0458\") " pod="openshift-monitoring/metrics-server-7b75467d8b-xqm69"
Apr 20 12:15:25.441267 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.441116 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aeb557f-5fc7-4469-90cb-cfdbee9a0458-client-ca-bundle\") pod \"metrics-server-7b75467d8b-xqm69\" (UID: \"6aeb557f-5fc7-4469-90cb-cfdbee9a0458\") " pod="openshift-monitoring/metrics-server-7b75467d8b-xqm69"
Apr 20 12:15:25.441267 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.441150 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/6aeb557f-5fc7-4469-90cb-cfdbee9a0458-audit-log\") pod \"metrics-server-7b75467d8b-xqm69\" (UID: \"6aeb557f-5fc7-4469-90cb-cfdbee9a0458\") " pod="openshift-monitoring/metrics-server-7b75467d8b-xqm69"
Apr 20 12:15:25.441267 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.441175 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6aeb557f-5fc7-4469-90cb-cfdbee9a0458-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7b75467d8b-xqm69\" (UID: \"6aeb557f-5fc7-4469-90cb-cfdbee9a0458\") " pod="openshift-monitoring/metrics-server-7b75467d8b-xqm69"
Apr 20 12:15:25.441267 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.441210 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/6aeb557f-5fc7-4469-90cb-cfdbee9a0458-secret-metrics-server-tls\") pod \"metrics-server-7b75467d8b-xqm69\" (UID: \"6aeb557f-5fc7-4469-90cb-cfdbee9a0458\") " pod="openshift-monitoring/metrics-server-7b75467d8b-xqm69"
Apr 20 12:15:25.441410 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.441268 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/6aeb557f-5fc7-4469-90cb-cfdbee9a0458-metrics-server-audit-profiles\") pod \"metrics-server-7b75467d8b-xqm69\" (UID: \"6aeb557f-5fc7-4469-90cb-cfdbee9a0458\") " pod="openshift-monitoring/metrics-server-7b75467d8b-xqm69"
Apr 20 12:15:25.542205 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.542169 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b78k9\" (UniqueName: \"kubernetes.io/projected/6aeb557f-5fc7-4469-90cb-cfdbee9a0458-kube-api-access-b78k9\") pod \"metrics-server-7b75467d8b-xqm69\" (UID: \"6aeb557f-5fc7-4469-90cb-cfdbee9a0458\") " pod="openshift-monitoring/metrics-server-7b75467d8b-xqm69"
Apr 20 12:15:25.542416 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.542252 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aeb557f-5fc7-4469-90cb-cfdbee9a0458-client-ca-bundle\") pod \"metrics-server-7b75467d8b-xqm69\" (UID: \"6aeb557f-5fc7-4469-90cb-cfdbee9a0458\") " pod="openshift-monitoring/metrics-server-7b75467d8b-xqm69"
Apr 20 12:15:25.542416 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.542291 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/6aeb557f-5fc7-4469-90cb-cfdbee9a0458-audit-log\") pod \"metrics-server-7b75467d8b-xqm69\" (UID: \"6aeb557f-5fc7-4469-90cb-cfdbee9a0458\") " pod="openshift-monitoring/metrics-server-7b75467d8b-xqm69"
Apr 20 12:15:25.542545 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.542455 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6aeb557f-5fc7-4469-90cb-cfdbee9a0458-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7b75467d8b-xqm69\" (UID: \"6aeb557f-5fc7-4469-90cb-cfdbee9a0458\") " pod="openshift-monitoring/metrics-server-7b75467d8b-xqm69"
Apr 20 12:15:25.542599 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.542536 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/6aeb557f-5fc7-4469-90cb-cfdbee9a0458-secret-metrics-server-tls\") pod \"metrics-server-7b75467d8b-xqm69\" (UID: \"6aeb557f-5fc7-4469-90cb-cfdbee9a0458\") " pod="openshift-monitoring/metrics-server-7b75467d8b-xqm69"
Apr 20 12:15:25.542649 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.542602 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/6aeb557f-5fc7-4469-90cb-cfdbee9a0458-metrics-server-audit-profiles\") pod \"metrics-server-7b75467d8b-xqm69\" (UID: \"6aeb557f-5fc7-4469-90cb-cfdbee9a0458\") " pod="openshift-monitoring/metrics-server-7b75467d8b-xqm69"
Apr 20 12:15:25.542701 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.542680 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/6aeb557f-5fc7-4469-90cb-cfdbee9a0458-audit-log\") pod \"metrics-server-7b75467d8b-xqm69\" (UID: \"6aeb557f-5fc7-4469-90cb-cfdbee9a0458\") " pod="openshift-monitoring/metrics-server-7b75467d8b-xqm69"
Apr 20 12:15:25.542829 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.542734 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/6aeb557f-5fc7-4469-90cb-cfdbee9a0458-secret-metrics-server-client-certs\") pod \"metrics-server-7b75467d8b-xqm69\" (UID: \"6aeb557f-5fc7-4469-90cb-cfdbee9a0458\") " pod="openshift-monitoring/metrics-server-7b75467d8b-xqm69"
Apr 20 12:15:25.543237 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.543196 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6aeb557f-5fc7-4469-90cb-cfdbee9a0458-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7b75467d8b-xqm69\" (UID: \"6aeb557f-5fc7-4469-90cb-cfdbee9a0458\") " pod="openshift-monitoring/metrics-server-7b75467d8b-xqm69"
Apr 20 12:15:25.543705 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.543678 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/6aeb557f-5fc7-4469-90cb-cfdbee9a0458-metrics-server-audit-profiles\") pod \"metrics-server-7b75467d8b-xqm69\" (UID: \"6aeb557f-5fc7-4469-90cb-cfdbee9a0458\") " pod="openshift-monitoring/metrics-server-7b75467d8b-xqm69"
Apr 20 12:15:25.545300 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.545278 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aeb557f-5fc7-4469-90cb-cfdbee9a0458-client-ca-bundle\") pod \"metrics-server-7b75467d8b-xqm69\" (UID: \"6aeb557f-5fc7-4469-90cb-cfdbee9a0458\") " pod="openshift-monitoring/metrics-server-7b75467d8b-xqm69"
Apr 20 12:15:25.545416 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.545284 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/6aeb557f-5fc7-4469-90cb-cfdbee9a0458-secret-metrics-server-tls\") pod \"metrics-server-7b75467d8b-xqm69\" (UID: \"6aeb557f-5fc7-4469-90cb-cfdbee9a0458\") " pod="openshift-monitoring/metrics-server-7b75467d8b-xqm69"
Apr 20 12:15:25.545527 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.545503 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/6aeb557f-5fc7-4469-90cb-cfdbee9a0458-secret-metrics-server-client-certs\") pod \"metrics-server-7b75467d8b-xqm69\" (UID: \"6aeb557f-5fc7-4469-90cb-cfdbee9a0458\") " pod="openshift-monitoring/metrics-server-7b75467d8b-xqm69"
Apr 20 12:15:25.549607 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.549589 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b78k9\" (UniqueName: \"kubernetes.io/projected/6aeb557f-5fc7-4469-90cb-cfdbee9a0458-kube-api-access-b78k9\") pod \"metrics-server-7b75467d8b-xqm69\" (UID: \"6aeb557f-5fc7-4469-90cb-cfdbee9a0458\") " pod="openshift-monitoring/metrics-server-7b75467d8b-xqm69"
Apr 20 12:15:25.660375 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.660336 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7b75467d8b-xqm69"
Apr 20 12:15:25.664164 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.664135 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-6vjwl"]
Apr 20 12:15:25.673221 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.673202 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6vjwl"
Apr 20 12:15:25.675600 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.675576 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 20 12:15:25.676032 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.676000 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-6vjwl"]
Apr 20 12:15:25.676148 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.676126 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-q2sph\""
Apr 20 12:15:25.743950 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.743862 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6d32df24-8beb-401c-9a8a-8999fce03374-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-6vjwl\" (UID: \"6d32df24-8beb-401c-9a8a-8999fce03374\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6vjwl"
Apr 20 12:15:25.844308 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:25.844271 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6d32df24-8beb-401c-9a8a-8999fce03374-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-6vjwl\" (UID: \"6d32df24-8beb-401c-9a8a-8999fce03374\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6vjwl"
Apr 20 12:15:25.844507 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:15:25.844430 2580 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found
Apr 20 12:15:25.844572 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:15:25.844508 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d32df24-8beb-401c-9a8a-8999fce03374-monitoring-plugin-cert podName:6d32df24-8beb-401c-9a8a-8999fce03374 nodeName:}" failed. No retries permitted until 2026-04-20 12:15:26.344491596 +0000 UTC m=+60.064505581 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/6d32df24-8beb-401c-9a8a-8999fce03374-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-6vjwl" (UID: "6d32df24-8beb-401c-9a8a-8999fce03374") : secret "monitoring-plugin-cert" not found
Apr 20 12:15:26.348354 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:26.348318 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6d32df24-8beb-401c-9a8a-8999fce03374-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-6vjwl\" (UID: \"6d32df24-8beb-401c-9a8a-8999fce03374\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6vjwl"
Apr 20 12:15:26.351202 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:26.351176 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6d32df24-8beb-401c-9a8a-8999fce03374-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-6vjwl\" (UID: \"6d32df24-8beb-401c-9a8a-8999fce03374\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6vjwl"
Apr 20 12:15:26.585284 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:26.585201 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6vjwl"
Apr 20 12:15:26.636463 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:26.634913 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7b75467d8b-xqm69"]
Apr 20 12:15:26.658567 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:15:26.658533 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6aeb557f_5fc7_4469_90cb_cfdbee9a0458.slice/crio-7ac5dfaabb9be5e78390ffa97fda11d93f889053264175a19dac421014160b7c WatchSource:0}: Error finding container 7ac5dfaabb9be5e78390ffa97fda11d93f889053264175a19dac421014160b7c: Status 404 returned error can't find the container with id 7ac5dfaabb9be5e78390ffa97fda11d93f889053264175a19dac421014160b7c
Apr 20 12:15:26.726894 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:26.726871 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-6vjwl"]
Apr 20 12:15:27.104749 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.104716 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 12:15:27.108502 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.108483 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.112311 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.112287 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 20 12:15:27.112417 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.112331 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-1e2ag3hh82dov\"" Apr 20 12:15:27.112417 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.112333 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 20 12:15:27.112847 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.112833 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 20 12:15:27.113157 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.113141 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 20 12:15:27.113237 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.113154 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 20 12:15:27.113296 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.113264 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 20 12:15:27.113624 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.113602 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 20 12:15:27.113734 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.113671 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-wgjj5\"" Apr 20 12:15:27.113824 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.113797 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 20 12:15:27.114003 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.113988 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 20 12:15:27.114135 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.114114 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 20 12:15:27.117064 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.117046 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 20 12:15:27.121318 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.121300 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 20 12:15:27.128628 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.128605 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 12:15:27.155214 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.155186 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/4250d722-a976-44b5-8936-846f53dc7cca-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.155306 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.155219 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cl5x\" (UniqueName: \"kubernetes.io/projected/4250d722-a976-44b5-8936-846f53dc7cca-kube-api-access-5cl5x\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.155306 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.155265 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.155306 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.155295 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-web-config\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.155440 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.155332 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.155440 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.155357 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.155440 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.155386 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/4250d722-a976-44b5-8936-846f53dc7cca-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.155533 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.155467 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-config\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.155533 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.155492 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/4250d722-a976-44b5-8936-846f53dc7cca-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.155533 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.155511 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.155616 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.155539 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4250d722-a976-44b5-8936-846f53dc7cca-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.155616 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.155562 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4250d722-a976-44b5-8936-846f53dc7cca-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.155616 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.155598 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.155700 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.155644 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.155700 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.155677 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4250d722-a976-44b5-8936-846f53dc7cca-config-out\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.155759 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.155699 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.155759 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.155716 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4250d722-a976-44b5-8936-846f53dc7cca-tls-assets\") pod \"prometheus-k8s-0\" (UID: 
\"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.155759 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.155732 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4250d722-a976-44b5-8936-846f53dc7cca-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.220763 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.220729 2580 generic.go:358] "Generic (PLEG): container finished" podID="22d63887-ce5b-4e34-b1a0-9126e4462f30" containerID="df0b8e8b535eae8194ac8abc77bec7b24dd90efe1bc7b828ce8559be09561e9d" exitCode=0 Apr 20 12:15:27.220892 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.220799 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"22d63887-ce5b-4e34-b1a0-9126e4462f30","Type":"ContainerDied","Data":"df0b8e8b535eae8194ac8abc77bec7b24dd90efe1bc7b828ce8559be09561e9d"} Apr 20 12:15:27.221982 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.221957 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7b75467d8b-xqm69" event={"ID":"6aeb557f-5fc7-4469-90cb-cfdbee9a0458","Type":"ContainerStarted","Data":"7ac5dfaabb9be5e78390ffa97fda11d93f889053264175a19dac421014160b7c"} Apr 20 12:15:27.222940 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.222919 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6vjwl" event={"ID":"6d32df24-8beb-401c-9a8a-8999fce03374","Type":"ContainerStarted","Data":"9083b92199d23e552e98bffc49bef9ee1cff9b9bb1f85e176a3eaf9445f726a0"} Apr 20 12:15:27.224209 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.224189 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74b68ddfb9-rwkgj" event={"ID":"69db919e-8d10-40f9-b4d7-b37130d3ab45","Type":"ContainerStarted","Data":"519d3fc90dd053bf2937ab28bdb79dae21b18cc5184fbff0442ebfe5c6f3d75f"} Apr 20 12:15:27.226258 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.226234 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-579895dbf5-cfrwz" event={"ID":"9748a2da-a0df-48e0-8d2e-bc1ef97f4fed","Type":"ContainerStarted","Data":"5b5ae793b9a50814696bda644c402541ed69e71c7d2d497e3d0e3be521aa2647"} Apr 20 12:15:27.226333 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.226262 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-579895dbf5-cfrwz" event={"ID":"9748a2da-a0df-48e0-8d2e-bc1ef97f4fed","Type":"ContainerStarted","Data":"ce4fd2284d6a56b8587fdb3ec9771fb00319880145dca7e0fdf489c6182f7401"} Apr 20 12:15:27.226333 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.226275 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-579895dbf5-cfrwz" event={"ID":"9748a2da-a0df-48e0-8d2e-bc1ef97f4fed","Type":"ContainerStarted","Data":"d747f4306a2dfbda7e137a6f43b4f9ec5b0282ee11ebee664785eb82f2f72391"} Apr 20 12:15:27.256301 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.256269 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/4250d722-a976-44b5-8936-846f53dc7cca-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: 
\"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.256483 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.256348 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-config\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.256483 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.256373 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4250d722-a976-44b5-8936-846f53dc7cca-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.256483 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.256417 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.256483 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.256466 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4250d722-a976-44b5-8936-846f53dc7cca-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.256690 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.256520 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4250d722-a976-44b5-8936-846f53dc7cca-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.256690 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.256564 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.256690 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.256674 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.257064 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.257037 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/4250d722-a976-44b5-8936-846f53dc7cca-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.258731 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.257442 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/4250d722-a976-44b5-8936-846f53dc7cca-config-out\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.258731 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.257599 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.258731 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.257639 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4250d722-a976-44b5-8936-846f53dc7cca-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.258731 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.257676 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4250d722-a976-44b5-8936-846f53dc7cca-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.258731 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.257683 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4250d722-a976-44b5-8936-846f53dc7cca-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.258731 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.257744 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4250d722-a976-44b5-8936-846f53dc7cca-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.258731 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.258366 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4250d722-a976-44b5-8936-846f53dc7cca-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.259137 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.258892 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5cl5x\" (UniqueName: \"kubernetes.io/projected/4250d722-a976-44b5-8936-846f53dc7cca-kube-api-access-5cl5x\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.259137 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.258942 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.259137 ip-10-0-137-91 
kubenswrapper[2580]: I0420 12:15:27.258983 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-web-config\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.259137 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.259009 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.259137 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.259045 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.259137 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.259106 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4250d722-a976-44b5-8936-846f53dc7cca-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.259792 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.259758 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.259985 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.259961 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-config\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.260270 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.260245 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4250d722-a976-44b5-8936-846f53dc7cca-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.260659 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.260639 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.260789 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.260719 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4250d722-a976-44b5-8936-846f53dc7cca-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" 
(UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.262413 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.262373 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4250d722-a976-44b5-8936-846f53dc7cca-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.262968 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.262928 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.263048 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.262974 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.263048 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.263031 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-web-config\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.263156 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.263059 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.263156 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.263118 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4250d722-a976-44b5-8936-846f53dc7cca-config-out\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.263703 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.263680 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.264544 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.264525 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.267811 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.267770 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-74b68ddfb9-rwkgj" 
podStartSLOduration=1.643041413 podStartE2EDuration="5.267758814s" podCreationTimestamp="2026-04-20 12:15:22 +0000 UTC" firstStartedPulling="2026-04-20 12:15:22.874207914 +0000 UTC m=+56.594221884" lastFinishedPulling="2026-04-20 12:15:26.498925313 +0000 UTC m=+60.218939285" observedRunningTime="2026-04-20 12:15:27.265883058 +0000 UTC m=+60.985897051" watchObservedRunningTime="2026-04-20 12:15:27.267758814 +0000 UTC m=+60.987772805" Apr 20 12:15:27.272022 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.271998 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cl5x\" (UniqueName: \"kubernetes.io/projected/4250d722-a976-44b5-8936-846f53dc7cca-kube-api-access-5cl5x\") pod \"prometheus-k8s-0\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.418843 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.418314 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:27.586128 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:27.586077 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 12:15:27.587874 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:15:27.587846 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4250d722_a976_44b5_8936_846f53dc7cca.slice/crio-8a198cf82de6a53971cc5faefc30e93b395a05cf24ded81aac0d0de55b10c35f WatchSource:0}: Error finding container 8a198cf82de6a53971cc5faefc30e93b395a05cf24ded81aac0d0de55b10c35f: Status 404 returned error can't find the container with id 8a198cf82de6a53971cc5faefc30e93b395a05cf24ded81aac0d0de55b10c35f Apr 20 12:15:28.230892 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:28.230847 2580 generic.go:358] "Generic (PLEG): container finished" podID="4250d722-a976-44b5-8936-846f53dc7cca" containerID="7d5bca424f828bdff54ef72bc300c8fc1ac58a7b0c04326f5bff978afb96c258" exitCode=0 Apr 20 12:15:28.231055 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:28.230989 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4250d722-a976-44b5-8936-846f53dc7cca","Type":"ContainerDied","Data":"7d5bca424f828bdff54ef72bc300c8fc1ac58a7b0c04326f5bff978afb96c258"} Apr 20 12:15:28.231055 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:28.231024 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4250d722-a976-44b5-8936-846f53dc7cca","Type":"ContainerStarted","Data":"8a198cf82de6a53971cc5faefc30e93b395a05cf24ded81aac0d0de55b10c35f"} Apr 20 12:15:29.236503 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:29.236463 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6vjwl" event={"ID":"6d32df24-8beb-401c-9a8a-8999fce03374","Type":"ContainerStarted","Data":"cf4c9c42f6c821dcfc7c68bc5eec577851c67f8b6a184ae17fc9ba1f3dc2ca7f"} Apr 20 12:15:29.237498 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:29.237445 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6vjwl" Apr 20 12:15:29.241617 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:29.241567 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-579895dbf5-cfrwz" 
event={"ID":"9748a2da-a0df-48e0-8d2e-bc1ef97f4fed","Type":"ContainerStarted","Data":"6e2bf0119e2ce7333b209df322c2618f11bb6d1668fe869401b5bf2301152f85"} Apr 20 12:15:29.245552 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:29.245484 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"22d63887-ce5b-4e34-b1a0-9126e4462f30","Type":"ContainerStarted","Data":"d2c198ec574f4f5cb3cd63f888de41b5c4ce71d59e2d8bbb711e318ace206946"} Apr 20 12:15:29.245851 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:29.245830 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6vjwl" Apr 20 12:15:29.247649 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:29.247622 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7b75467d8b-xqm69" event={"ID":"6aeb557f-5fc7-4469-90cb-cfdbee9a0458","Type":"ContainerStarted","Data":"071eff8236e30ebeb8c8a9cd7f1261ce2f88b3cc772d549f61fd045edabfb375"} Apr 20 12:15:29.257948 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:29.257651 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6vjwl" podStartSLOduration=1.99354553 podStartE2EDuration="4.257637293s" podCreationTimestamp="2026-04-20 12:15:25 +0000 UTC" firstStartedPulling="2026-04-20 12:15:26.733785791 +0000 UTC m=+60.453799774" lastFinishedPulling="2026-04-20 12:15:28.997877562 +0000 UTC m=+62.717891537" observedRunningTime="2026-04-20 12:15:29.256289542 +0000 UTC m=+62.976303534" watchObservedRunningTime="2026-04-20 12:15:29.257637293 +0000 UTC m=+62.977651283" Apr 20 12:15:29.273370 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:29.273232 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7b75467d8b-xqm69" podStartSLOduration=1.9328869480000002 podStartE2EDuration="4.273217334s" podCreationTimestamp="2026-04-20 12:15:25 +0000 UTC" firstStartedPulling="2026-04-20 12:15:26.661576553 +0000 UTC m=+60.381590525" lastFinishedPulling="2026-04-20 12:15:29.001906937 +0000 UTC m=+62.721920911" observedRunningTime="2026-04-20 12:15:29.272896203 +0000 UTC m=+62.992910191" watchObservedRunningTime="2026-04-20 12:15:29.273217334 +0000 UTC m=+62.993231319" Apr 20 12:15:30.183067 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:30.183036 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-kffw5" Apr 20 12:15:30.260558 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:30.260508 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"22d63887-ce5b-4e34-b1a0-9126e4462f30","Type":"ContainerStarted","Data":"af8d2dc30ebe23a7828fdf1afe3cc1c74b802b9afd57c18bfa9fba596ff868c1"} Apr 20 12:15:30.261014 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:30.260567 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"22d63887-ce5b-4e34-b1a0-9126e4462f30","Type":"ContainerStarted","Data":"8ab5c449a415ac5e71e668d4988246f0385dd59c573d2bbf00bee7ac88d988e1"} Apr 20 12:15:30.261014 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:30.260582 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"22d63887-ce5b-4e34-b1a0-9126e4462f30","Type":"ContainerStarted","Data":"8eefc7ec18ef90edb2c95d5dc6dd287f1c3cef06d211bee77cc4d4125d35381f"} Apr 20 12:15:30.261014 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:30.260593 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"22d63887-ce5b-4e34-b1a0-9126e4462f30","Type":"ContainerStarted","Data":"6d0ed0cbb45b05ba06f3e3944076fb6d4307b1c3bdcce73723fe4de2665c7965"} Apr 20 12:15:30.261014 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:30.260605 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"22d63887-ce5b-4e34-b1a0-9126e4462f30","Type":"ContainerStarted","Data":"e78046dae12800c6311a4ae131bd9c3a4c907d9eb90758865ffcfc33fa6b8623"} Apr 20 12:15:30.264060 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:30.264031 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-579895dbf5-cfrwz" event={"ID":"9748a2da-a0df-48e0-8d2e-bc1ef97f4fed","Type":"ContainerStarted","Data":"37bfdbc7164baf1eb71cbc41d8c656529a02237f6f312ce0e1a817a9b754c5de"} Apr 20 12:15:30.264219 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:30.264068 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-579895dbf5-cfrwz" event={"ID":"9748a2da-a0df-48e0-8d2e-bc1ef97f4fed","Type":"ContainerStarted","Data":"1bc54826da8c908714a7b576d66746c0b914cfc5ea7040eace592da2f2416776"} Apr 20 12:15:30.290147 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:30.289916 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=3.562631097 podStartE2EDuration="9.289897425s" podCreationTimestamp="2026-04-20 12:15:21 +0000 UTC" firstStartedPulling="2026-04-20 12:15:23.325069732 +0000 UTC m=+57.045083701" lastFinishedPulling="2026-04-20 12:15:29.052336047 +0000 UTC m=+62.772350029" observedRunningTime="2026-04-20 12:15:30.288536603 +0000 UTC m=+64.008550595" watchObservedRunningTime="2026-04-20 12:15:30.289897425 +0000 UTC m=+64.009911421" Apr 20 12:15:30.311938 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:30.311873 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-579895dbf5-cfrwz" podStartSLOduration=2.788513787 podStartE2EDuration="8.311851979s" podCreationTimestamp="2026-04-20 12:15:22 +0000 UTC" firstStartedPulling="2026-04-20 12:15:23.476242451 +0000 UTC m=+57.196256435" lastFinishedPulling="2026-04-20 12:15:28.99958065 +0000 UTC m=+62.719594627" observedRunningTime="2026-04-20 12:15:30.310173916 +0000 UTC m=+64.030187899" watchObservedRunningTime="2026-04-20 12:15:30.311851979 +0000 UTC m=+64.031865970" Apr 20 12:15:30.441253 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:30.441166 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74b68ddfb9-rwkgj"] Apr 20 12:15:30.471174 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:30.471135 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-865b486975-pvr2h"] Apr 20 12:15:30.475787 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:30.475759 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-865b486975-pvr2h" Apr 20 12:15:30.486083 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:30.486053 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-865b486975-pvr2h"] Apr 20 12:15:30.491089 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:30.491057 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e011dacd-0632-4429-8ccc-5da8870c87f4-service-ca\") pod \"console-865b486975-pvr2h\" (UID: \"e011dacd-0632-4429-8ccc-5da8870c87f4\") " pod="openshift-console/console-865b486975-pvr2h" Apr 20 12:15:30.491236 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:30.491110 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e011dacd-0632-4429-8ccc-5da8870c87f4-console-serving-cert\") pod \"console-865b486975-pvr2h\" (UID: \"e011dacd-0632-4429-8ccc-5da8870c87f4\") " pod="openshift-console/console-865b486975-pvr2h" Apr 20 12:15:30.491236 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:30.491148 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e011dacd-0632-4429-8ccc-5da8870c87f4-console-config\") pod \"console-865b486975-pvr2h\" (UID: \"e011dacd-0632-4429-8ccc-5da8870c87f4\") " pod="openshift-console/console-865b486975-pvr2h" Apr 20 12:15:30.491363 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:30.491240 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e011dacd-0632-4429-8ccc-5da8870c87f4-console-oauth-config\") pod \"console-865b486975-pvr2h\" (UID: \"e011dacd-0632-4429-8ccc-5da8870c87f4\") " pod="openshift-console/console-865b486975-pvr2h" Apr 20 12:15:30.491363 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:30.491300 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e011dacd-0632-4429-8ccc-5da8870c87f4-trusted-ca-bundle\") pod \"console-865b486975-pvr2h\" (UID: \"e011dacd-0632-4429-8ccc-5da8870c87f4\") " pod="openshift-console/console-865b486975-pvr2h" Apr 20 12:15:30.491363 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:30.491326 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdt5x\" (UniqueName: \"kubernetes.io/projected/e011dacd-0632-4429-8ccc-5da8870c87f4-kube-api-access-kdt5x\") pod \"console-865b486975-pvr2h\" (UID: \"e011dacd-0632-4429-8ccc-5da8870c87f4\") " pod="openshift-console/console-865b486975-pvr2h" Apr 20 12:15:30.491363 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:30.491360 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e011dacd-0632-4429-8ccc-5da8870c87f4-oauth-serving-cert\") pod \"console-865b486975-pvr2h\" (UID: \"e011dacd-0632-4429-8ccc-5da8870c87f4\") " pod="openshift-console/console-865b486975-pvr2h" Apr 20 12:15:30.592714 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:30.592674 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e011dacd-0632-4429-8ccc-5da8870c87f4-service-ca\") pod 
\"console-865b486975-pvr2h\" (UID: \"e011dacd-0632-4429-8ccc-5da8870c87f4\") " pod="openshift-console/console-865b486975-pvr2h" Apr 20 12:15:30.592901 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:30.592739 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e011dacd-0632-4429-8ccc-5da8870c87f4-console-serving-cert\") pod \"console-865b486975-pvr2h\" (UID: \"e011dacd-0632-4429-8ccc-5da8870c87f4\") " pod="openshift-console/console-865b486975-pvr2h" Apr 20 12:15:30.592901 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:30.592776 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e011dacd-0632-4429-8ccc-5da8870c87f4-console-config\") pod \"console-865b486975-pvr2h\" (UID: \"e011dacd-0632-4429-8ccc-5da8870c87f4\") " pod="openshift-console/console-865b486975-pvr2h" Apr 20 12:15:30.592901 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:30.592830 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e011dacd-0632-4429-8ccc-5da8870c87f4-console-oauth-config\") pod \"console-865b486975-pvr2h\" (UID: \"e011dacd-0632-4429-8ccc-5da8870c87f4\") " pod="openshift-console/console-865b486975-pvr2h" Apr 20 12:15:30.593199 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:30.593157 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e011dacd-0632-4429-8ccc-5da8870c87f4-trusted-ca-bundle\") pod \"console-865b486975-pvr2h\" (UID: \"e011dacd-0632-4429-8ccc-5da8870c87f4\") " pod="openshift-console/console-865b486975-pvr2h" Apr 20 12:15:30.593314 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:30.593207 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdt5x\" (UniqueName: \"kubernetes.io/projected/e011dacd-0632-4429-8ccc-5da8870c87f4-kube-api-access-kdt5x\") pod \"console-865b486975-pvr2h\" (UID: \"e011dacd-0632-4429-8ccc-5da8870c87f4\") " pod="openshift-console/console-865b486975-pvr2h" Apr 20 12:15:30.593314 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:30.593247 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e011dacd-0632-4429-8ccc-5da8870c87f4-oauth-serving-cert\") pod \"console-865b486975-pvr2h\" (UID: \"e011dacd-0632-4429-8ccc-5da8870c87f4\") " pod="openshift-console/console-865b486975-pvr2h" Apr 20 12:15:30.593497 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:30.593474 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e011dacd-0632-4429-8ccc-5da8870c87f4-service-ca\") pod \"console-865b486975-pvr2h\" (UID: \"e011dacd-0632-4429-8ccc-5da8870c87f4\") " pod="openshift-console/console-865b486975-pvr2h" Apr 20 12:15:30.593654 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:30.593634 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e011dacd-0632-4429-8ccc-5da8870c87f4-console-config\") pod \"console-865b486975-pvr2h\" (UID: \"e011dacd-0632-4429-8ccc-5da8870c87f4\") " pod="openshift-console/console-865b486975-pvr2h" Apr 20 12:15:30.593977 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:30.593936 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e011dacd-0632-4429-8ccc-5da8870c87f4-oauth-serving-cert\") pod \"console-865b486975-pvr2h\" (UID: \"e011dacd-0632-4429-8ccc-5da8870c87f4\") " pod="openshift-console/console-865b486975-pvr2h" Apr 20 12:15:30.594079 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:30.594057 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e011dacd-0632-4429-8ccc-5da8870c87f4-trusted-ca-bundle\") pod \"console-865b486975-pvr2h\" (UID: \"e011dacd-0632-4429-8ccc-5da8870c87f4\") " pod="openshift-console/console-865b486975-pvr2h" Apr 20 12:15:30.595653 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:30.595630 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e011dacd-0632-4429-8ccc-5da8870c87f4-console-oauth-config\") pod \"console-865b486975-pvr2h\" (UID: \"e011dacd-0632-4429-8ccc-5da8870c87f4\") " pod="openshift-console/console-865b486975-pvr2h" Apr 20 12:15:30.595824 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:30.595804 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e011dacd-0632-4429-8ccc-5da8870c87f4-console-serving-cert\") pod \"console-865b486975-pvr2h\" (UID: \"e011dacd-0632-4429-8ccc-5da8870c87f4\") " pod="openshift-console/console-865b486975-pvr2h" Apr 20 12:15:30.602644 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:30.602620 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdt5x\" (UniqueName: \"kubernetes.io/projected/e011dacd-0632-4429-8ccc-5da8870c87f4-kube-api-access-kdt5x\") pod \"console-865b486975-pvr2h\" (UID: \"e011dacd-0632-4429-8ccc-5da8870c87f4\") " pod="openshift-console/console-865b486975-pvr2h" Apr 20 12:15:30.787278 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:30.787236 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-865b486975-pvr2h" Apr 20 12:15:31.192687 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:31.192662 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-865b486975-pvr2h"] Apr 20 12:15:31.200410 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:15:31.200368 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode011dacd_0632_4429_8ccc_5da8870c87f4.slice/crio-abb2a051b5af9572a2f32b9aaf8cd8f73ba0aea14e13b362752c0c5c520b4bb9 WatchSource:0}: Error finding container abb2a051b5af9572a2f32b9aaf8cd8f73ba0aea14e13b362752c0c5c520b4bb9: Status 404 returned error can't find the container with id abb2a051b5af9572a2f32b9aaf8cd8f73ba0aea14e13b362752c0c5c520b4bb9 Apr 20 12:15:31.268258 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:31.268213 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-865b486975-pvr2h" event={"ID":"e011dacd-0632-4429-8ccc-5da8870c87f4","Type":"ContainerStarted","Data":"abb2a051b5af9572a2f32b9aaf8cd8f73ba0aea14e13b362752c0c5c520b4bb9"} Apr 20 12:15:31.270115 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:31.270085 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4250d722-a976-44b5-8936-846f53dc7cca","Type":"ContainerStarted","Data":"91a869e4727c73167d92cd5efce7982a8bf1c7f9384ae9f34143ff7c6dd264f1"} Apr 20 12:15:31.270795 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:31.270761 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-579895dbf5-cfrwz" Apr 20 12:15:31.278678 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:31.278658 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-579895dbf5-cfrwz" Apr 20 12:15:31.504488 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:31.504454 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjqlk\" (UniqueName: \"kubernetes.io/projected/deac4f19-5105-40df-bd7a-9d7c576cd705-kube-api-access-qjqlk\") pod \"network-check-target-cz8d9\" (UID: \"deac4f19-5105-40df-bd7a-9d7c576cd705\") " pod="openshift-network-diagnostics/network-check-target-cz8d9" Apr 20 12:15:31.507166 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:31.507135 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 12:15:31.517502 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:31.517479 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 12:15:31.527789 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:31.527753 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjqlk\" (UniqueName: \"kubernetes.io/projected/deac4f19-5105-40df-bd7a-9d7c576cd705-kube-api-access-qjqlk\") pod \"network-check-target-cz8d9\" (UID: \"deac4f19-5105-40df-bd7a-9d7c576cd705\") " pod="openshift-network-diagnostics/network-check-target-cz8d9" Apr 20 12:15:31.606037 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:31.605965 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/430e704c-5d70-4df6-baaa-2296216f1239-metrics-certs\") pod \"network-metrics-daemon-nm452\" (UID: 
\"430e704c-5d70-4df6-baaa-2296216f1239\") " pod="openshift-multus/network-metrics-daemon-nm452" Apr 20 12:15:31.608656 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:31.608616 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 12:15:31.618473 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:31.618439 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/430e704c-5d70-4df6-baaa-2296216f1239-metrics-certs\") pod \"network-metrics-daemon-nm452\" (UID: \"430e704c-5d70-4df6-baaa-2296216f1239\") " pod="openshift-multus/network-metrics-daemon-nm452" Apr 20 12:15:31.659259 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:31.659230 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kwdzh\"" Apr 20 12:15:31.665936 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:31.665914 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-p76zv\"" Apr 20 12:15:31.667509 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:31.667493 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nm452" Apr 20 12:15:31.674130 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:31.674104 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cz8d9" Apr 20 12:15:31.802224 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:31.802194 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nm452"] Apr 20 12:15:31.804590 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:15:31.804557 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod430e704c_5d70_4df6_baaa_2296216f1239.slice/crio-8324e519cb15259d9f60412d529cb980889d424350b820371c2257264a29adb8 WatchSource:0}: Error finding container 8324e519cb15259d9f60412d529cb980889d424350b820371c2257264a29adb8: Status 404 returned error can't find the container with id 8324e519cb15259d9f60412d529cb980889d424350b820371c2257264a29adb8 Apr 20 12:15:31.819578 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:31.819451 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-cz8d9"] Apr 20 12:15:31.822281 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:15:31.822252 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddeac4f19_5105_40df_bd7a_9d7c576cd705.slice/crio-b98e6890aa360f57baacd9ab6e1f17ac45b5bcad7ab9ab1c780b787b317b6267 WatchSource:0}: Error finding container b98e6890aa360f57baacd9ab6e1f17ac45b5bcad7ab9ab1c780b787b317b6267: Status 404 returned error can't find the container with id b98e6890aa360f57baacd9ab6e1f17ac45b5bcad7ab9ab1c780b787b317b6267 Apr 20 12:15:32.275002 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:32.274963 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-865b486975-pvr2h" event={"ID":"e011dacd-0632-4429-8ccc-5da8870c87f4","Type":"ContainerStarted","Data":"bbd06af96ee89f4ea09c18ef8f4994a36200485bc2b3fc5c1633fe2a6a2daea4"} Apr 20 12:15:32.276214 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:32.276190 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-cz8d9" event={"ID":"deac4f19-5105-40df-bd7a-9d7c576cd705","Type":"ContainerStarted","Data":"b98e6890aa360f57baacd9ab6e1f17ac45b5bcad7ab9ab1c780b787b317b6267"} Apr 20 12:15:32.279358 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:32.279325 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4250d722-a976-44b5-8936-846f53dc7cca","Type":"ContainerStarted","Data":"d0f8ca4501c8467c6884b3be53f26cab5d7579b43507e2ea45f2122f0eeddd1a"} Apr 20 12:15:32.279358 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:32.279355 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4250d722-a976-44b5-8936-846f53dc7cca","Type":"ContainerStarted","Data":"fa280989b6275627b05908c174872f1e9009812bfd1e598f8ca7fee6498e5216"} Apr 20 12:15:32.279533 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:32.279367 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4250d722-a976-44b5-8936-846f53dc7cca","Type":"ContainerStarted","Data":"c0363d3e6fd2690cde36d8af948483224cb372cc4dd5e636bb6850c808f31138"} Apr 20 12:15:32.279533 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:32.279379 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4250d722-a976-44b5-8936-846f53dc7cca","Type":"ContainerStarted","Data":"16ca6f078d59528a6292161eb2a4883d5d2e63dfd6681b639f009150b68ba684"} Apr 20 12:15:32.279533 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:32.279390 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4250d722-a976-44b5-8936-846f53dc7cca","Type":"ContainerStarted","Data":"fcd9e97e1b219572a49b9fdb8b3c160aff2c81c5a4269c522f4d6bdf11c30816"} Apr 20 12:15:32.280437 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:32.280414 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nm452" event={"ID":"430e704c-5d70-4df6-baaa-2296216f1239","Type":"ContainerStarted","Data":"8324e519cb15259d9f60412d529cb980889d424350b820371c2257264a29adb8"} Apr 20 12:15:32.293265 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:32.293220 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-865b486975-pvr2h" podStartSLOduration=2.293204918 podStartE2EDuration="2.293204918s" podCreationTimestamp="2026-04-20 12:15:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 12:15:32.291716534 +0000 UTC m=+66.011730525" watchObservedRunningTime="2026-04-20 12:15:32.293204918 +0000 UTC m=+66.013218909" Apr 20 12:15:32.319938 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:32.319858 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.412421216 podStartE2EDuration="5.319838659s" podCreationTimestamp="2026-04-20 12:15:27 +0000 UTC" firstStartedPulling="2026-04-20 12:15:28.232219894 +0000 UTC m=+61.952233878" lastFinishedPulling="2026-04-20 12:15:31.139637352 +0000 UTC m=+64.859651321" observedRunningTime="2026-04-20 12:15:32.317916956 +0000 UTC m=+66.037930948" watchObservedRunningTime="2026-04-20 12:15:32.319838659 +0000 UTC m=+66.039852649" Apr 20 12:15:32.418507 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:32.418464 2580 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:15:32.730930 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:32.730831 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-74b68ddfb9-rwkgj" Apr 20 12:15:34.288989 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:34.288950 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nm452" event={"ID":"430e704c-5d70-4df6-baaa-2296216f1239","Type":"ContainerStarted","Data":"06581238054c5000930c28358cd79d3086578c4b01b0a57ba75caabb75357750"} Apr 20 12:15:34.289506 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:34.288997 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nm452" event={"ID":"430e704c-5d70-4df6-baaa-2296216f1239","Type":"ContainerStarted","Data":"50d153fe2ea7688a51301d857ae221f1f5bf1e447146d5bb1ffdf3bda31c91ac"} Apr 20 12:15:34.307252 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:34.307195 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-nm452" podStartSLOduration=65.732544639 podStartE2EDuration="1m7.307178022s" podCreationTimestamp="2026-04-20 12:14:27 +0000 UTC" firstStartedPulling="2026-04-20 12:15:31.806484388 +0000 UTC m=+65.526498358" lastFinishedPulling="2026-04-20 12:15:33.381117756 +0000 UTC m=+67.101131741" observedRunningTime="2026-04-20 12:15:34.305794065 +0000 UTC m=+68.025808068" watchObservedRunningTime="2026-04-20 12:15:34.307178022 +0000 UTC m=+68.027192013" Apr 20 12:15:36.296698 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:36.296663 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-cz8d9" event={"ID":"deac4f19-5105-40df-bd7a-9d7c576cd705","Type":"ContainerStarted","Data":"e0e4faf2563281b5f71bd9dc4dc4ad29a58a729d860f1ef328a3b8a783f0eaab"} Apr 20 12:15:36.297162 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:36.296802 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-cz8d9" Apr 20 12:15:36.313446 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:36.313361 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-cz8d9" podStartSLOduration=66.892429442 podStartE2EDuration="1m10.313338876s" podCreationTimestamp="2026-04-20 12:14:26 +0000 UTC" firstStartedPulling="2026-04-20 12:15:31.824102713 +0000 UTC m=+65.544116681" lastFinishedPulling="2026-04-20 12:15:35.245012145 +0000 UTC m=+68.965026115" observedRunningTime="2026-04-20 12:15:36.312184812 +0000 UTC m=+70.032198825" watchObservedRunningTime="2026-04-20 12:15:36.313338876 +0000 UTC m=+70.033352870" Apr 20 12:15:40.788189 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:40.788149 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-865b486975-pvr2h" Apr 20 12:15:40.788758 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:40.788203 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-865b486975-pvr2h" Apr 20 12:15:40.793044 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:40.793022 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-865b486975-pvr2h" Apr 20 12:15:41.320099 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:41.320073 
2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-865b486975-pvr2h" Apr 20 12:15:45.661125 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:45.660989 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7b75467d8b-xqm69" Apr 20 12:15:45.661125 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:45.661038 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-7b75467d8b-xqm69" Apr 20 12:15:55.463660 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:55.463594 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-74b68ddfb9-rwkgj" podUID="69db919e-8d10-40f9-b4d7-b37130d3ab45" containerName="console" containerID="cri-o://519d3fc90dd053bf2937ab28bdb79dae21b18cc5184fbff0442ebfe5c6f3d75f" gracePeriod=15 Apr 20 12:15:55.709146 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:55.709120 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74b68ddfb9-rwkgj_69db919e-8d10-40f9-b4d7-b37130d3ab45/console/0.log" Apr 20 12:15:55.709265 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:55.709193 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74b68ddfb9-rwkgj" Apr 20 12:15:55.831207 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:55.831161 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/69db919e-8d10-40f9-b4d7-b37130d3ab45-console-oauth-config\") pod \"69db919e-8d10-40f9-b4d7-b37130d3ab45\" (UID: \"69db919e-8d10-40f9-b4d7-b37130d3ab45\") " Apr 20 12:15:55.831207 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:55.831209 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/69db919e-8d10-40f9-b4d7-b37130d3ab45-oauth-serving-cert\") pod \"69db919e-8d10-40f9-b4d7-b37130d3ab45\" (UID: \"69db919e-8d10-40f9-b4d7-b37130d3ab45\") " Apr 20 12:15:55.831445 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:55.831236 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/69db919e-8d10-40f9-b4d7-b37130d3ab45-console-config\") pod \"69db919e-8d10-40f9-b4d7-b37130d3ab45\" (UID: \"69db919e-8d10-40f9-b4d7-b37130d3ab45\") " Apr 20 12:15:55.831445 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:55.831389 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9mtz\" (UniqueName: \"kubernetes.io/projected/69db919e-8d10-40f9-b4d7-b37130d3ab45-kube-api-access-b9mtz\") pod \"69db919e-8d10-40f9-b4d7-b37130d3ab45\" (UID: \"69db919e-8d10-40f9-b4d7-b37130d3ab45\") " Apr 20 12:15:55.831528 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:55.831450 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69db919e-8d10-40f9-b4d7-b37130d3ab45-trusted-ca-bundle\") pod \"69db919e-8d10-40f9-b4d7-b37130d3ab45\" (UID: \"69db919e-8d10-40f9-b4d7-b37130d3ab45\") " Apr 20 12:15:55.831528 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:55.831491 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/69db919e-8d10-40f9-b4d7-b37130d3ab45-service-ca\") 
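[Annotation] The gracePeriod=15 in the "Killing container with a grace period" entry above (and the gracePeriod=120 kills on alertmanager-main-0 further down) is the number of seconds the runtime is given before a forced kill. A simplified, illustrative Go sketch of how such a value could be selected; the real kubelet's logic also covers probe-triggered kills and eviction overrides, so treat this purely as an assumption-labeled model:

```go
package main

import "fmt"

// podMeta is a stand-in for the pod fields that feed grace-period selection.
type podMeta struct {
	deletionGracePeriodSeconds *int64 // set by the API server on DELETE, if any
	terminationGracePeriod     int64  // spec.terminationGracePeriodSeconds
}

// gracePeriod picks an explicit override first, then the API-server-set
// deletion grace period, then the pod spec's default.
func gracePeriod(p podMeta, override *int64) int64 {
	switch {
	case override != nil: // e.g. a delete request with an explicit grace period
		return *override
	case p.deletionGracePeriodSeconds != nil:
		return *p.deletionGracePeriodSeconds
	default:
		return p.terminationGracePeriod
	}
}

func main() {
	// Assumed inputs chosen to reproduce the two values seen in this journal.
	console := podMeta{terminationGracePeriod: 15}
	alertmanager := podMeta{terminationGracePeriod: 120}
	fmt.Println(gracePeriod(console, nil), gracePeriod(alertmanager, nil)) // 15 120
}
```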
pod \"69db919e-8d10-40f9-b4d7-b37130d3ab45\" (UID: \"69db919e-8d10-40f9-b4d7-b37130d3ab45\") " Apr 20 12:15:55.831606 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:55.831583 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/69db919e-8d10-40f9-b4d7-b37130d3ab45-console-serving-cert\") pod \"69db919e-8d10-40f9-b4d7-b37130d3ab45\" (UID: \"69db919e-8d10-40f9-b4d7-b37130d3ab45\") " Apr 20 12:15:55.831663 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:55.831617 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69db919e-8d10-40f9-b4d7-b37130d3ab45-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "69db919e-8d10-40f9-b4d7-b37130d3ab45" (UID: "69db919e-8d10-40f9-b4d7-b37130d3ab45"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 12:15:55.831723 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:55.831656 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69db919e-8d10-40f9-b4d7-b37130d3ab45-console-config" (OuterVolumeSpecName: "console-config") pod "69db919e-8d10-40f9-b4d7-b37130d3ab45" (UID: "69db919e-8d10-40f9-b4d7-b37130d3ab45"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 12:15:55.831927 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:55.831899 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69db919e-8d10-40f9-b4d7-b37130d3ab45-service-ca" (OuterVolumeSpecName: "service-ca") pod "69db919e-8d10-40f9-b4d7-b37130d3ab45" (UID: "69db919e-8d10-40f9-b4d7-b37130d3ab45"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 12:15:55.832023 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:55.831938 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69db919e-8d10-40f9-b4d7-b37130d3ab45-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "69db919e-8d10-40f9-b4d7-b37130d3ab45" (UID: "69db919e-8d10-40f9-b4d7-b37130d3ab45"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 12:15:55.832087 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:55.832028 2580 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/69db919e-8d10-40f9-b4d7-b37130d3ab45-console-config\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 12:15:55.832087 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:55.832048 2580 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69db919e-8d10-40f9-b4d7-b37130d3ab45-trusted-ca-bundle\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 12:15:55.832087 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:55.832064 2580 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/69db919e-8d10-40f9-b4d7-b37130d3ab45-service-ca\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 12:15:55.832087 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:55.832078 2580 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/69db919e-8d10-40f9-b4d7-b37130d3ab45-oauth-serving-cert\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 12:15:55.833720 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:55.833696 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69db919e-8d10-40f9-b4d7-b37130d3ab45-kube-api-access-b9mtz" (OuterVolumeSpecName: "kube-api-access-b9mtz") pod "69db919e-8d10-40f9-b4d7-b37130d3ab45" (UID: "69db919e-8d10-40f9-b4d7-b37130d3ab45"). InnerVolumeSpecName "kube-api-access-b9mtz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 12:15:55.833720 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:55.833707 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69db919e-8d10-40f9-b4d7-b37130d3ab45-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "69db919e-8d10-40f9-b4d7-b37130d3ab45" (UID: "69db919e-8d10-40f9-b4d7-b37130d3ab45"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 12:15:55.833873 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:55.833724 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69db919e-8d10-40f9-b4d7-b37130d3ab45-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "69db919e-8d10-40f9-b4d7-b37130d3ab45" (UID: "69db919e-8d10-40f9-b4d7-b37130d3ab45"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 12:15:55.932656 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:55.932615 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b9mtz\" (UniqueName: \"kubernetes.io/projected/69db919e-8d10-40f9-b4d7-b37130d3ab45-kube-api-access-b9mtz\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 12:15:55.932656 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:55.932648 2580 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/69db919e-8d10-40f9-b4d7-b37130d3ab45-console-serving-cert\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 12:15:55.932656 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:55.932661 2580 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/69db919e-8d10-40f9-b4d7-b37130d3ab45-console-oauth-config\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 12:15:56.361661 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:56.361630 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74b68ddfb9-rwkgj_69db919e-8d10-40f9-b4d7-b37130d3ab45/console/0.log" Apr 20 12:15:56.361840 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:56.361678 2580 generic.go:358] "Generic (PLEG): container finished" podID="69db919e-8d10-40f9-b4d7-b37130d3ab45" containerID="519d3fc90dd053bf2937ab28bdb79dae21b18cc5184fbff0442ebfe5c6f3d75f" exitCode=2 Apr 20 12:15:56.361840 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:56.361745 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74b68ddfb9-rwkgj" Apr 20 12:15:56.361840 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:56.361767 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74b68ddfb9-rwkgj" event={"ID":"69db919e-8d10-40f9-b4d7-b37130d3ab45","Type":"ContainerDied","Data":"519d3fc90dd053bf2937ab28bdb79dae21b18cc5184fbff0442ebfe5c6f3d75f"} Apr 20 12:15:56.361840 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:56.361802 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74b68ddfb9-rwkgj" event={"ID":"69db919e-8d10-40f9-b4d7-b37130d3ab45","Type":"ContainerDied","Data":"0b49a58a2a85c0166f33f7b4a9bcbec43efb1fb4f9f8472040c357eb2d428835"} Apr 20 12:15:56.361840 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:56.361817 2580 scope.go:117] "RemoveContainer" containerID="519d3fc90dd053bf2937ab28bdb79dae21b18cc5184fbff0442ebfe5c6f3d75f" Apr 20 12:15:56.370435 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:56.370392 2580 scope.go:117] "RemoveContainer" containerID="519d3fc90dd053bf2937ab28bdb79dae21b18cc5184fbff0442ebfe5c6f3d75f" Apr 20 12:15:56.370678 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:15:56.370658 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"519d3fc90dd053bf2937ab28bdb79dae21b18cc5184fbff0442ebfe5c6f3d75f\": container with ID starting with 519d3fc90dd053bf2937ab28bdb79dae21b18cc5184fbff0442ebfe5c6f3d75f not found: ID does not exist" containerID="519d3fc90dd053bf2937ab28bdb79dae21b18cc5184fbff0442ebfe5c6f3d75f" Apr 20 12:15:56.370725 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:56.370686 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"519d3fc90dd053bf2937ab28bdb79dae21b18cc5184fbff0442ebfe5c6f3d75f"} err="failed to get 
container status \"519d3fc90dd053bf2937ab28bdb79dae21b18cc5184fbff0442ebfe5c6f3d75f\": rpc error: code = NotFound desc = could not find container \"519d3fc90dd053bf2937ab28bdb79dae21b18cc5184fbff0442ebfe5c6f3d75f\": container with ID starting with 519d3fc90dd053bf2937ab28bdb79dae21b18cc5184fbff0442ebfe5c6f3d75f not found: ID does not exist" Apr 20 12:15:56.381594 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:56.381569 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74b68ddfb9-rwkgj"] Apr 20 12:15:56.384887 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:56.384868 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-74b68ddfb9-rwkgj"] Apr 20 12:15:56.847105 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:15:56.847074 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69db919e-8d10-40f9-b4d7-b37130d3ab45" path="/var/lib/kubelet/pods/69db919e-8d10-40f9-b4d7-b37130d3ab45/volumes" Apr 20 12:16:05.666874 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:05.666843 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7b75467d8b-xqm69" Apr 20 12:16:05.671088 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:05.671062 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7b75467d8b-xqm69" Apr 20 12:16:07.306892 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:07.306861 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-cz8d9" Apr 20 12:16:27.419375 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:27.419338 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:27.438878 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:27.438851 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:27.469253 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:27.469222 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:41.149885 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:41.149839 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 12:16:41.150473 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:41.150428 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="22d63887-ce5b-4e34-b1a0-9126e4462f30" containerName="alertmanager" containerID="cri-o://d2c198ec574f4f5cb3cd63f888de41b5c4ce71d59e2d8bbb711e318ace206946" gracePeriod=120 Apr 20 12:16:41.150565 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:41.150482 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="22d63887-ce5b-4e34-b1a0-9126e4462f30" containerName="kube-rbac-proxy-metric" containerID="cri-o://8ab5c449a415ac5e71e668d4988246f0385dd59c573d2bbf00bee7ac88d988e1" gracePeriod=120 Apr 20 12:16:41.150565 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:41.150519 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="22d63887-ce5b-4e34-b1a0-9126e4462f30" containerName="kube-rbac-proxy-web" 
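[Annotation] The RemoveContainer → "ContainerStatus from runtime service failed" → "DeleteContainer returned error" sequence above is benign: the container was already removed, and the CRI runtime reports it as gRPC NotFound (rpc error: code = NotFound), which the caller can treat as "already done". A hedged sketch of distinguishing that case with the standard gRPC status package; the helper name is mine, and the kubelet's actual handling lives elsewhere:

```go
package main

import (
	"errors"
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// alreadyGone reports whether a CRI error is the benign "container ... not
// found: ID does not exist" case seen in the journal above.
func alreadyGone(err error) bool {
	if err == nil {
		return false
	}
	if s, ok := status.FromError(err); ok {
		return s.Code() == codes.NotFound
	}
	return false
}

func main() {
	// Simulate the error shape from the journal entry.
	err := status.Error(codes.NotFound,
		`could not find container "519d3fc9...": ID does not exist`)
	fmt.Println(alreadyGone(err))                       // true: safe to treat the delete as complete
	fmt.Println(alreadyGone(errors.New("rpc timeout"))) // false: worth surfacing
}
```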
containerID="cri-o://6d0ed0cbb45b05ba06f3e3944076fb6d4307b1c3bdcce73723fe4de2665c7965" gracePeriod=120 Apr 20 12:16:41.150565 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:41.150523 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="22d63887-ce5b-4e34-b1a0-9126e4462f30" containerName="prom-label-proxy" containerID="cri-o://af8d2dc30ebe23a7828fdf1afe3cc1c74b802b9afd57c18bfa9fba596ff868c1" gracePeriod=120 Apr 20 12:16:41.150716 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:41.150545 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="22d63887-ce5b-4e34-b1a0-9126e4462f30" containerName="config-reloader" containerID="cri-o://e78046dae12800c6311a4ae131bd9c3a4c907d9eb90758865ffcfc33fa6b8623" gracePeriod=120 Apr 20 12:16:41.150716 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:41.150599 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="22d63887-ce5b-4e34-b1a0-9126e4462f30" containerName="kube-rbac-proxy" containerID="cri-o://8eefc7ec18ef90edb2c95d5dc6dd287f1c3cef06d211bee77cc4d4125d35381f" gracePeriod=120 Apr 20 12:16:41.496154 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:41.496062 2580 generic.go:358] "Generic (PLEG): container finished" podID="22d63887-ce5b-4e34-b1a0-9126e4462f30" containerID="af8d2dc30ebe23a7828fdf1afe3cc1c74b802b9afd57c18bfa9fba596ff868c1" exitCode=0 Apr 20 12:16:41.496154 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:41.496094 2580 generic.go:358] "Generic (PLEG): container finished" podID="22d63887-ce5b-4e34-b1a0-9126e4462f30" containerID="8ab5c449a415ac5e71e668d4988246f0385dd59c573d2bbf00bee7ac88d988e1" exitCode=0 Apr 20 12:16:41.496154 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:41.496101 2580 generic.go:358] "Generic (PLEG): container finished" podID="22d63887-ce5b-4e34-b1a0-9126e4462f30" containerID="8eefc7ec18ef90edb2c95d5dc6dd287f1c3cef06d211bee77cc4d4125d35381f" exitCode=0 Apr 20 12:16:41.496154 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:41.496107 2580 generic.go:358] "Generic (PLEG): container finished" podID="22d63887-ce5b-4e34-b1a0-9126e4462f30" containerID="e78046dae12800c6311a4ae131bd9c3a4c907d9eb90758865ffcfc33fa6b8623" exitCode=0 Apr 20 12:16:41.496154 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:41.496112 2580 generic.go:358] "Generic (PLEG): container finished" podID="22d63887-ce5b-4e34-b1a0-9126e4462f30" containerID="d2c198ec574f4f5cb3cd63f888de41b5c4ce71d59e2d8bbb711e318ace206946" exitCode=0 Apr 20 12:16:41.496154 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:41.496133 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"22d63887-ce5b-4e34-b1a0-9126e4462f30","Type":"ContainerDied","Data":"af8d2dc30ebe23a7828fdf1afe3cc1c74b802b9afd57c18bfa9fba596ff868c1"} Apr 20 12:16:41.496494 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:41.496165 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"22d63887-ce5b-4e34-b1a0-9126e4462f30","Type":"ContainerDied","Data":"8ab5c449a415ac5e71e668d4988246f0385dd59c573d2bbf00bee7ac88d988e1"} Apr 20 12:16:41.496494 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:41.496176 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"22d63887-ce5b-4e34-b1a0-9126e4462f30","Type":"ContainerDied","Data":"8eefc7ec18ef90edb2c95d5dc6dd287f1c3cef06d211bee77cc4d4125d35381f"} Apr 20 12:16:41.496494 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:41.496184 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"22d63887-ce5b-4e34-b1a0-9126e4462f30","Type":"ContainerDied","Data":"e78046dae12800c6311a4ae131bd9c3a4c907d9eb90758865ffcfc33fa6b8623"} Apr 20 12:16:41.496494 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:41.496193 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"22d63887-ce5b-4e34-b1a0-9126e4462f30","Type":"ContainerDied","Data":"d2c198ec574f4f5cb3cd63f888de41b5c4ce71d59e2d8bbb711e318ace206946"} Apr 20 12:16:42.401451 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.401426 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:16:42.502017 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.501928 2580 generic.go:358] "Generic (PLEG): container finished" podID="22d63887-ce5b-4e34-b1a0-9126e4462f30" containerID="6d0ed0cbb45b05ba06f3e3944076fb6d4307b1c3bdcce73723fe4de2665c7965" exitCode=0 Apr 20 12:16:42.502017 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.501972 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"22d63887-ce5b-4e34-b1a0-9126e4462f30","Type":"ContainerDied","Data":"6d0ed0cbb45b05ba06f3e3944076fb6d4307b1c3bdcce73723fe4de2665c7965"} Apr 20 12:16:42.502017 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.502000 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"22d63887-ce5b-4e34-b1a0-9126e4462f30","Type":"ContainerDied","Data":"d6d65d1aef5bf8278d17d3ad01cc858b5d09b4df16c23d3ccccaf314f66d325a"} Apr 20 12:16:42.502017 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.502019 2580 scope.go:117] "RemoveContainer" containerID="af8d2dc30ebe23a7828fdf1afe3cc1c74b802b9afd57c18bfa9fba596ff868c1" Apr 20 12:16:42.502281 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.502049 2580 util.go:48] "No ready sandbox for pod can be found. 
Apr 20 12:16:42.511624 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.511547 2580 scope.go:117] "RemoveContainer" containerID="8ab5c449a415ac5e71e668d4988246f0385dd59c573d2bbf00bee7ac88d988e1"
Apr 20 12:16:42.518680 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.518662 2580 scope.go:117] "RemoveContainer" containerID="8eefc7ec18ef90edb2c95d5dc6dd287f1c3cef06d211bee77cc4d4125d35381f"
Apr 20 12:16:42.524797 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.524767 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-secret-alertmanager-kube-rbac-proxy-web\") pod \"22d63887-ce5b-4e34-b1a0-9126e4462f30\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") "
Apr 20 12:16:42.524920 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.524814 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/22d63887-ce5b-4e34-b1a0-9126e4462f30-metrics-client-ca\") pod \"22d63887-ce5b-4e34-b1a0-9126e4462f30\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") "
Apr 20 12:16:42.524920 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.524876 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-web-config\") pod \"22d63887-ce5b-4e34-b1a0-9126e4462f30\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") "
Apr 20 12:16:42.524920 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.524903 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-secret-alertmanager-main-tls\") pod \"22d63887-ce5b-4e34-b1a0-9126e4462f30\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") "
Apr 20 12:16:42.525089 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.524926 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-cluster-tls-config\") pod \"22d63887-ce5b-4e34-b1a0-9126e4462f30\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") "
Apr 20 12:16:42.525089 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.524985 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-secret-alertmanager-kube-rbac-proxy\") pod \"22d63887-ce5b-4e34-b1a0-9126e4462f30\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") "
Apr 20 12:16:42.525089 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.525010 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/22d63887-ce5b-4e34-b1a0-9126e4462f30-alertmanager-main-db\") pod \"22d63887-ce5b-4e34-b1a0-9126e4462f30\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") "
Apr 20 12:16:42.525089 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.525050 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22d63887-ce5b-4e34-b1a0-9126e4462f30-alertmanager-trusted-ca-bundle\") pod \"22d63887-ce5b-4e34-b1a0-9126e4462f30\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") "
Apr 20 12:16:42.525089 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.525078 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/22d63887-ce5b-4e34-b1a0-9126e4462f30-config-out\") pod \"22d63887-ce5b-4e34-b1a0-9126e4462f30\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") "
Apr 20 12:16:42.525326 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.525105 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vg68\" (UniqueName: \"kubernetes.io/projected/22d63887-ce5b-4e34-b1a0-9126e4462f30-kube-api-access-5vg68\") pod \"22d63887-ce5b-4e34-b1a0-9126e4462f30\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") "
Apr 20 12:16:42.525326 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.525136 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-config-volume\") pod \"22d63887-ce5b-4e34-b1a0-9126e4462f30\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") "
Apr 20 12:16:42.525326 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.525167 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/22d63887-ce5b-4e34-b1a0-9126e4462f30-tls-assets\") pod \"22d63887-ce5b-4e34-b1a0-9126e4462f30\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") "
Apr 20 12:16:42.525326 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.525209 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-secret-alertmanager-kube-rbac-proxy-metric\") pod \"22d63887-ce5b-4e34-b1a0-9126e4462f30\" (UID: \"22d63887-ce5b-4e34-b1a0-9126e4462f30\") "
Apr 20 12:16:42.525326 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.525237 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22d63887-ce5b-4e34-b1a0-9126e4462f30-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "22d63887-ce5b-4e34-b1a0-9126e4462f30" (UID: "22d63887-ce5b-4e34-b1a0-9126e4462f30"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 12:16:42.525575 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.525492 2580 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/22d63887-ce5b-4e34-b1a0-9126e4462f30-metrics-client-ca\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\""
Apr 20 12:16:42.525575 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.525496 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22d63887-ce5b-4e34-b1a0-9126e4462f30-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "22d63887-ce5b-4e34-b1a0-9126e4462f30" (UID: "22d63887-ce5b-4e34-b1a0-9126e4462f30"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 12:16:42.526298 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.526274 2580 scope.go:117] "RemoveContainer" containerID="6d0ed0cbb45b05ba06f3e3944076fb6d4307b1c3bdcce73723fe4de2665c7965"
Apr 20 12:16:42.526569 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.526543 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22d63887-ce5b-4e34-b1a0-9126e4462f30-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "22d63887-ce5b-4e34-b1a0-9126e4462f30" (UID: "22d63887-ce5b-4e34-b1a0-9126e4462f30"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 12:16:42.527714 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.527686 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "22d63887-ce5b-4e34-b1a0-9126e4462f30" (UID: "22d63887-ce5b-4e34-b1a0-9126e4462f30"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 12:16:42.527916 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.527885 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22d63887-ce5b-4e34-b1a0-9126e4462f30-kube-api-access-5vg68" (OuterVolumeSpecName: "kube-api-access-5vg68") pod "22d63887-ce5b-4e34-b1a0-9126e4462f30" (UID: "22d63887-ce5b-4e34-b1a0-9126e4462f30"). InnerVolumeSpecName "kube-api-access-5vg68". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 12:16:42.528818 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.528770 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22d63887-ce5b-4e34-b1a0-9126e4462f30-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "22d63887-ce5b-4e34-b1a0-9126e4462f30" (UID: "22d63887-ce5b-4e34-b1a0-9126e4462f30"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 12:16:42.528818 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.528789 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "22d63887-ce5b-4e34-b1a0-9126e4462f30" (UID: "22d63887-ce5b-4e34-b1a0-9126e4462f30"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 12:16:42.529286 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.529263 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "22d63887-ce5b-4e34-b1a0-9126e4462f30" (UID: "22d63887-ce5b-4e34-b1a0-9126e4462f30"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 12:16:42.529376 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.529333 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "22d63887-ce5b-4e34-b1a0-9126e4462f30" (UID: "22d63887-ce5b-4e34-b1a0-9126e4462f30"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 12:16:42.529935 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.529886 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-config-volume" (OuterVolumeSpecName: "config-volume") pod "22d63887-ce5b-4e34-b1a0-9126e4462f30" (UID: "22d63887-ce5b-4e34-b1a0-9126e4462f30"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 12:16:42.530022 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.529970 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22d63887-ce5b-4e34-b1a0-9126e4462f30-config-out" (OuterVolumeSpecName: "config-out") pod "22d63887-ce5b-4e34-b1a0-9126e4462f30" (UID: "22d63887-ce5b-4e34-b1a0-9126e4462f30"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 12:16:42.533526 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.533503 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "22d63887-ce5b-4e34-b1a0-9126e4462f30" (UID: "22d63887-ce5b-4e34-b1a0-9126e4462f30"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 12:16:42.540881 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.540849 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-web-config" (OuterVolumeSpecName: "web-config") pod "22d63887-ce5b-4e34-b1a0-9126e4462f30" (UID: "22d63887-ce5b-4e34-b1a0-9126e4462f30"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 12:16:42.546796 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.546774 2580 scope.go:117] "RemoveContainer" containerID="e78046dae12800c6311a4ae131bd9c3a4c907d9eb90758865ffcfc33fa6b8623" Apr 20 12:16:42.554082 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.554063 2580 scope.go:117] "RemoveContainer" containerID="d2c198ec574f4f5cb3cd63f888de41b5c4ce71d59e2d8bbb711e318ace206946" Apr 20 12:16:42.560992 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.560972 2580 scope.go:117] "RemoveContainer" containerID="df0b8e8b535eae8194ac8abc77bec7b24dd90efe1bc7b828ce8559be09561e9d" Apr 20 12:16:42.567633 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.567611 2580 scope.go:117] "RemoveContainer" containerID="af8d2dc30ebe23a7828fdf1afe3cc1c74b802b9afd57c18bfa9fba596ff868c1" Apr 20 12:16:42.567896 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:16:42.567875 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af8d2dc30ebe23a7828fdf1afe3cc1c74b802b9afd57c18bfa9fba596ff868c1\": container with ID starting with af8d2dc30ebe23a7828fdf1afe3cc1c74b802b9afd57c18bfa9fba596ff868c1 not found: ID does not exist" containerID="af8d2dc30ebe23a7828fdf1afe3cc1c74b802b9afd57c18bfa9fba596ff868c1" Apr 20 12:16:42.567946 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.567909 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af8d2dc30ebe23a7828fdf1afe3cc1c74b802b9afd57c18bfa9fba596ff868c1"} err="failed to get container status \"af8d2dc30ebe23a7828fdf1afe3cc1c74b802b9afd57c18bfa9fba596ff868c1\": rpc error: code = NotFound desc = could not find container \"af8d2dc30ebe23a7828fdf1afe3cc1c74b802b9afd57c18bfa9fba596ff868c1\": container with ID starting with af8d2dc30ebe23a7828fdf1afe3cc1c74b802b9afd57c18bfa9fba596ff868c1 not found: ID does not exist" Apr 20 12:16:42.567946 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.567928 2580 scope.go:117] "RemoveContainer" containerID="8ab5c449a415ac5e71e668d4988246f0385dd59c573d2bbf00bee7ac88d988e1" Apr 20 12:16:42.568145 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:16:42.568127 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ab5c449a415ac5e71e668d4988246f0385dd59c573d2bbf00bee7ac88d988e1\": container with ID starting with 8ab5c449a415ac5e71e668d4988246f0385dd59c573d2bbf00bee7ac88d988e1 not found: ID does not exist" containerID="8ab5c449a415ac5e71e668d4988246f0385dd59c573d2bbf00bee7ac88d988e1" Apr 20 12:16:42.568186 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.568149 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ab5c449a415ac5e71e668d4988246f0385dd59c573d2bbf00bee7ac88d988e1"} err="failed to get container status \"8ab5c449a415ac5e71e668d4988246f0385dd59c573d2bbf00bee7ac88d988e1\": rpc error: code = NotFound desc = could not find container \"8ab5c449a415ac5e71e668d4988246f0385dd59c573d2bbf00bee7ac88d988e1\": container with ID starting with 8ab5c449a415ac5e71e668d4988246f0385dd59c573d2bbf00bee7ac88d988e1 not found: ID does not exist" Apr 20 12:16:42.568186 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.568164 2580 scope.go:117] "RemoveContainer" containerID="8eefc7ec18ef90edb2c95d5dc6dd287f1c3cef06d211bee77cc4d4125d35381f" Apr 20 12:16:42.568345 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:16:42.568331 2580 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"8eefc7ec18ef90edb2c95d5dc6dd287f1c3cef06d211bee77cc4d4125d35381f\": container with ID starting with 8eefc7ec18ef90edb2c95d5dc6dd287f1c3cef06d211bee77cc4d4125d35381f not found: ID does not exist" containerID="8eefc7ec18ef90edb2c95d5dc6dd287f1c3cef06d211bee77cc4d4125d35381f" Apr 20 12:16:42.568406 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.568348 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eefc7ec18ef90edb2c95d5dc6dd287f1c3cef06d211bee77cc4d4125d35381f"} err="failed to get container status \"8eefc7ec18ef90edb2c95d5dc6dd287f1c3cef06d211bee77cc4d4125d35381f\": rpc error: code = NotFound desc = could not find container \"8eefc7ec18ef90edb2c95d5dc6dd287f1c3cef06d211bee77cc4d4125d35381f\": container with ID starting with 8eefc7ec18ef90edb2c95d5dc6dd287f1c3cef06d211bee77cc4d4125d35381f not found: ID does not exist" Apr 20 12:16:42.568406 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.568362 2580 scope.go:117] "RemoveContainer" containerID="6d0ed0cbb45b05ba06f3e3944076fb6d4307b1c3bdcce73723fe4de2665c7965" Apr 20 12:16:42.568612 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:16:42.568597 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d0ed0cbb45b05ba06f3e3944076fb6d4307b1c3bdcce73723fe4de2665c7965\": container with ID starting with 6d0ed0cbb45b05ba06f3e3944076fb6d4307b1c3bdcce73723fe4de2665c7965 not found: ID does not exist" containerID="6d0ed0cbb45b05ba06f3e3944076fb6d4307b1c3bdcce73723fe4de2665c7965" Apr 20 12:16:42.568654 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.568618 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d0ed0cbb45b05ba06f3e3944076fb6d4307b1c3bdcce73723fe4de2665c7965"} err="failed to get container status \"6d0ed0cbb45b05ba06f3e3944076fb6d4307b1c3bdcce73723fe4de2665c7965\": rpc error: code = NotFound desc = could not find container \"6d0ed0cbb45b05ba06f3e3944076fb6d4307b1c3bdcce73723fe4de2665c7965\": container with ID starting with 6d0ed0cbb45b05ba06f3e3944076fb6d4307b1c3bdcce73723fe4de2665c7965 not found: ID does not exist" Apr 20 12:16:42.568654 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.568632 2580 scope.go:117] "RemoveContainer" containerID="e78046dae12800c6311a4ae131bd9c3a4c907d9eb90758865ffcfc33fa6b8623" Apr 20 12:16:42.568859 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:16:42.568838 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e78046dae12800c6311a4ae131bd9c3a4c907d9eb90758865ffcfc33fa6b8623\": container with ID starting with e78046dae12800c6311a4ae131bd9c3a4c907d9eb90758865ffcfc33fa6b8623 not found: ID does not exist" containerID="e78046dae12800c6311a4ae131bd9c3a4c907d9eb90758865ffcfc33fa6b8623" Apr 20 12:16:42.568911 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.568862 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e78046dae12800c6311a4ae131bd9c3a4c907d9eb90758865ffcfc33fa6b8623"} err="failed to get container status \"e78046dae12800c6311a4ae131bd9c3a4c907d9eb90758865ffcfc33fa6b8623\": rpc error: code = NotFound desc = could not find container \"e78046dae12800c6311a4ae131bd9c3a4c907d9eb90758865ffcfc33fa6b8623\": container with ID starting with e78046dae12800c6311a4ae131bd9c3a4c907d9eb90758865ffcfc33fa6b8623 not found: ID does not exist" Apr 
20 12:16:42.568911 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.568877 2580 scope.go:117] "RemoveContainer" containerID="d2c198ec574f4f5cb3cd63f888de41b5c4ce71d59e2d8bbb711e318ace206946" Apr 20 12:16:42.569061 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:16:42.569044 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2c198ec574f4f5cb3cd63f888de41b5c4ce71d59e2d8bbb711e318ace206946\": container with ID starting with d2c198ec574f4f5cb3cd63f888de41b5c4ce71d59e2d8bbb711e318ace206946 not found: ID does not exist" containerID="d2c198ec574f4f5cb3cd63f888de41b5c4ce71d59e2d8bbb711e318ace206946" Apr 20 12:16:42.569101 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.569067 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2c198ec574f4f5cb3cd63f888de41b5c4ce71d59e2d8bbb711e318ace206946"} err="failed to get container status \"d2c198ec574f4f5cb3cd63f888de41b5c4ce71d59e2d8bbb711e318ace206946\": rpc error: code = NotFound desc = could not find container \"d2c198ec574f4f5cb3cd63f888de41b5c4ce71d59e2d8bbb711e318ace206946\": container with ID starting with d2c198ec574f4f5cb3cd63f888de41b5c4ce71d59e2d8bbb711e318ace206946 not found: ID does not exist" Apr 20 12:16:42.569101 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.569083 2580 scope.go:117] "RemoveContainer" containerID="df0b8e8b535eae8194ac8abc77bec7b24dd90efe1bc7b828ce8559be09561e9d" Apr 20 12:16:42.569272 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:16:42.569253 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df0b8e8b535eae8194ac8abc77bec7b24dd90efe1bc7b828ce8559be09561e9d\": container with ID starting with df0b8e8b535eae8194ac8abc77bec7b24dd90efe1bc7b828ce8559be09561e9d not found: ID does not exist" containerID="df0b8e8b535eae8194ac8abc77bec7b24dd90efe1bc7b828ce8559be09561e9d" Apr 20 12:16:42.569312 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.569278 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df0b8e8b535eae8194ac8abc77bec7b24dd90efe1bc7b828ce8559be09561e9d"} err="failed to get container status \"df0b8e8b535eae8194ac8abc77bec7b24dd90efe1bc7b828ce8559be09561e9d\": rpc error: code = NotFound desc = could not find container \"df0b8e8b535eae8194ac8abc77bec7b24dd90efe1bc7b828ce8559be09561e9d\": container with ID starting with df0b8e8b535eae8194ac8abc77bec7b24dd90efe1bc7b828ce8559be09561e9d not found: ID does not exist" Apr 20 12:16:42.625925 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.625885 2580 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22d63887-ce5b-4e34-b1a0-9126e4462f30-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 12:16:42.625925 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.625919 2580 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/22d63887-ce5b-4e34-b1a0-9126e4462f30-config-out\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 12:16:42.625925 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.625930 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5vg68\" (UniqueName: \"kubernetes.io/projected/22d63887-ce5b-4e34-b1a0-9126e4462f30-kube-api-access-5vg68\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 
12:16:42.626122 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.625939 2580 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-config-volume\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 12:16:42.626122 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.625948 2580 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/22d63887-ce5b-4e34-b1a0-9126e4462f30-tls-assets\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 12:16:42.626122 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.625961 2580 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 12:16:42.626122 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.625975 2580 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 12:16:42.626122 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.625984 2580 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-web-config\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 12:16:42.626122 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.625994 2580 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-secret-alertmanager-main-tls\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 12:16:42.626122 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.626003 2580 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-cluster-tls-config\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 12:16:42.626122 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.626012 2580 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/22d63887-ce5b-4e34-b1a0-9126e4462f30-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 12:16:42.626122 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.626021 2580 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/22d63887-ce5b-4e34-b1a0-9126e4462f30-alertmanager-main-db\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 12:16:42.826257 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.826222 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 12:16:42.829662 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.829631 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 12:16:42.846544 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.846515 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22d63887-ce5b-4e34-b1a0-9126e4462f30" 
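[Annotation] "Cleaned up orphaned pod volumes dir" refers to /var/lib/kubelet/pods/<podUID>/volumes: once every volume has reached "Volume detached" and the pod has hit "SyncLoop REMOVE", the per-pod directory can be deleted. A read-only Go sketch that lists pod directories no longer matching an active pod UID; the UID set here is a stand-in (the kubelet gets it from its pod manager), and this only reports, it never deletes:

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// Stand-in for the kubelet's view of currently active pod UIDs.
	active := map[string]bool{
		"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1": true, // the replacement alertmanager-main-0
	}
	root := "/var/lib/kubelet/pods"
	entries, err := os.ReadDir(root)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		return
	}
	for _, e := range entries {
		if e.IsDir() && !active[e.Name()] {
			// The kubelet only removes this after the volumes subdirectory
			// is confirmed empty; here we merely report the candidate.
			fmt.Println("orphan candidate:", filepath.Join(root, e.Name(), "volumes"))
		}
	}
}
```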
path="/var/lib/kubelet/pods/22d63887-ce5b-4e34-b1a0-9126e4462f30/volumes" Apr 20 12:16:42.853030 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.853002 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 12:16:42.853319 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.853303 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="22d63887-ce5b-4e34-b1a0-9126e4462f30" containerName="kube-rbac-proxy-metric" Apr 20 12:16:42.853412 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.853320 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="22d63887-ce5b-4e34-b1a0-9126e4462f30" containerName="kube-rbac-proxy-metric" Apr 20 12:16:42.853412 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.853336 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="22d63887-ce5b-4e34-b1a0-9126e4462f30" containerName="alertmanager" Apr 20 12:16:42.853412 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.853342 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="22d63887-ce5b-4e34-b1a0-9126e4462f30" containerName="alertmanager" Apr 20 12:16:42.853412 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.853351 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="22d63887-ce5b-4e34-b1a0-9126e4462f30" containerName="prom-label-proxy" Apr 20 12:16:42.853412 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.853357 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="22d63887-ce5b-4e34-b1a0-9126e4462f30" containerName="prom-label-proxy" Apr 20 12:16:42.853412 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.853365 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="22d63887-ce5b-4e34-b1a0-9126e4462f30" containerName="init-config-reloader" Apr 20 12:16:42.853412 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.853370 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="22d63887-ce5b-4e34-b1a0-9126e4462f30" containerName="init-config-reloader" Apr 20 12:16:42.853412 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.853376 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="22d63887-ce5b-4e34-b1a0-9126e4462f30" containerName="kube-rbac-proxy-web" Apr 20 12:16:42.853412 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.853383 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="22d63887-ce5b-4e34-b1a0-9126e4462f30" containerName="kube-rbac-proxy-web" Apr 20 12:16:42.853412 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.853410 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="69db919e-8d10-40f9-b4d7-b37130d3ab45" containerName="console" Apr 20 12:16:42.853412 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.853416 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="69db919e-8d10-40f9-b4d7-b37130d3ab45" containerName="console" Apr 20 12:16:42.853913 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.853422 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="22d63887-ce5b-4e34-b1a0-9126e4462f30" containerName="kube-rbac-proxy" Apr 20 12:16:42.853913 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.853428 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="22d63887-ce5b-4e34-b1a0-9126e4462f30" containerName="kube-rbac-proxy" Apr 20 12:16:42.853913 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.853442 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="22d63887-ce5b-4e34-b1a0-9126e4462f30" containerName="config-reloader" Apr 20 12:16:42.853913 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.853447 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="22d63887-ce5b-4e34-b1a0-9126e4462f30" containerName="config-reloader" Apr 20 12:16:42.853913 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.853495 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="22d63887-ce5b-4e34-b1a0-9126e4462f30" containerName="alertmanager" Apr 20 12:16:42.853913 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.853503 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="22d63887-ce5b-4e34-b1a0-9126e4462f30" containerName="config-reloader" Apr 20 12:16:42.853913 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.853509 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="22d63887-ce5b-4e34-b1a0-9126e4462f30" containerName="kube-rbac-proxy-metric" Apr 20 12:16:42.853913 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.853516 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="22d63887-ce5b-4e34-b1a0-9126e4462f30" containerName="kube-rbac-proxy-web" Apr 20 12:16:42.853913 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.853523 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="69db919e-8d10-40f9-b4d7-b37130d3ab45" containerName="console" Apr 20 12:16:42.853913 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.853530 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="22d63887-ce5b-4e34-b1a0-9126e4462f30" containerName="kube-rbac-proxy" Apr 20 12:16:42.853913 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.853537 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="22d63887-ce5b-4e34-b1a0-9126e4462f30" containerName="prom-label-proxy" Apr 20 12:16:42.857304 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.857284 2580 util.go:30] "No sandbox for pod can be found. 
Apr 20 12:16:42.859761 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.859733 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 20 12:16:42.859876 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.859743 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-mhsr8\""
Apr 20 12:16:42.859876 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.859819 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 20 12:16:42.859876 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.859870 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 20 12:16:42.860030 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.859970 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 20 12:16:42.860030 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.859970 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 20 12:16:42.860030 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.860021 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 20 12:16:42.860179 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.860068 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 20 12:16:42.860179 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.860110 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 20 12:16:42.864995 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.864974 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 20 12:16:42.869517 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:42.869492 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 12:16:43.028728 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.028689 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/a49f1ae4-b930-4e59-87b0-edf8e27bcbd1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 12:16:43.028728 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.028729 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a49f1ae4-b930-4e59-87b0-edf8e27bcbd1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 12:16:43.028945 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.028761 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a49f1ae4-b930-4e59-87b0-edf8e27bcbd1-config-out\") pod \"alertmanager-main-0\" (UID: \"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 12:16:43.028945 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.028788 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a49f1ae4-b930-4e59-87b0-edf8e27bcbd1-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 12:16:43.028945 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.028805 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t49t2\" (UniqueName: \"kubernetes.io/projected/a49f1ae4-b930-4e59-87b0-edf8e27bcbd1-kube-api-access-t49t2\") pod \"alertmanager-main-0\" (UID: \"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 12:16:43.028945 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.028827 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a49f1ae4-b930-4e59-87b0-edf8e27bcbd1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 12:16:43.028945 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.028886 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/a49f1ae4-b930-4e59-87b0-edf8e27bcbd1-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 12:16:43.028945 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.028933 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/a49f1ae4-b930-4e59-87b0-edf8e27bcbd1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 12:16:43.029138 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.028959 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a49f1ae4-b930-4e59-87b0-edf8e27bcbd1-config-volume\") pod \"alertmanager-main-0\" (UID: \"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 12:16:43.029138 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.028987 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a49f1ae4-b930-4e59-87b0-edf8e27bcbd1-web-config\") pod \"alertmanager-main-0\" (UID: \"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 12:16:43.029138 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.029043 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/a49f1ae4-b930-4e59-87b0-edf8e27bcbd1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID:
\"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:16:43.029138 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.029076 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a49f1ae4-b930-4e59-87b0-edf8e27bcbd1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:16:43.029138 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.029099 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a49f1ae4-b930-4e59-87b0-edf8e27bcbd1-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:16:43.130186 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.130103 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/a49f1ae4-b930-4e59-87b0-edf8e27bcbd1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:16:43.130186 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.130143 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a49f1ae4-b930-4e59-87b0-edf8e27bcbd1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:16:43.130383 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.130215 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a49f1ae4-b930-4e59-87b0-edf8e27bcbd1-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:16:43.130383 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.130267 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/a49f1ae4-b930-4e59-87b0-edf8e27bcbd1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:16:43.130383 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.130283 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a49f1ae4-b930-4e59-87b0-edf8e27bcbd1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:16:43.130383 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.130308 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a49f1ae4-b930-4e59-87b0-edf8e27bcbd1-config-out\") pod \"alertmanager-main-0\" (UID: \"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:16:43.130383 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.130337 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a49f1ae4-b930-4e59-87b0-edf8e27bcbd1-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:16:43.130383 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.130360 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t49t2\" (UniqueName: \"kubernetes.io/projected/a49f1ae4-b930-4e59-87b0-edf8e27bcbd1-kube-api-access-t49t2\") pod \"alertmanager-main-0\" (UID: \"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:16:43.130694 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.130420 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a49f1ae4-b930-4e59-87b0-edf8e27bcbd1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:16:43.130694 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.130453 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/a49f1ae4-b930-4e59-87b0-edf8e27bcbd1-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:16:43.130694 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.130489 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/a49f1ae4-b930-4e59-87b0-edf8e27bcbd1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:16:43.130694 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.130522 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a49f1ae4-b930-4e59-87b0-edf8e27bcbd1-config-volume\") pod \"alertmanager-main-0\" (UID: \"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:16:43.130694 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.130565 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a49f1ae4-b930-4e59-87b0-edf8e27bcbd1-web-config\") pod \"alertmanager-main-0\" (UID: \"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:16:43.130959 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.130689 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/a49f1ae4-b930-4e59-87b0-edf8e27bcbd1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:16:43.131568 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.131526 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/a49f1ae4-b930-4e59-87b0-edf8e27bcbd1-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:16:43.132112 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.132087 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a49f1ae4-b930-4e59-87b0-edf8e27bcbd1-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:16:43.133179 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.133152 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a49f1ae4-b930-4e59-87b0-edf8e27bcbd1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:16:43.133299 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.133218 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a49f1ae4-b930-4e59-87b0-edf8e27bcbd1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:16:43.133299 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.133221 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/a49f1ae4-b930-4e59-87b0-edf8e27bcbd1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:16:43.133430 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.133347 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a49f1ae4-b930-4e59-87b0-edf8e27bcbd1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:16:43.133773 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.133752 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/a49f1ae4-b930-4e59-87b0-edf8e27bcbd1-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:16:43.134565 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.134540 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/a49f1ae4-b930-4e59-87b0-edf8e27bcbd1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:16:43.134667 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.134648 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a49f1ae4-b930-4e59-87b0-edf8e27bcbd1-config-out\") pod \"alertmanager-main-0\" (UID: \"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 12:16:43.134720 ip-10-0-137-91 
Apr 20 12:16:43.168525 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.168487 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 12:16:43.297093 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.297029 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 12:16:43.303537 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:16:43.303507 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda49f1ae4_b930_4e59_87b0_edf8e27bcbd1.slice/crio-13573f7e4de484d729a78da604d0e74670885ba23b68ebdfaf6004e97082be28 WatchSource:0}: Error finding container 13573f7e4de484d729a78da604d0e74670885ba23b68ebdfaf6004e97082be28: Status 404 returned error can't find the container with id 13573f7e4de484d729a78da604d0e74670885ba23b68ebdfaf6004e97082be28
Apr 20 12:16:43.507328 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.507293 2580 generic.go:358] "Generic (PLEG): container finished" podID="a49f1ae4-b930-4e59-87b0-edf8e27bcbd1" containerID="8d210403985c858ebc8bfdb5a7e034348e639838c5a4f19c5237663ea9c32670" exitCode=0
Apr 20 12:16:43.507747 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.507343 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1","Type":"ContainerDied","Data":"8d210403985c858ebc8bfdb5a7e034348e639838c5a4f19c5237663ea9c32670"}
Apr 20 12:16:43.507747 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:43.507379 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1","Type":"ContainerStarted","Data":"13573f7e4de484d729a78da604d0e74670885ba23b68ebdfaf6004e97082be28"}
Apr 20 12:16:44.513333 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:44.513236 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1","Type":"ContainerStarted","Data":"c98c83eb98480c6b3c374100b7d84268b203b68522fb9ea178d93d19449a57e8"}
Apr 20 12:16:44.513333 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:44.513276 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1","Type":"ContainerStarted","Data":"a9bcaf283e59261baa5a8a6e956eef92dcf626f74ffb5384b4fe2950c2ea864f"}
event={"ID":"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1","Type":"ContainerStarted","Data":"a9bcaf283e59261baa5a8a6e956eef92dcf626f74ffb5384b4fe2950c2ea864f"} Apr 20 12:16:44.513333 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:44.513289 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1","Type":"ContainerStarted","Data":"f4f21da31cd27b913909bce2705b9e5d7802590181f8635b78022578f7c13713"} Apr 20 12:16:44.513333 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:44.513303 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1","Type":"ContainerStarted","Data":"13b26450484ef2133fe7490b3d3c05cf630673d74bb2d597337ab30871461bd9"} Apr 20 12:16:44.513333 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:44.513314 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1","Type":"ContainerStarted","Data":"e32b2cb5215b0daaf1ebe6e582b62308488ee3c50eb8d17c5c412fb3b61f92ff"} Apr 20 12:16:44.513333 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:44.513325 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a49f1ae4-b930-4e59-87b0-edf8e27bcbd1","Type":"ContainerStarted","Data":"cdb8252cf9f7e3ae95349e8aad930cb218fd18008c74414ab24c629bdb13dbe1"} Apr 20 12:16:44.541871 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:44.541800 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.5417813479999998 podStartE2EDuration="2.541781348s" podCreationTimestamp="2026-04-20 12:16:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 12:16:44.540385621 +0000 UTC m=+138.260399613" watchObservedRunningTime="2026-04-20 12:16:44.541781348 +0000 UTC m=+138.261795405" Apr 20 12:16:45.154687 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.154644 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-79cc798455-tph24"] Apr 20 12:16:45.158140 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.158111 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-79cc798455-tph24" Apr 20 12:16:45.160686 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.160643 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 20 12:16:45.160686 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.160667 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-qwfkx\"" Apr 20 12:16:45.160924 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.160732 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 20 12:16:45.160924 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.160648 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 20 12:16:45.160924 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.160775 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 20 12:16:45.160924 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.160787 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 20 12:16:45.171617 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.170196 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 20 12:16:45.171617 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.171476 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-79cc798455-tph24"] Apr 20 12:16:45.248921 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.248878 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22e5ac9a-8222-4837-ae48-0b9ca98383b1-serving-certs-ca-bundle\") pod \"telemeter-client-79cc798455-tph24\" (UID: \"22e5ac9a-8222-4837-ae48-0b9ca98383b1\") " pod="openshift-monitoring/telemeter-client-79cc798455-tph24" Apr 20 12:16:45.248921 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.248922 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/22e5ac9a-8222-4837-ae48-0b9ca98383b1-secret-telemeter-client\") pod \"telemeter-client-79cc798455-tph24\" (UID: \"22e5ac9a-8222-4837-ae48-0b9ca98383b1\") " pod="openshift-monitoring/telemeter-client-79cc798455-tph24" Apr 20 12:16:45.249195 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.248943 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgw6n\" (UniqueName: \"kubernetes.io/projected/22e5ac9a-8222-4837-ae48-0b9ca98383b1-kube-api-access-xgw6n\") pod \"telemeter-client-79cc798455-tph24\" (UID: \"22e5ac9a-8222-4837-ae48-0b9ca98383b1\") " pod="openshift-monitoring/telemeter-client-79cc798455-tph24" Apr 20 12:16:45.249195 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.249080 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/22e5ac9a-8222-4837-ae48-0b9ca98383b1-telemeter-client-tls\") pod 
\"telemeter-client-79cc798455-tph24\" (UID: \"22e5ac9a-8222-4837-ae48-0b9ca98383b1\") " pod="openshift-monitoring/telemeter-client-79cc798455-tph24" Apr 20 12:16:45.249195 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.249142 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/22e5ac9a-8222-4837-ae48-0b9ca98383b1-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-79cc798455-tph24\" (UID: \"22e5ac9a-8222-4837-ae48-0b9ca98383b1\") " pod="openshift-monitoring/telemeter-client-79cc798455-tph24" Apr 20 12:16:45.249195 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.249176 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22e5ac9a-8222-4837-ae48-0b9ca98383b1-telemeter-trusted-ca-bundle\") pod \"telemeter-client-79cc798455-tph24\" (UID: \"22e5ac9a-8222-4837-ae48-0b9ca98383b1\") " pod="openshift-monitoring/telemeter-client-79cc798455-tph24" Apr 20 12:16:45.249347 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.249255 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/22e5ac9a-8222-4837-ae48-0b9ca98383b1-federate-client-tls\") pod \"telemeter-client-79cc798455-tph24\" (UID: \"22e5ac9a-8222-4837-ae48-0b9ca98383b1\") " pod="openshift-monitoring/telemeter-client-79cc798455-tph24" Apr 20 12:16:45.249347 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.249297 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/22e5ac9a-8222-4837-ae48-0b9ca98383b1-metrics-client-ca\") pod \"telemeter-client-79cc798455-tph24\" (UID: \"22e5ac9a-8222-4837-ae48-0b9ca98383b1\") " pod="openshift-monitoring/telemeter-client-79cc798455-tph24" Apr 20 12:16:45.350210 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.350165 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/22e5ac9a-8222-4837-ae48-0b9ca98383b1-metrics-client-ca\") pod \"telemeter-client-79cc798455-tph24\" (UID: \"22e5ac9a-8222-4837-ae48-0b9ca98383b1\") " pod="openshift-monitoring/telemeter-client-79cc798455-tph24" Apr 20 12:16:45.350366 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.350228 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22e5ac9a-8222-4837-ae48-0b9ca98383b1-serving-certs-ca-bundle\") pod \"telemeter-client-79cc798455-tph24\" (UID: \"22e5ac9a-8222-4837-ae48-0b9ca98383b1\") " pod="openshift-monitoring/telemeter-client-79cc798455-tph24" Apr 20 12:16:45.350366 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.350244 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/22e5ac9a-8222-4837-ae48-0b9ca98383b1-secret-telemeter-client\") pod \"telemeter-client-79cc798455-tph24\" (UID: \"22e5ac9a-8222-4837-ae48-0b9ca98383b1\") " pod="openshift-monitoring/telemeter-client-79cc798455-tph24" Apr 20 12:16:45.350366 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.350263 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xgw6n\" 
(UniqueName: \"kubernetes.io/projected/22e5ac9a-8222-4837-ae48-0b9ca98383b1-kube-api-access-xgw6n\") pod \"telemeter-client-79cc798455-tph24\" (UID: \"22e5ac9a-8222-4837-ae48-0b9ca98383b1\") " pod="openshift-monitoring/telemeter-client-79cc798455-tph24" Apr 20 12:16:45.350366 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.350297 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/22e5ac9a-8222-4837-ae48-0b9ca98383b1-telemeter-client-tls\") pod \"telemeter-client-79cc798455-tph24\" (UID: \"22e5ac9a-8222-4837-ae48-0b9ca98383b1\") " pod="openshift-monitoring/telemeter-client-79cc798455-tph24" Apr 20 12:16:45.350366 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.350318 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/22e5ac9a-8222-4837-ae48-0b9ca98383b1-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-79cc798455-tph24\" (UID: \"22e5ac9a-8222-4837-ae48-0b9ca98383b1\") " pod="openshift-monitoring/telemeter-client-79cc798455-tph24" Apr 20 12:16:45.350662 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.350468 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22e5ac9a-8222-4837-ae48-0b9ca98383b1-telemeter-trusted-ca-bundle\") pod \"telemeter-client-79cc798455-tph24\" (UID: \"22e5ac9a-8222-4837-ae48-0b9ca98383b1\") " pod="openshift-monitoring/telemeter-client-79cc798455-tph24" Apr 20 12:16:45.350764 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.350740 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/22e5ac9a-8222-4837-ae48-0b9ca98383b1-federate-client-tls\") pod \"telemeter-client-79cc798455-tph24\" (UID: \"22e5ac9a-8222-4837-ae48-0b9ca98383b1\") " pod="openshift-monitoring/telemeter-client-79cc798455-tph24" Apr 20 12:16:45.351165 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.351085 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22e5ac9a-8222-4837-ae48-0b9ca98383b1-serving-certs-ca-bundle\") pod \"telemeter-client-79cc798455-tph24\" (UID: \"22e5ac9a-8222-4837-ae48-0b9ca98383b1\") " pod="openshift-monitoring/telemeter-client-79cc798455-tph24" Apr 20 12:16:45.351263 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.351218 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/22e5ac9a-8222-4837-ae48-0b9ca98383b1-metrics-client-ca\") pod \"telemeter-client-79cc798455-tph24\" (UID: \"22e5ac9a-8222-4837-ae48-0b9ca98383b1\") " pod="openshift-monitoring/telemeter-client-79cc798455-tph24" Apr 20 12:16:45.351424 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.351382 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22e5ac9a-8222-4837-ae48-0b9ca98383b1-telemeter-trusted-ca-bundle\") pod \"telemeter-client-79cc798455-tph24\" (UID: \"22e5ac9a-8222-4837-ae48-0b9ca98383b1\") " pod="openshift-monitoring/telemeter-client-79cc798455-tph24" Apr 20 12:16:45.352918 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.352872 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/22e5ac9a-8222-4837-ae48-0b9ca98383b1-secret-telemeter-client\") pod \"telemeter-client-79cc798455-tph24\" (UID: \"22e5ac9a-8222-4837-ae48-0b9ca98383b1\") " pod="openshift-monitoring/telemeter-client-79cc798455-tph24" Apr 20 12:16:45.353078 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.353060 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/22e5ac9a-8222-4837-ae48-0b9ca98383b1-federate-client-tls\") pod \"telemeter-client-79cc798455-tph24\" (UID: \"22e5ac9a-8222-4837-ae48-0b9ca98383b1\") " pod="openshift-monitoring/telemeter-client-79cc798455-tph24" Apr 20 12:16:45.353364 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.353340 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/22e5ac9a-8222-4837-ae48-0b9ca98383b1-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-79cc798455-tph24\" (UID: \"22e5ac9a-8222-4837-ae48-0b9ca98383b1\") " pod="openshift-monitoring/telemeter-client-79cc798455-tph24" Apr 20 12:16:45.353441 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.353358 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/22e5ac9a-8222-4837-ae48-0b9ca98383b1-telemeter-client-tls\") pod \"telemeter-client-79cc798455-tph24\" (UID: \"22e5ac9a-8222-4837-ae48-0b9ca98383b1\") " pod="openshift-monitoring/telemeter-client-79cc798455-tph24" Apr 20 12:16:45.358851 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.358826 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgw6n\" (UniqueName: \"kubernetes.io/projected/22e5ac9a-8222-4837-ae48-0b9ca98383b1-kube-api-access-xgw6n\") pod \"telemeter-client-79cc798455-tph24\" (UID: \"22e5ac9a-8222-4837-ae48-0b9ca98383b1\") " pod="openshift-monitoring/telemeter-client-79cc798455-tph24" Apr 20 12:16:45.432230 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.432134 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 12:16:45.432740 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.432593 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="4250d722-a976-44b5-8936-846f53dc7cca" containerName="prometheus" containerID="cri-o://91a869e4727c73167d92cd5efce7982a8bf1c7f9384ae9f34143ff7c6dd264f1" gracePeriod=600 Apr 20 12:16:45.432740 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.432637 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="4250d722-a976-44b5-8936-846f53dc7cca" containerName="kube-rbac-proxy" containerID="cri-o://fa280989b6275627b05908c174872f1e9009812bfd1e598f8ca7fee6498e5216" gracePeriod=600 Apr 20 12:16:45.432740 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.432668 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="4250d722-a976-44b5-8936-846f53dc7cca" containerName="thanos-sidecar" containerID="cri-o://16ca6f078d59528a6292161eb2a4883d5d2e63dfd6681b639f009150b68ba684" gracePeriod=600 Apr 20 12:16:45.432740 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.432651 2580 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-monitoring/prometheus-k8s-0" podUID="4250d722-a976-44b5-8936-846f53dc7cca" containerName="kube-rbac-proxy-web" containerID="cri-o://c0363d3e6fd2690cde36d8af948483224cb372cc4dd5e636bb6850c808f31138" gracePeriod=600 Apr 20 12:16:45.433023 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.432668 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="4250d722-a976-44b5-8936-846f53dc7cca" containerName="config-reloader" containerID="cri-o://fcd9e97e1b219572a49b9fdb8b3c160aff2c81c5a4269c522f4d6bdf11c30816" gracePeriod=600 Apr 20 12:16:45.433023 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.432890 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="4250d722-a976-44b5-8936-846f53dc7cca" containerName="kube-rbac-proxy-thanos" containerID="cri-o://d0f8ca4501c8467c6884b3be53f26cab5d7579b43507e2ea45f2122f0eeddd1a" gracePeriod=600 Apr 20 12:16:45.471019 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.470990 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-79cc798455-tph24" Apr 20 12:16:45.603371 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.603331 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-79cc798455-tph24"] Apr 20 12:16:45.606492 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:16:45.606460 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22e5ac9a_8222_4837_ae48_0b9ca98383b1.slice/crio-f95eca0a27d3a909cbe733c2ed68bbf2ef0498df9810897059ec2ba9ccbc7ba8 WatchSource:0}: Error finding container f95eca0a27d3a909cbe733c2ed68bbf2ef0498df9810897059ec2ba9ccbc7ba8: Status 404 returned error can't find the container with id f95eca0a27d3a909cbe733c2ed68bbf2ef0498df9810897059ec2ba9ccbc7ba8 Apr 20 12:16:45.689775 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.689748 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:45.855425 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.855364 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4250d722-a976-44b5-8936-846f53dc7cca-configmap-metrics-client-ca\") pod \"4250d722-a976-44b5-8936-846f53dc7cca\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " Apr 20 12:16:45.855425 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.855431 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-config\") pod \"4250d722-a976-44b5-8936-846f53dc7cca\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " Apr 20 12:16:45.855669 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.855469 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-secret-metrics-client-certs\") pod \"4250d722-a976-44b5-8936-846f53dc7cca\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " Apr 20 12:16:45.855669 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.855502 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-thanos-prometheus-http-client-file\") pod \"4250d722-a976-44b5-8936-846f53dc7cca\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " Apr 20 12:16:45.855669 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.855527 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-secret-prometheus-k8s-tls\") pod \"4250d722-a976-44b5-8936-846f53dc7cca\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " Apr 20 12:16:45.855669 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.855578 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"4250d722-a976-44b5-8936-846f53dc7cca\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " Apr 20 12:16:45.855669 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.855608 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4250d722-a976-44b5-8936-846f53dc7cca-configmap-serving-certs-ca-bundle\") pod \"4250d722-a976-44b5-8936-846f53dc7cca\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " Apr 20 12:16:45.855669 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.855634 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-secret-grpc-tls\") pod \"4250d722-a976-44b5-8936-846f53dc7cca\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " Apr 20 12:16:45.855669 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.855663 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4250d722-a976-44b5-8936-846f53dc7cca-prometheus-k8s-rulefiles-0\") pod 
\"4250d722-a976-44b5-8936-846f53dc7cca\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " Apr 20 12:16:45.856020 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.855695 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/4250d722-a976-44b5-8936-846f53dc7cca-prometheus-k8s-db\") pod \"4250d722-a976-44b5-8936-846f53dc7cca\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " Apr 20 12:16:45.856020 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.855722 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4250d722-a976-44b5-8936-846f53dc7cca-configmap-kubelet-serving-ca-bundle\") pod \"4250d722-a976-44b5-8936-846f53dc7cca\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " Apr 20 12:16:45.856020 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.855746 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-web-config\") pod \"4250d722-a976-44b5-8936-846f53dc7cca\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " Apr 20 12:16:45.856020 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.855807 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4250d722-a976-44b5-8936-846f53dc7cca-config-out\") pod \"4250d722-a976-44b5-8936-846f53dc7cca\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " Apr 20 12:16:45.856020 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.855838 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-secret-kube-rbac-proxy\") pod \"4250d722-a976-44b5-8936-846f53dc7cca\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " Apr 20 12:16:45.856020 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.855862 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4250d722-a976-44b5-8936-846f53dc7cca-tls-assets\") pod \"4250d722-a976-44b5-8936-846f53dc7cca\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " Apr 20 12:16:45.856020 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.855862 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4250d722-a976-44b5-8936-846f53dc7cca-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "4250d722-a976-44b5-8936-846f53dc7cca" (UID: "4250d722-a976-44b5-8936-846f53dc7cca"). InnerVolumeSpecName "configmap-metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 12:16:45.856020 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.855888 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4250d722-a976-44b5-8936-846f53dc7cca-prometheus-trusted-ca-bundle\") pod \"4250d722-a976-44b5-8936-846f53dc7cca\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " Apr 20 12:16:45.856020 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.855934 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"4250d722-a976-44b5-8936-846f53dc7cca\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " Apr 20 12:16:45.856020 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.855970 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cl5x\" (UniqueName: \"kubernetes.io/projected/4250d722-a976-44b5-8936-846f53dc7cca-kube-api-access-5cl5x\") pod \"4250d722-a976-44b5-8936-846f53dc7cca\" (UID: \"4250d722-a976-44b5-8936-846f53dc7cca\") " Apr 20 12:16:45.856511 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.856157 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4250d722-a976-44b5-8936-846f53dc7cca-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "4250d722-a976-44b5-8936-846f53dc7cca" (UID: "4250d722-a976-44b5-8936-846f53dc7cca"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 12:16:45.856511 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.856258 2580 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4250d722-a976-44b5-8936-846f53dc7cca-configmap-metrics-client-ca\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 12:16:45.856511 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.856278 2580 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4250d722-a976-44b5-8936-846f53dc7cca-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 12:16:45.857118 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.856847 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4250d722-a976-44b5-8936-846f53dc7cca-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "4250d722-a976-44b5-8936-846f53dc7cca" (UID: "4250d722-a976-44b5-8936-846f53dc7cca"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 12:16:45.857698 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.857435 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4250d722-a976-44b5-8936-846f53dc7cca-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "4250d722-a976-44b5-8936-846f53dc7cca" (UID: "4250d722-a976-44b5-8936-846f53dc7cca"). InnerVolumeSpecName "prometheus-k8s-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 12:16:45.857698 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.857629 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4250d722-a976-44b5-8936-846f53dc7cca-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "4250d722-a976-44b5-8936-846f53dc7cca" (UID: "4250d722-a976-44b5-8936-846f53dc7cca"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 12:16:45.858322 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.858294 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "4250d722-a976-44b5-8936-846f53dc7cca" (UID: "4250d722-a976-44b5-8936-846f53dc7cca"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 12:16:45.858455 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.858422 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-config" (OuterVolumeSpecName: "config") pod "4250d722-a976-44b5-8936-846f53dc7cca" (UID: "4250d722-a976-44b5-8936-846f53dc7cca"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 12:16:45.858682 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.858563 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "4250d722-a976-44b5-8936-846f53dc7cca" (UID: "4250d722-a976-44b5-8936-846f53dc7cca"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 12:16:45.858682 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.858632 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4250d722-a976-44b5-8936-846f53dc7cca-kube-api-access-5cl5x" (OuterVolumeSpecName: "kube-api-access-5cl5x") pod "4250d722-a976-44b5-8936-846f53dc7cca" (UID: "4250d722-a976-44b5-8936-846f53dc7cca"). InnerVolumeSpecName "kube-api-access-5cl5x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 12:16:45.858682 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.858657 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4250d722-a976-44b5-8936-846f53dc7cca-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "4250d722-a976-44b5-8936-846f53dc7cca" (UID: "4250d722-a976-44b5-8936-846f53dc7cca"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 12:16:45.858682 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.858671 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "4250d722-a976-44b5-8936-846f53dc7cca" (UID: "4250d722-a976-44b5-8936-846f53dc7cca"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 12:16:45.859490 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.859451 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "4250d722-a976-44b5-8936-846f53dc7cca" (UID: "4250d722-a976-44b5-8936-846f53dc7cca"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 12:16:45.859919 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.859881 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "4250d722-a976-44b5-8936-846f53dc7cca" (UID: "4250d722-a976-44b5-8936-846f53dc7cca"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 12:16:45.860060 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.860038 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "4250d722-a976-44b5-8936-846f53dc7cca" (UID: "4250d722-a976-44b5-8936-846f53dc7cca"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 12:16:45.860617 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.860595 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4250d722-a976-44b5-8936-846f53dc7cca-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "4250d722-a976-44b5-8936-846f53dc7cca" (UID: "4250d722-a976-44b5-8936-846f53dc7cca"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 12:16:45.860853 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.860831 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "4250d722-a976-44b5-8936-846f53dc7cca" (UID: "4250d722-a976-44b5-8936-846f53dc7cca"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 12:16:45.861069 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.861055 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4250d722-a976-44b5-8936-846f53dc7cca-config-out" (OuterVolumeSpecName: "config-out") pod "4250d722-a976-44b5-8936-846f53dc7cca" (UID: "4250d722-a976-44b5-8936-846f53dc7cca"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 12:16:45.871078 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.871051 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-web-config" (OuterVolumeSpecName: "web-config") pod "4250d722-a976-44b5-8936-846f53dc7cca" (UID: "4250d722-a976-44b5-8936-846f53dc7cca"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 12:16:45.957320 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.957200 2580 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 12:16:45.957320 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.957247 2580 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4250d722-a976-44b5-8936-846f53dc7cca-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 12:16:45.957320 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.957263 2580 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-secret-grpc-tls\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 12:16:45.957320 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.957278 2580 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4250d722-a976-44b5-8936-846f53dc7cca-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 12:16:45.957320 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.957295 2580 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/4250d722-a976-44b5-8936-846f53dc7cca-prometheus-k8s-db\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 12:16:45.957320 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.957308 2580 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-web-config\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 12:16:45.957320 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.957324 2580 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4250d722-a976-44b5-8936-846f53dc7cca-config-out\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 12:16:45.957702 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.957337 2580 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-secret-kube-rbac-proxy\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 12:16:45.957702 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.957352 2580 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4250d722-a976-44b5-8936-846f53dc7cca-tls-assets\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 12:16:45.957702 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.957372 2580 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4250d722-a976-44b5-8936-846f53dc7cca-prometheus-trusted-ca-bundle\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 12:16:45.957702 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.957388 2580 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 12:16:45.957702 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.957431 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5cl5x\" (UniqueName: \"kubernetes.io/projected/4250d722-a976-44b5-8936-846f53dc7cca-kube-api-access-5cl5x\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 12:16:45.957702 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.957445 2580 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-config\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 12:16:45.957702 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.957469 2580 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-secret-metrics-client-certs\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 12:16:45.957702 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.957483 2580 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-thanos-prometheus-http-client-file\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 12:16:45.957702 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:45.957494 2580 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/4250d722-a976-44b5-8936-846f53dc7cca-secret-prometheus-k8s-tls\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 12:16:46.522637 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.522583 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-79cc798455-tph24" event={"ID":"22e5ac9a-8222-4837-ae48-0b9ca98383b1","Type":"ContainerStarted","Data":"f95eca0a27d3a909cbe733c2ed68bbf2ef0498df9810897059ec2ba9ccbc7ba8"} Apr 20 12:16:46.526018 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.525971 2580 generic.go:358] "Generic (PLEG): container finished" podID="4250d722-a976-44b5-8936-846f53dc7cca" containerID="d0f8ca4501c8467c6884b3be53f26cab5d7579b43507e2ea45f2122f0eeddd1a" exitCode=0 Apr 20 12:16:46.526018 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.526003 2580 generic.go:358] "Generic (PLEG): container finished" podID="4250d722-a976-44b5-8936-846f53dc7cca" containerID="fa280989b6275627b05908c174872f1e9009812bfd1e598f8ca7fee6498e5216" exitCode=0 Apr 20 12:16:46.526018 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.526014 2580 generic.go:358] "Generic (PLEG): container finished" podID="4250d722-a976-44b5-8936-846f53dc7cca" containerID="c0363d3e6fd2690cde36d8af948483224cb372cc4dd5e636bb6850c808f31138" exitCode=0 Apr 20 12:16:46.526018 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.526021 2580 generic.go:358] "Generic (PLEG): container finished" podID="4250d722-a976-44b5-8936-846f53dc7cca" containerID="16ca6f078d59528a6292161eb2a4883d5d2e63dfd6681b639f009150b68ba684" exitCode=0 Apr 20 12:16:46.526328 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.526029 2580 generic.go:358] "Generic (PLEG): container finished" podID="4250d722-a976-44b5-8936-846f53dc7cca" containerID="fcd9e97e1b219572a49b9fdb8b3c160aff2c81c5a4269c522f4d6bdf11c30816" exitCode=0 Apr 20 12:16:46.526328 ip-10-0-137-91 
kubenswrapper[2580]: I0420 12:16:46.526038 2580 generic.go:358] "Generic (PLEG): container finished" podID="4250d722-a976-44b5-8936-846f53dc7cca" containerID="91a869e4727c73167d92cd5efce7982a8bf1c7f9384ae9f34143ff7c6dd264f1" exitCode=0 Apr 20 12:16:46.526328 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.526055 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4250d722-a976-44b5-8936-846f53dc7cca","Type":"ContainerDied","Data":"d0f8ca4501c8467c6884b3be53f26cab5d7579b43507e2ea45f2122f0eeddd1a"} Apr 20 12:16:46.526328 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.526094 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4250d722-a976-44b5-8936-846f53dc7cca","Type":"ContainerDied","Data":"fa280989b6275627b05908c174872f1e9009812bfd1e598f8ca7fee6498e5216"} Apr 20 12:16:46.526328 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.526108 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4250d722-a976-44b5-8936-846f53dc7cca","Type":"ContainerDied","Data":"c0363d3e6fd2690cde36d8af948483224cb372cc4dd5e636bb6850c808f31138"} Apr 20 12:16:46.526328 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.526108 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.526328 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.526122 2580 scope.go:117] "RemoveContainer" containerID="d0f8ca4501c8467c6884b3be53f26cab5d7579b43507e2ea45f2122f0eeddd1a" Apr 20 12:16:46.526328 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.526122 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4250d722-a976-44b5-8936-846f53dc7cca","Type":"ContainerDied","Data":"16ca6f078d59528a6292161eb2a4883d5d2e63dfd6681b639f009150b68ba684"} Apr 20 12:16:46.526328 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.526217 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4250d722-a976-44b5-8936-846f53dc7cca","Type":"ContainerDied","Data":"fcd9e97e1b219572a49b9fdb8b3c160aff2c81c5a4269c522f4d6bdf11c30816"} Apr 20 12:16:46.526328 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.526234 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4250d722-a976-44b5-8936-846f53dc7cca","Type":"ContainerDied","Data":"91a869e4727c73167d92cd5efce7982a8bf1c7f9384ae9f34143ff7c6dd264f1"} Apr 20 12:16:46.526328 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.526250 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4250d722-a976-44b5-8936-846f53dc7cca","Type":"ContainerDied","Data":"8a198cf82de6a53971cc5faefc30e93b395a05cf24ded81aac0d0de55b10c35f"} Apr 20 12:16:46.536527 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.536502 2580 scope.go:117] "RemoveContainer" containerID="fa280989b6275627b05908c174872f1e9009812bfd1e598f8ca7fee6498e5216" Apr 20 12:16:46.544697 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.544674 2580 scope.go:117] "RemoveContainer" containerID="c0363d3e6fd2690cde36d8af948483224cb372cc4dd5e636bb6850c808f31138" Apr 20 12:16:46.555021 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.554930 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 12:16:46.555096 
ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.555028 2580 scope.go:117] "RemoveContainer" containerID="16ca6f078d59528a6292161eb2a4883d5d2e63dfd6681b639f009150b68ba684" Apr 20 12:16:46.557220 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.557184 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 12:16:46.563983 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.563953 2580 scope.go:117] "RemoveContainer" containerID="fcd9e97e1b219572a49b9fdb8b3c160aff2c81c5a4269c522f4d6bdf11c30816" Apr 20 12:16:46.572613 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.572584 2580 scope.go:117] "RemoveContainer" containerID="91a869e4727c73167d92cd5efce7982a8bf1c7f9384ae9f34143ff7c6dd264f1" Apr 20 12:16:46.581760 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.581608 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 12:16:46.581855 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.581766 2580 scope.go:117] "RemoveContainer" containerID="7d5bca424f828bdff54ef72bc300c8fc1ac58a7b0c04326f5bff978afb96c258" Apr 20 12:16:46.582468 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.582203 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4250d722-a976-44b5-8936-846f53dc7cca" containerName="kube-rbac-proxy-web" Apr 20 12:16:46.582468 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.582222 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="4250d722-a976-44b5-8936-846f53dc7cca" containerName="kube-rbac-proxy-web" Apr 20 12:16:46.582468 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.582234 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4250d722-a976-44b5-8936-846f53dc7cca" containerName="prometheus" Apr 20 12:16:46.582468 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.582240 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="4250d722-a976-44b5-8936-846f53dc7cca" containerName="prometheus" Apr 20 12:16:46.582468 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.582262 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4250d722-a976-44b5-8936-846f53dc7cca" containerName="config-reloader" Apr 20 12:16:46.582468 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.582271 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="4250d722-a976-44b5-8936-846f53dc7cca" containerName="config-reloader" Apr 20 12:16:46.582468 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.582282 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4250d722-a976-44b5-8936-846f53dc7cca" containerName="kube-rbac-proxy" Apr 20 12:16:46.582468 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.582288 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="4250d722-a976-44b5-8936-846f53dc7cca" containerName="kube-rbac-proxy" Apr 20 12:16:46.582468 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.582303 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4250d722-a976-44b5-8936-846f53dc7cca" containerName="init-config-reloader" Apr 20 12:16:46.582468 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.582311 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="4250d722-a976-44b5-8936-846f53dc7cca" containerName="init-config-reloader" Apr 20 12:16:46.582468 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.582326 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="4250d722-a976-44b5-8936-846f53dc7cca" containerName="thanos-sidecar" Apr 20 12:16:46.582468 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.582334 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="4250d722-a976-44b5-8936-846f53dc7cca" containerName="thanos-sidecar" Apr 20 12:16:46.582468 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.582348 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4250d722-a976-44b5-8936-846f53dc7cca" containerName="kube-rbac-proxy-thanos" Apr 20 12:16:46.582468 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.582357 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="4250d722-a976-44b5-8936-846f53dc7cca" containerName="kube-rbac-proxy-thanos" Apr 20 12:16:46.582468 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.582448 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="4250d722-a976-44b5-8936-846f53dc7cca" containerName="kube-rbac-proxy-web" Apr 20 12:16:46.582468 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.582460 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="4250d722-a976-44b5-8936-846f53dc7cca" containerName="kube-rbac-proxy" Apr 20 12:16:46.582468 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.582469 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="4250d722-a976-44b5-8936-846f53dc7cca" containerName="kube-rbac-proxy-thanos" Apr 20 12:16:46.582468 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.582480 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="4250d722-a976-44b5-8936-846f53dc7cca" containerName="prometheus" Apr 20 12:16:46.583446 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.582492 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="4250d722-a976-44b5-8936-846f53dc7cca" containerName="thanos-sidecar" Apr 20 12:16:46.583446 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.582503 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="4250d722-a976-44b5-8936-846f53dc7cca" containerName="config-reloader" Apr 20 12:16:46.587873 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.587847 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.590331 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.590306 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 20 12:16:46.590467 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.590306 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-wgjj5\"" Apr 20 12:16:46.590539 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.590477 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 20 12:16:46.590618 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.590599 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 20 12:16:46.590672 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.590624 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 20 12:16:46.590723 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.590685 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 20 12:16:46.591168 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.591146 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 20 12:16:46.591284 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.591208 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 20 12:16:46.591422 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.591374 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 20 12:16:46.591672 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.591646 2580 scope.go:117] "RemoveContainer" containerID="d0f8ca4501c8467c6884b3be53f26cab5d7579b43507e2ea45f2122f0eeddd1a" Apr 20 12:16:46.591761 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.591699 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 20 12:16:46.591761 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.591647 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-1e2ag3hh82dov\"" Apr 20 12:16:46.591964 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.591945 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 20 12:16:46.592215 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:16:46.592177 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0f8ca4501c8467c6884b3be53f26cab5d7579b43507e2ea45f2122f0eeddd1a\": container with ID starting with d0f8ca4501c8467c6884b3be53f26cab5d7579b43507e2ea45f2122f0eeddd1a not found: ID does not exist" containerID="d0f8ca4501c8467c6884b3be53f26cab5d7579b43507e2ea45f2122f0eeddd1a" Apr 20 12:16:46.592316 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.592217 2580 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d0f8ca4501c8467c6884b3be53f26cab5d7579b43507e2ea45f2122f0eeddd1a"} err="failed to get container status \"d0f8ca4501c8467c6884b3be53f26cab5d7579b43507e2ea45f2122f0eeddd1a\": rpc error: code = NotFound desc = could not find container \"d0f8ca4501c8467c6884b3be53f26cab5d7579b43507e2ea45f2122f0eeddd1a\": container with ID starting with d0f8ca4501c8467c6884b3be53f26cab5d7579b43507e2ea45f2122f0eeddd1a not found: ID does not exist" Apr 20 12:16:46.592316 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.592243 2580 scope.go:117] "RemoveContainer" containerID="fa280989b6275627b05908c174872f1e9009812bfd1e598f8ca7fee6498e5216" Apr 20 12:16:46.592841 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:16:46.592755 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa280989b6275627b05908c174872f1e9009812bfd1e598f8ca7fee6498e5216\": container with ID starting with fa280989b6275627b05908c174872f1e9009812bfd1e598f8ca7fee6498e5216 not found: ID does not exist" containerID="fa280989b6275627b05908c174872f1e9009812bfd1e598f8ca7fee6498e5216" Apr 20 12:16:46.592841 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.592790 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa280989b6275627b05908c174872f1e9009812bfd1e598f8ca7fee6498e5216"} err="failed to get container status \"fa280989b6275627b05908c174872f1e9009812bfd1e598f8ca7fee6498e5216\": rpc error: code = NotFound desc = could not find container \"fa280989b6275627b05908c174872f1e9009812bfd1e598f8ca7fee6498e5216\": container with ID starting with fa280989b6275627b05908c174872f1e9009812bfd1e598f8ca7fee6498e5216 not found: ID does not exist" Apr 20 12:16:46.592841 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.592816 2580 scope.go:117] "RemoveContainer" containerID="c0363d3e6fd2690cde36d8af948483224cb372cc4dd5e636bb6850c808f31138" Apr 20 12:16:46.593218 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:16:46.593192 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0363d3e6fd2690cde36d8af948483224cb372cc4dd5e636bb6850c808f31138\": container with ID starting with c0363d3e6fd2690cde36d8af948483224cb372cc4dd5e636bb6850c808f31138 not found: ID does not exist" containerID="c0363d3e6fd2690cde36d8af948483224cb372cc4dd5e636bb6850c808f31138" Apr 20 12:16:46.593312 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.593223 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0363d3e6fd2690cde36d8af948483224cb372cc4dd5e636bb6850c808f31138"} err="failed to get container status \"c0363d3e6fd2690cde36d8af948483224cb372cc4dd5e636bb6850c808f31138\": rpc error: code = NotFound desc = could not find container \"c0363d3e6fd2690cde36d8af948483224cb372cc4dd5e636bb6850c808f31138\": container with ID starting with c0363d3e6fd2690cde36d8af948483224cb372cc4dd5e636bb6850c808f31138 not found: ID does not exist" Apr 20 12:16:46.593312 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.593240 2580 scope.go:117] "RemoveContainer" containerID="16ca6f078d59528a6292161eb2a4883d5d2e63dfd6681b639f009150b68ba684" Apr 20 12:16:46.593727 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:16:46.593688 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16ca6f078d59528a6292161eb2a4883d5d2e63dfd6681b639f009150b68ba684\": container with ID starting with 
16ca6f078d59528a6292161eb2a4883d5d2e63dfd6681b639f009150b68ba684 not found: ID does not exist" containerID="16ca6f078d59528a6292161eb2a4883d5d2e63dfd6681b639f009150b68ba684"
Apr 20 12:16:46.593812 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.593721 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16ca6f078d59528a6292161eb2a4883d5d2e63dfd6681b639f009150b68ba684"} err="failed to get container status \"16ca6f078d59528a6292161eb2a4883d5d2e63dfd6681b639f009150b68ba684\": rpc error: code = NotFound desc = could not find container \"16ca6f078d59528a6292161eb2a4883d5d2e63dfd6681b639f009150b68ba684\": container with ID starting with 16ca6f078d59528a6292161eb2a4883d5d2e63dfd6681b639f009150b68ba684 not found: ID does not exist"
Apr 20 12:16:46.593812 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.593742 2580 scope.go:117] "RemoveContainer" containerID="fcd9e97e1b219572a49b9fdb8b3c160aff2c81c5a4269c522f4d6bdf11c30816"
Apr 20 12:16:46.594015 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.593990 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 20 12:16:46.594105 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:16:46.594040 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcd9e97e1b219572a49b9fdb8b3c160aff2c81c5a4269c522f4d6bdf11c30816\": container with ID starting with fcd9e97e1b219572a49b9fdb8b3c160aff2c81c5a4269c522f4d6bdf11c30816 not found: ID does not exist" containerID="fcd9e97e1b219572a49b9fdb8b3c160aff2c81c5a4269c522f4d6bdf11c30816"
Apr 20 12:16:46.594105 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.594089 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcd9e97e1b219572a49b9fdb8b3c160aff2c81c5a4269c522f4d6bdf11c30816"} err="failed to get container status \"fcd9e97e1b219572a49b9fdb8b3c160aff2c81c5a4269c522f4d6bdf11c30816\": rpc error: code = NotFound desc = could not find container \"fcd9e97e1b219572a49b9fdb8b3c160aff2c81c5a4269c522f4d6bdf11c30816\": container with ID starting with fcd9e97e1b219572a49b9fdb8b3c160aff2c81c5a4269c522f4d6bdf11c30816 not found: ID does not exist"
Apr 20 12:16:46.594594 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.594116 2580 scope.go:117] "RemoveContainer" containerID="91a869e4727c73167d92cd5efce7982a8bf1c7f9384ae9f34143ff7c6dd264f1"
Apr 20 12:16:46.597581 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:16:46.597259 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91a869e4727c73167d92cd5efce7982a8bf1c7f9384ae9f34143ff7c6dd264f1\": container with ID starting with 91a869e4727c73167d92cd5efce7982a8bf1c7f9384ae9f34143ff7c6dd264f1 not found: ID does not exist" containerID="91a869e4727c73167d92cd5efce7982a8bf1c7f9384ae9f34143ff7c6dd264f1"
Apr 20 12:16:46.603953 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.601350 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
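The E-level "ContainerStatus from runtime service failed ... NotFound" lines and the "DeleteContainer returned error" lines that repeat below are a benign race: several cleanup passes ask the runtime to remove containers that are already gone, and NotFound just means an earlier pass got there first. A sketch of the idempotent-removal idea (the runtime type below is a stand-in, not the kubelet's CRI client):

```go
// idempotent_delete.go — a sketch of why the NotFound errors in the log are
// harmless: removal converges once the container no longer exists.
package main

import (
	"errors"
	"fmt"
)

var errNotFound = errors.New("NotFound: ID does not exist")

type containerRuntime struct{ containers map[string]bool }

func (r *containerRuntime) remove(id string) error {
	if !r.containers[id] {
		return errNotFound
	}
	delete(r.containers, id)
	return nil
}

// removeContainer treats NotFound as "already done"; the kubelet similarly
// logs the error and moves on rather than retrying forever.
func removeContainer(r *containerRuntime, id string) {
	switch err := r.remove(id); {
	case err == nil:
		fmt.Println("removed:", id)
	case errors.Is(err, errNotFound):
		fmt.Println("already removed:", id)
	default:
		fmt.Println("unexpected error:", err)
	}
}

func main() {
	r := &containerRuntime{containers: map[string]bool{"d0f8ca4501c8": true}}
	removeContainer(r, "d0f8ca4501c8") // first pass actually deletes
	removeContainer(r, "d0f8ca4501c8") // second pass observes NotFound
}
```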
find container \"91a869e4727c73167d92cd5efce7982a8bf1c7f9384ae9f34143ff7c6dd264f1\": container with ID starting with 91a869e4727c73167d92cd5efce7982a8bf1c7f9384ae9f34143ff7c6dd264f1 not found: ID does not exist" Apr 20 12:16:46.604360 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.604007 2580 scope.go:117] "RemoveContainer" containerID="7d5bca424f828bdff54ef72bc300c8fc1ac58a7b0c04326f5bff978afb96c258" Apr 20 12:16:46.604709 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:16:46.604678 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d5bca424f828bdff54ef72bc300c8fc1ac58a7b0c04326f5bff978afb96c258\": container with ID starting with 7d5bca424f828bdff54ef72bc300c8fc1ac58a7b0c04326f5bff978afb96c258 not found: ID does not exist" containerID="7d5bca424f828bdff54ef72bc300c8fc1ac58a7b0c04326f5bff978afb96c258" Apr 20 12:16:46.604837 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.604716 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d5bca424f828bdff54ef72bc300c8fc1ac58a7b0c04326f5bff978afb96c258"} err="failed to get container status \"7d5bca424f828bdff54ef72bc300c8fc1ac58a7b0c04326f5bff978afb96c258\": rpc error: code = NotFound desc = could not find container \"7d5bca424f828bdff54ef72bc300c8fc1ac58a7b0c04326f5bff978afb96c258\": container with ID starting with 7d5bca424f828bdff54ef72bc300c8fc1ac58a7b0c04326f5bff978afb96c258 not found: ID does not exist" Apr 20 12:16:46.604837 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.604736 2580 scope.go:117] "RemoveContainer" containerID="d0f8ca4501c8467c6884b3be53f26cab5d7579b43507e2ea45f2122f0eeddd1a" Apr 20 12:16:46.605042 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.605017 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0f8ca4501c8467c6884b3be53f26cab5d7579b43507e2ea45f2122f0eeddd1a"} err="failed to get container status \"d0f8ca4501c8467c6884b3be53f26cab5d7579b43507e2ea45f2122f0eeddd1a\": rpc error: code = NotFound desc = could not find container \"d0f8ca4501c8467c6884b3be53f26cab5d7579b43507e2ea45f2122f0eeddd1a\": container with ID starting with d0f8ca4501c8467c6884b3be53f26cab5d7579b43507e2ea45f2122f0eeddd1a not found: ID does not exist" Apr 20 12:16:46.605106 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.605043 2580 scope.go:117] "RemoveContainer" containerID="fa280989b6275627b05908c174872f1e9009812bfd1e598f8ca7fee6498e5216" Apr 20 12:16:46.605169 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.605149 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 20 12:16:46.605359 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.605327 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa280989b6275627b05908c174872f1e9009812bfd1e598f8ca7fee6498e5216"} err="failed to get container status \"fa280989b6275627b05908c174872f1e9009812bfd1e598f8ca7fee6498e5216\": rpc error: code = NotFound desc = could not find container \"fa280989b6275627b05908c174872f1e9009812bfd1e598f8ca7fee6498e5216\": container with ID starting with fa280989b6275627b05908c174872f1e9009812bfd1e598f8ca7fee6498e5216 not found: ID does not exist" Apr 20 12:16:46.605359 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.605351 2580 scope.go:117] "RemoveContainer" containerID="c0363d3e6fd2690cde36d8af948483224cb372cc4dd5e636bb6850c808f31138" Apr 20 
12:16:46.605647 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.605628 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0363d3e6fd2690cde36d8af948483224cb372cc4dd5e636bb6850c808f31138"} err="failed to get container status \"c0363d3e6fd2690cde36d8af948483224cb372cc4dd5e636bb6850c808f31138\": rpc error: code = NotFound desc = could not find container \"c0363d3e6fd2690cde36d8af948483224cb372cc4dd5e636bb6850c808f31138\": container with ID starting with c0363d3e6fd2690cde36d8af948483224cb372cc4dd5e636bb6850c808f31138 not found: ID does not exist" Apr 20 12:16:46.605647 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.605647 2580 scope.go:117] "RemoveContainer" containerID="16ca6f078d59528a6292161eb2a4883d5d2e63dfd6681b639f009150b68ba684" Apr 20 12:16:46.605884 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.605861 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16ca6f078d59528a6292161eb2a4883d5d2e63dfd6681b639f009150b68ba684"} err="failed to get container status \"16ca6f078d59528a6292161eb2a4883d5d2e63dfd6681b639f009150b68ba684\": rpc error: code = NotFound desc = could not find container \"16ca6f078d59528a6292161eb2a4883d5d2e63dfd6681b639f009150b68ba684\": container with ID starting with 16ca6f078d59528a6292161eb2a4883d5d2e63dfd6681b639f009150b68ba684 not found: ID does not exist" Apr 20 12:16:46.605884 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.605884 2580 scope.go:117] "RemoveContainer" containerID="fcd9e97e1b219572a49b9fdb8b3c160aff2c81c5a4269c522f4d6bdf11c30816" Apr 20 12:16:46.606202 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.606164 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcd9e97e1b219572a49b9fdb8b3c160aff2c81c5a4269c522f4d6bdf11c30816"} err="failed to get container status \"fcd9e97e1b219572a49b9fdb8b3c160aff2c81c5a4269c522f4d6bdf11c30816\": rpc error: code = NotFound desc = could not find container \"fcd9e97e1b219572a49b9fdb8b3c160aff2c81c5a4269c522f4d6bdf11c30816\": container with ID starting with fcd9e97e1b219572a49b9fdb8b3c160aff2c81c5a4269c522f4d6bdf11c30816 not found: ID does not exist" Apr 20 12:16:46.606202 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.606207 2580 scope.go:117] "RemoveContainer" containerID="91a869e4727c73167d92cd5efce7982a8bf1c7f9384ae9f34143ff7c6dd264f1" Apr 20 12:16:46.606591 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.606551 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91a869e4727c73167d92cd5efce7982a8bf1c7f9384ae9f34143ff7c6dd264f1"} err="failed to get container status \"91a869e4727c73167d92cd5efce7982a8bf1c7f9384ae9f34143ff7c6dd264f1\": rpc error: code = NotFound desc = could not find container \"91a869e4727c73167d92cd5efce7982a8bf1c7f9384ae9f34143ff7c6dd264f1\": container with ID starting with 91a869e4727c73167d92cd5efce7982a8bf1c7f9384ae9f34143ff7c6dd264f1 not found: ID does not exist" Apr 20 12:16:46.606591 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.606593 2580 scope.go:117] "RemoveContainer" containerID="7d5bca424f828bdff54ef72bc300c8fc1ac58a7b0c04326f5bff978afb96c258" Apr 20 12:16:46.606864 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.606844 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d5bca424f828bdff54ef72bc300c8fc1ac58a7b0c04326f5bff978afb96c258"} err="failed to get container status 
\"7d5bca424f828bdff54ef72bc300c8fc1ac58a7b0c04326f5bff978afb96c258\": rpc error: code = NotFound desc = could not find container \"7d5bca424f828bdff54ef72bc300c8fc1ac58a7b0c04326f5bff978afb96c258\": container with ID starting with 7d5bca424f828bdff54ef72bc300c8fc1ac58a7b0c04326f5bff978afb96c258 not found: ID does not exist" Apr 20 12:16:46.606864 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.606864 2580 scope.go:117] "RemoveContainer" containerID="d0f8ca4501c8467c6884b3be53f26cab5d7579b43507e2ea45f2122f0eeddd1a" Apr 20 12:16:46.607128 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.607099 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0f8ca4501c8467c6884b3be53f26cab5d7579b43507e2ea45f2122f0eeddd1a"} err="failed to get container status \"d0f8ca4501c8467c6884b3be53f26cab5d7579b43507e2ea45f2122f0eeddd1a\": rpc error: code = NotFound desc = could not find container \"d0f8ca4501c8467c6884b3be53f26cab5d7579b43507e2ea45f2122f0eeddd1a\": container with ID starting with d0f8ca4501c8467c6884b3be53f26cab5d7579b43507e2ea45f2122f0eeddd1a not found: ID does not exist" Apr 20 12:16:46.607166 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.607131 2580 scope.go:117] "RemoveContainer" containerID="fa280989b6275627b05908c174872f1e9009812bfd1e598f8ca7fee6498e5216" Apr 20 12:16:46.607362 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.607343 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa280989b6275627b05908c174872f1e9009812bfd1e598f8ca7fee6498e5216"} err="failed to get container status \"fa280989b6275627b05908c174872f1e9009812bfd1e598f8ca7fee6498e5216\": rpc error: code = NotFound desc = could not find container \"fa280989b6275627b05908c174872f1e9009812bfd1e598f8ca7fee6498e5216\": container with ID starting with fa280989b6275627b05908c174872f1e9009812bfd1e598f8ca7fee6498e5216 not found: ID does not exist" Apr 20 12:16:46.607462 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.607364 2580 scope.go:117] "RemoveContainer" containerID="c0363d3e6fd2690cde36d8af948483224cb372cc4dd5e636bb6850c808f31138" Apr 20 12:16:46.607582 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.607564 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0363d3e6fd2690cde36d8af948483224cb372cc4dd5e636bb6850c808f31138"} err="failed to get container status \"c0363d3e6fd2690cde36d8af948483224cb372cc4dd5e636bb6850c808f31138\": rpc error: code = NotFound desc = could not find container \"c0363d3e6fd2690cde36d8af948483224cb372cc4dd5e636bb6850c808f31138\": container with ID starting with c0363d3e6fd2690cde36d8af948483224cb372cc4dd5e636bb6850c808f31138 not found: ID does not exist" Apr 20 12:16:46.607628 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.607586 2580 scope.go:117] "RemoveContainer" containerID="16ca6f078d59528a6292161eb2a4883d5d2e63dfd6681b639f009150b68ba684" Apr 20 12:16:46.607805 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.607788 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16ca6f078d59528a6292161eb2a4883d5d2e63dfd6681b639f009150b68ba684"} err="failed to get container status \"16ca6f078d59528a6292161eb2a4883d5d2e63dfd6681b639f009150b68ba684\": rpc error: code = NotFound desc = could not find container \"16ca6f078d59528a6292161eb2a4883d5d2e63dfd6681b639f009150b68ba684\": container with ID starting with 16ca6f078d59528a6292161eb2a4883d5d2e63dfd6681b639f009150b68ba684 not found: ID does not 
exist" Apr 20 12:16:46.607863 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.607808 2580 scope.go:117] "RemoveContainer" containerID="fcd9e97e1b219572a49b9fdb8b3c160aff2c81c5a4269c522f4d6bdf11c30816" Apr 20 12:16:46.608072 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.608055 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcd9e97e1b219572a49b9fdb8b3c160aff2c81c5a4269c522f4d6bdf11c30816"} err="failed to get container status \"fcd9e97e1b219572a49b9fdb8b3c160aff2c81c5a4269c522f4d6bdf11c30816\": rpc error: code = NotFound desc = could not find container \"fcd9e97e1b219572a49b9fdb8b3c160aff2c81c5a4269c522f4d6bdf11c30816\": container with ID starting with fcd9e97e1b219572a49b9fdb8b3c160aff2c81c5a4269c522f4d6bdf11c30816 not found: ID does not exist" Apr 20 12:16:46.608136 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.608073 2580 scope.go:117] "RemoveContainer" containerID="91a869e4727c73167d92cd5efce7982a8bf1c7f9384ae9f34143ff7c6dd264f1" Apr 20 12:16:46.608347 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.608312 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91a869e4727c73167d92cd5efce7982a8bf1c7f9384ae9f34143ff7c6dd264f1"} err="failed to get container status \"91a869e4727c73167d92cd5efce7982a8bf1c7f9384ae9f34143ff7c6dd264f1\": rpc error: code = NotFound desc = could not find container \"91a869e4727c73167d92cd5efce7982a8bf1c7f9384ae9f34143ff7c6dd264f1\": container with ID starting with 91a869e4727c73167d92cd5efce7982a8bf1c7f9384ae9f34143ff7c6dd264f1 not found: ID does not exist" Apr 20 12:16:46.608415 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.608351 2580 scope.go:117] "RemoveContainer" containerID="7d5bca424f828bdff54ef72bc300c8fc1ac58a7b0c04326f5bff978afb96c258" Apr 20 12:16:46.608641 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.608623 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d5bca424f828bdff54ef72bc300c8fc1ac58a7b0c04326f5bff978afb96c258"} err="failed to get container status \"7d5bca424f828bdff54ef72bc300c8fc1ac58a7b0c04326f5bff978afb96c258\": rpc error: code = NotFound desc = could not find container \"7d5bca424f828bdff54ef72bc300c8fc1ac58a7b0c04326f5bff978afb96c258\": container with ID starting with 7d5bca424f828bdff54ef72bc300c8fc1ac58a7b0c04326f5bff978afb96c258 not found: ID does not exist" Apr 20 12:16:46.608641 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.608639 2580 scope.go:117] "RemoveContainer" containerID="d0f8ca4501c8467c6884b3be53f26cab5d7579b43507e2ea45f2122f0eeddd1a" Apr 20 12:16:46.608864 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.608848 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0f8ca4501c8467c6884b3be53f26cab5d7579b43507e2ea45f2122f0eeddd1a"} err="failed to get container status \"d0f8ca4501c8467c6884b3be53f26cab5d7579b43507e2ea45f2122f0eeddd1a\": rpc error: code = NotFound desc = could not find container \"d0f8ca4501c8467c6884b3be53f26cab5d7579b43507e2ea45f2122f0eeddd1a\": container with ID starting with d0f8ca4501c8467c6884b3be53f26cab5d7579b43507e2ea45f2122f0eeddd1a not found: ID does not exist" Apr 20 12:16:46.608864 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.608863 2580 scope.go:117] "RemoveContainer" containerID="fa280989b6275627b05908c174872f1e9009812bfd1e598f8ca7fee6498e5216" Apr 20 12:16:46.609079 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.609059 2580 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa280989b6275627b05908c174872f1e9009812bfd1e598f8ca7fee6498e5216"} err="failed to get container status \"fa280989b6275627b05908c174872f1e9009812bfd1e598f8ca7fee6498e5216\": rpc error: code = NotFound desc = could not find container \"fa280989b6275627b05908c174872f1e9009812bfd1e598f8ca7fee6498e5216\": container with ID starting with fa280989b6275627b05908c174872f1e9009812bfd1e598f8ca7fee6498e5216 not found: ID does not exist" Apr 20 12:16:46.609118 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.609080 2580 scope.go:117] "RemoveContainer" containerID="c0363d3e6fd2690cde36d8af948483224cb372cc4dd5e636bb6850c808f31138" Apr 20 12:16:46.609296 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.609278 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0363d3e6fd2690cde36d8af948483224cb372cc4dd5e636bb6850c808f31138"} err="failed to get container status \"c0363d3e6fd2690cde36d8af948483224cb372cc4dd5e636bb6850c808f31138\": rpc error: code = NotFound desc = could not find container \"c0363d3e6fd2690cde36d8af948483224cb372cc4dd5e636bb6850c808f31138\": container with ID starting with c0363d3e6fd2690cde36d8af948483224cb372cc4dd5e636bb6850c808f31138 not found: ID does not exist" Apr 20 12:16:46.609343 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.609297 2580 scope.go:117] "RemoveContainer" containerID="16ca6f078d59528a6292161eb2a4883d5d2e63dfd6681b639f009150b68ba684" Apr 20 12:16:46.609535 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.609516 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16ca6f078d59528a6292161eb2a4883d5d2e63dfd6681b639f009150b68ba684"} err="failed to get container status \"16ca6f078d59528a6292161eb2a4883d5d2e63dfd6681b639f009150b68ba684\": rpc error: code = NotFound desc = could not find container \"16ca6f078d59528a6292161eb2a4883d5d2e63dfd6681b639f009150b68ba684\": container with ID starting with 16ca6f078d59528a6292161eb2a4883d5d2e63dfd6681b639f009150b68ba684 not found: ID does not exist" Apr 20 12:16:46.609588 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.609536 2580 scope.go:117] "RemoveContainer" containerID="fcd9e97e1b219572a49b9fdb8b3c160aff2c81c5a4269c522f4d6bdf11c30816" Apr 20 12:16:46.609767 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.609722 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcd9e97e1b219572a49b9fdb8b3c160aff2c81c5a4269c522f4d6bdf11c30816"} err="failed to get container status \"fcd9e97e1b219572a49b9fdb8b3c160aff2c81c5a4269c522f4d6bdf11c30816\": rpc error: code = NotFound desc = could not find container \"fcd9e97e1b219572a49b9fdb8b3c160aff2c81c5a4269c522f4d6bdf11c30816\": container with ID starting with fcd9e97e1b219572a49b9fdb8b3c160aff2c81c5a4269c522f4d6bdf11c30816 not found: ID does not exist" Apr 20 12:16:46.609767 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.609741 2580 scope.go:117] "RemoveContainer" containerID="91a869e4727c73167d92cd5efce7982a8bf1c7f9384ae9f34143ff7c6dd264f1" Apr 20 12:16:46.609989 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.609972 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91a869e4727c73167d92cd5efce7982a8bf1c7f9384ae9f34143ff7c6dd264f1"} err="failed to get container status \"91a869e4727c73167d92cd5efce7982a8bf1c7f9384ae9f34143ff7c6dd264f1\": rpc error: code = NotFound desc = could not find container 
\"91a869e4727c73167d92cd5efce7982a8bf1c7f9384ae9f34143ff7c6dd264f1\": container with ID starting with 91a869e4727c73167d92cd5efce7982a8bf1c7f9384ae9f34143ff7c6dd264f1 not found: ID does not exist" Apr 20 12:16:46.609989 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.609989 2580 scope.go:117] "RemoveContainer" containerID="7d5bca424f828bdff54ef72bc300c8fc1ac58a7b0c04326f5bff978afb96c258" Apr 20 12:16:46.610200 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.610183 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d5bca424f828bdff54ef72bc300c8fc1ac58a7b0c04326f5bff978afb96c258"} err="failed to get container status \"7d5bca424f828bdff54ef72bc300c8fc1ac58a7b0c04326f5bff978afb96c258\": rpc error: code = NotFound desc = could not find container \"7d5bca424f828bdff54ef72bc300c8fc1ac58a7b0c04326f5bff978afb96c258\": container with ID starting with 7d5bca424f828bdff54ef72bc300c8fc1ac58a7b0c04326f5bff978afb96c258 not found: ID does not exist" Apr 20 12:16:46.610200 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.610199 2580 scope.go:117] "RemoveContainer" containerID="d0f8ca4501c8467c6884b3be53f26cab5d7579b43507e2ea45f2122f0eeddd1a" Apr 20 12:16:46.610386 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.610368 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0f8ca4501c8467c6884b3be53f26cab5d7579b43507e2ea45f2122f0eeddd1a"} err="failed to get container status \"d0f8ca4501c8467c6884b3be53f26cab5d7579b43507e2ea45f2122f0eeddd1a\": rpc error: code = NotFound desc = could not find container \"d0f8ca4501c8467c6884b3be53f26cab5d7579b43507e2ea45f2122f0eeddd1a\": container with ID starting with d0f8ca4501c8467c6884b3be53f26cab5d7579b43507e2ea45f2122f0eeddd1a not found: ID does not exist" Apr 20 12:16:46.610440 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.610387 2580 scope.go:117] "RemoveContainer" containerID="fa280989b6275627b05908c174872f1e9009812bfd1e598f8ca7fee6498e5216" Apr 20 12:16:46.610620 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.610603 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa280989b6275627b05908c174872f1e9009812bfd1e598f8ca7fee6498e5216"} err="failed to get container status \"fa280989b6275627b05908c174872f1e9009812bfd1e598f8ca7fee6498e5216\": rpc error: code = NotFound desc = could not find container \"fa280989b6275627b05908c174872f1e9009812bfd1e598f8ca7fee6498e5216\": container with ID starting with fa280989b6275627b05908c174872f1e9009812bfd1e598f8ca7fee6498e5216 not found: ID does not exist" Apr 20 12:16:46.610620 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.610619 2580 scope.go:117] "RemoveContainer" containerID="c0363d3e6fd2690cde36d8af948483224cb372cc4dd5e636bb6850c808f31138" Apr 20 12:16:46.610832 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.610817 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0363d3e6fd2690cde36d8af948483224cb372cc4dd5e636bb6850c808f31138"} err="failed to get container status \"c0363d3e6fd2690cde36d8af948483224cb372cc4dd5e636bb6850c808f31138\": rpc error: code = NotFound desc = could not find container \"c0363d3e6fd2690cde36d8af948483224cb372cc4dd5e636bb6850c808f31138\": container with ID starting with c0363d3e6fd2690cde36d8af948483224cb372cc4dd5e636bb6850c808f31138 not found: ID does not exist" Apr 20 12:16:46.610872 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.610833 2580 scope.go:117] "RemoveContainer" 
containerID="16ca6f078d59528a6292161eb2a4883d5d2e63dfd6681b639f009150b68ba684" Apr 20 12:16:46.611043 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.611022 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16ca6f078d59528a6292161eb2a4883d5d2e63dfd6681b639f009150b68ba684"} err="failed to get container status \"16ca6f078d59528a6292161eb2a4883d5d2e63dfd6681b639f009150b68ba684\": rpc error: code = NotFound desc = could not find container \"16ca6f078d59528a6292161eb2a4883d5d2e63dfd6681b639f009150b68ba684\": container with ID starting with 16ca6f078d59528a6292161eb2a4883d5d2e63dfd6681b639f009150b68ba684 not found: ID does not exist" Apr 20 12:16:46.611093 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.611043 2580 scope.go:117] "RemoveContainer" containerID="fcd9e97e1b219572a49b9fdb8b3c160aff2c81c5a4269c522f4d6bdf11c30816" Apr 20 12:16:46.611210 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.611193 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcd9e97e1b219572a49b9fdb8b3c160aff2c81c5a4269c522f4d6bdf11c30816"} err="failed to get container status \"fcd9e97e1b219572a49b9fdb8b3c160aff2c81c5a4269c522f4d6bdf11c30816\": rpc error: code = NotFound desc = could not find container \"fcd9e97e1b219572a49b9fdb8b3c160aff2c81c5a4269c522f4d6bdf11c30816\": container with ID starting with fcd9e97e1b219572a49b9fdb8b3c160aff2c81c5a4269c522f4d6bdf11c30816 not found: ID does not exist" Apr 20 12:16:46.611210 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.611210 2580 scope.go:117] "RemoveContainer" containerID="91a869e4727c73167d92cd5efce7982a8bf1c7f9384ae9f34143ff7c6dd264f1" Apr 20 12:16:46.611359 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.611345 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91a869e4727c73167d92cd5efce7982a8bf1c7f9384ae9f34143ff7c6dd264f1"} err="failed to get container status \"91a869e4727c73167d92cd5efce7982a8bf1c7f9384ae9f34143ff7c6dd264f1\": rpc error: code = NotFound desc = could not find container \"91a869e4727c73167d92cd5efce7982a8bf1c7f9384ae9f34143ff7c6dd264f1\": container with ID starting with 91a869e4727c73167d92cd5efce7982a8bf1c7f9384ae9f34143ff7c6dd264f1 not found: ID does not exist" Apr 20 12:16:46.611412 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.611358 2580 scope.go:117] "RemoveContainer" containerID="7d5bca424f828bdff54ef72bc300c8fc1ac58a7b0c04326f5bff978afb96c258" Apr 20 12:16:46.611559 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.611544 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d5bca424f828bdff54ef72bc300c8fc1ac58a7b0c04326f5bff978afb96c258"} err="failed to get container status \"7d5bca424f828bdff54ef72bc300c8fc1ac58a7b0c04326f5bff978afb96c258\": rpc error: code = NotFound desc = could not find container \"7d5bca424f828bdff54ef72bc300c8fc1ac58a7b0c04326f5bff978afb96c258\": container with ID starting with 7d5bca424f828bdff54ef72bc300c8fc1ac58a7b0c04326f5bff978afb96c258 not found: ID does not exist" Apr 20 12:16:46.611559 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.611558 2580 scope.go:117] "RemoveContainer" containerID="d0f8ca4501c8467c6884b3be53f26cab5d7579b43507e2ea45f2122f0eeddd1a" Apr 20 12:16:46.611729 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.611712 2580 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d0f8ca4501c8467c6884b3be53f26cab5d7579b43507e2ea45f2122f0eeddd1a"} err="failed to get container status \"d0f8ca4501c8467c6884b3be53f26cab5d7579b43507e2ea45f2122f0eeddd1a\": rpc error: code = NotFound desc = could not find container \"d0f8ca4501c8467c6884b3be53f26cab5d7579b43507e2ea45f2122f0eeddd1a\": container with ID starting with d0f8ca4501c8467c6884b3be53f26cab5d7579b43507e2ea45f2122f0eeddd1a not found: ID does not exist" Apr 20 12:16:46.611773 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.611730 2580 scope.go:117] "RemoveContainer" containerID="fa280989b6275627b05908c174872f1e9009812bfd1e598f8ca7fee6498e5216" Apr 20 12:16:46.611891 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.611876 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa280989b6275627b05908c174872f1e9009812bfd1e598f8ca7fee6498e5216"} err="failed to get container status \"fa280989b6275627b05908c174872f1e9009812bfd1e598f8ca7fee6498e5216\": rpc error: code = NotFound desc = could not find container \"fa280989b6275627b05908c174872f1e9009812bfd1e598f8ca7fee6498e5216\": container with ID starting with fa280989b6275627b05908c174872f1e9009812bfd1e598f8ca7fee6498e5216 not found: ID does not exist" Apr 20 12:16:46.611930 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.611890 2580 scope.go:117] "RemoveContainer" containerID="c0363d3e6fd2690cde36d8af948483224cb372cc4dd5e636bb6850c808f31138" Apr 20 12:16:46.612046 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.612031 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0363d3e6fd2690cde36d8af948483224cb372cc4dd5e636bb6850c808f31138"} err="failed to get container status \"c0363d3e6fd2690cde36d8af948483224cb372cc4dd5e636bb6850c808f31138\": rpc error: code = NotFound desc = could not find container \"c0363d3e6fd2690cde36d8af948483224cb372cc4dd5e636bb6850c808f31138\": container with ID starting with c0363d3e6fd2690cde36d8af948483224cb372cc4dd5e636bb6850c808f31138 not found: ID does not exist" Apr 20 12:16:46.612088 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.612047 2580 scope.go:117] "RemoveContainer" containerID="16ca6f078d59528a6292161eb2a4883d5d2e63dfd6681b639f009150b68ba684" Apr 20 12:16:46.612235 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.612218 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16ca6f078d59528a6292161eb2a4883d5d2e63dfd6681b639f009150b68ba684"} err="failed to get container status \"16ca6f078d59528a6292161eb2a4883d5d2e63dfd6681b639f009150b68ba684\": rpc error: code = NotFound desc = could not find container \"16ca6f078d59528a6292161eb2a4883d5d2e63dfd6681b639f009150b68ba684\": container with ID starting with 16ca6f078d59528a6292161eb2a4883d5d2e63dfd6681b639f009150b68ba684 not found: ID does not exist" Apr 20 12:16:46.612286 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.612234 2580 scope.go:117] "RemoveContainer" containerID="fcd9e97e1b219572a49b9fdb8b3c160aff2c81c5a4269c522f4d6bdf11c30816" Apr 20 12:16:46.612423 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.612378 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcd9e97e1b219572a49b9fdb8b3c160aff2c81c5a4269c522f4d6bdf11c30816"} err="failed to get container status \"fcd9e97e1b219572a49b9fdb8b3c160aff2c81c5a4269c522f4d6bdf11c30816\": rpc error: code = NotFound desc = could not find container 
\"fcd9e97e1b219572a49b9fdb8b3c160aff2c81c5a4269c522f4d6bdf11c30816\": container with ID starting with fcd9e97e1b219572a49b9fdb8b3c160aff2c81c5a4269c522f4d6bdf11c30816 not found: ID does not exist" Apr 20 12:16:46.612423 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.612407 2580 scope.go:117] "RemoveContainer" containerID="91a869e4727c73167d92cd5efce7982a8bf1c7f9384ae9f34143ff7c6dd264f1" Apr 20 12:16:46.612606 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.612591 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91a869e4727c73167d92cd5efce7982a8bf1c7f9384ae9f34143ff7c6dd264f1"} err="failed to get container status \"91a869e4727c73167d92cd5efce7982a8bf1c7f9384ae9f34143ff7c6dd264f1\": rpc error: code = NotFound desc = could not find container \"91a869e4727c73167d92cd5efce7982a8bf1c7f9384ae9f34143ff7c6dd264f1\": container with ID starting with 91a869e4727c73167d92cd5efce7982a8bf1c7f9384ae9f34143ff7c6dd264f1 not found: ID does not exist" Apr 20 12:16:46.612606 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.612606 2580 scope.go:117] "RemoveContainer" containerID="7d5bca424f828bdff54ef72bc300c8fc1ac58a7b0c04326f5bff978afb96c258" Apr 20 12:16:46.612767 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.612745 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d5bca424f828bdff54ef72bc300c8fc1ac58a7b0c04326f5bff978afb96c258"} err="failed to get container status \"7d5bca424f828bdff54ef72bc300c8fc1ac58a7b0c04326f5bff978afb96c258\": rpc error: code = NotFound desc = could not find container \"7d5bca424f828bdff54ef72bc300c8fc1ac58a7b0c04326f5bff978afb96c258\": container with ID starting with 7d5bca424f828bdff54ef72bc300c8fc1ac58a7b0c04326f5bff978afb96c258 not found: ID does not exist" Apr 20 12:16:46.763874 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.763834 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/899a908e-40e7-476b-b7ba-7f9134886317-config-out\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.763874 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.763888 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/899a908e-40e7-476b-b7ba-7f9134886317-web-config\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.764136 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.763911 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/899a908e-40e7-476b-b7ba-7f9134886317-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.764136 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.763973 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/899a908e-40e7-476b-b7ba-7f9134886317-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.764136 ip-10-0-137-91 kubenswrapper[2580]: I0420 
12:16:46.764001 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/899a908e-40e7-476b-b7ba-7f9134886317-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.764136 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.764057 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/899a908e-40e7-476b-b7ba-7f9134886317-config\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.764136 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.764079 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/899a908e-40e7-476b-b7ba-7f9134886317-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.764136 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.764099 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-565pk\" (UniqueName: \"kubernetes.io/projected/899a908e-40e7-476b-b7ba-7f9134886317-kube-api-access-565pk\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.764387 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.764138 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/899a908e-40e7-476b-b7ba-7f9134886317-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.764387 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.764193 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/899a908e-40e7-476b-b7ba-7f9134886317-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.764387 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.764237 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/899a908e-40e7-476b-b7ba-7f9134886317-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.764387 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.764309 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/899a908e-40e7-476b-b7ba-7f9134886317-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.764387 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.764340 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/899a908e-40e7-476b-b7ba-7f9134886317-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.764387 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.764388 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/899a908e-40e7-476b-b7ba-7f9134886317-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.764666 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.764440 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/899a908e-40e7-476b-b7ba-7f9134886317-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.764666 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.764470 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/899a908e-40e7-476b-b7ba-7f9134886317-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.764666 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.764503 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/899a908e-40e7-476b-b7ba-7f9134886317-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.764666 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.764533 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/899a908e-40e7-476b-b7ba-7f9134886317-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.848178 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.848142 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4250d722-a976-44b5-8936-846f53dc7cca" path="/var/lib/kubelet/pods/4250d722-a976-44b5-8936-846f53dc7cca/volumes" Apr 20 12:16:46.865475 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.865353 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/899a908e-40e7-476b-b7ba-7f9134886317-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.865475 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.865390 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/899a908e-40e7-476b-b7ba-7f9134886317-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.865475 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.865448 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/899a908e-40e7-476b-b7ba-7f9134886317-config\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.865475 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.865473 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/899a908e-40e7-476b-b7ba-7f9134886317-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.865792 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.865494 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-565pk\" (UniqueName: \"kubernetes.io/projected/899a908e-40e7-476b-b7ba-7f9134886317-kube-api-access-565pk\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.865792 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.865531 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/899a908e-40e7-476b-b7ba-7f9134886317-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.865792 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.865555 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/899a908e-40e7-476b-b7ba-7f9134886317-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.865792 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.865587 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/899a908e-40e7-476b-b7ba-7f9134886317-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.865792 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.865631 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/899a908e-40e7-476b-b7ba-7f9134886317-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.865792 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.865665 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/899a908e-40e7-476b-b7ba-7f9134886317-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.865792 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.865702 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/899a908e-40e7-476b-b7ba-7f9134886317-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.865792 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.865735 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/899a908e-40e7-476b-b7ba-7f9134886317-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.865792 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.865767 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/899a908e-40e7-476b-b7ba-7f9134886317-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.866225 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.865805 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/899a908e-40e7-476b-b7ba-7f9134886317-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.866225 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.865846 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/899a908e-40e7-476b-b7ba-7f9134886317-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.866225 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.865880 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/899a908e-40e7-476b-b7ba-7f9134886317-config-out\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.866225 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.865905 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/899a908e-40e7-476b-b7ba-7f9134886317-web-config\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.866225 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.865927 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/899a908e-40e7-476b-b7ba-7f9134886317-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.867288 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.866586 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/899a908e-40e7-476b-b7ba-7f9134886317-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.867288 ip-10-0-137-91 
kubenswrapper[2580]: I0420 12:16:46.866732 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/899a908e-40e7-476b-b7ba-7f9134886317-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.869427 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.869278 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/899a908e-40e7-476b-b7ba-7f9134886317-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.869667 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.869563 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/899a908e-40e7-476b-b7ba-7f9134886317-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.870504 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.870041 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/899a908e-40e7-476b-b7ba-7f9134886317-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.871656 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.871441 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/899a908e-40e7-476b-b7ba-7f9134886317-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.871656 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.871488 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/899a908e-40e7-476b-b7ba-7f9134886317-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.871937 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.871913 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/899a908e-40e7-476b-b7ba-7f9134886317-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.872002 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.871975 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/899a908e-40e7-476b-b7ba-7f9134886317-config\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.872349 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.872327 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/899a908e-40e7-476b-b7ba-7f9134886317-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.872686 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.872613 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/899a908e-40e7-476b-b7ba-7f9134886317-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.872803 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.872682 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/899a908e-40e7-476b-b7ba-7f9134886317-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.873411 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.873347 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/899a908e-40e7-476b-b7ba-7f9134886317-web-config\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.873649 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.873623 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/899a908e-40e7-476b-b7ba-7f9134886317-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.874177 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.874151 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/899a908e-40e7-476b-b7ba-7f9134886317-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.874347 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.874323 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/899a908e-40e7-476b-b7ba-7f9134886317-config-out\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.875015 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.874994 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-565pk\" (UniqueName: \"kubernetes.io/projected/899a908e-40e7-476b-b7ba-7f9134886317-kube-api-access-565pk\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.875374 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.875358 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/899a908e-40e7-476b-b7ba-7f9134886317-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"899a908e-40e7-476b-b7ba-7f9134886317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 12:16:46.903076 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:46.903034 2580 util.go:30] "No sandbox for pod can be found. 
Apr 20 12:16:47.051191 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:47.051107 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 12:16:47.403657 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:16:47.403614 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod899a908e_40e7_476b_b7ba_7f9134886317.slice/crio-be09519b25f58837d8627a54a8664312f363c640ef0112437a22bf41d16647ca WatchSource:0}: Error finding container be09519b25f58837d8627a54a8664312f363c640ef0112437a22bf41d16647ca: Status 404 returned error can't find the container with id be09519b25f58837d8627a54a8664312f363c640ef0112437a22bf41d16647ca
Apr 20 12:16:47.530636 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:47.530600 2580 generic.go:358] "Generic (PLEG): container finished" podID="899a908e-40e7-476b-b7ba-7f9134886317" containerID="02c9979ce7b6a1c9cae7d182eeb54c126756ddf7aae5d3a4ede843061f1dd97a" exitCode=0
Apr 20 12:16:47.530821 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:47.530689 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"899a908e-40e7-476b-b7ba-7f9134886317","Type":"ContainerDied","Data":"02c9979ce7b6a1c9cae7d182eeb54c126756ddf7aae5d3a4ede843061f1dd97a"}
Apr 20 12:16:47.530821 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:47.530731 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"899a908e-40e7-476b-b7ba-7f9134886317","Type":"ContainerStarted","Data":"be09519b25f58837d8627a54a8664312f363c640ef0112437a22bf41d16647ca"}
Apr 20 12:16:48.536878 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:48.536835 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-79cc798455-tph24" event={"ID":"22e5ac9a-8222-4837-ae48-0b9ca98383b1","Type":"ContainerStarted","Data":"20779a5f435ed928ab96329140b22721ae6e6b781813198b4a174e090284e2b5"}
Apr 20 12:16:48.536878 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:48.536882 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-79cc798455-tph24" event={"ID":"22e5ac9a-8222-4837-ae48-0b9ca98383b1","Type":"ContainerStarted","Data":"1f1896eaa49e8fd8347d93c14246e530a3579f69eb8601e6dabf3d67e07ed6ee"}
Apr 20 12:16:48.537347 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:48.536896 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-79cc798455-tph24" event={"ID":"22e5ac9a-8222-4837-ae48-0b9ca98383b1","Type":"ContainerStarted","Data":"3f29ea71c0c7bf2d6f95e06fb5db4d4fae8dbc26f3f8ec99d1e3c5e33a7226a5"}
Apr 20 12:16:48.539653 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:48.539619 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"899a908e-40e7-476b-b7ba-7f9134886317","Type":"ContainerStarted","Data":"08eab1fb47bed706199cd96dbf8fbd118e17bae6fae17ad7e06b79e036a1d57f"}
Apr 20 12:16:48.539768 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:48.539659 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"899a908e-40e7-476b-b7ba-7f9134886317","Type":"ContainerStarted","Data":"be23e94505dc1869b25af07ecfa06eaad47ad44cb0f3ea13a8952aa8f24dc3c2"}
Apr 20 12:16:48.539768 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:48.539672 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"899a908e-40e7-476b-b7ba-7f9134886317","Type":"ContainerStarted","Data":"5b13f367ef88075ec667a8f5718b0327df6c243c4c884b14fffe5fbd9fd72a13"}
Apr 20 12:16:48.539768 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:48.539685 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"899a908e-40e7-476b-b7ba-7f9134886317","Type":"ContainerStarted","Data":"55e66a4656720362c27449c2da448d8f8d28b49a9e4a45dce3802db92e5f1d79"}
Apr 20 12:16:48.539768 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:48.539697 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"899a908e-40e7-476b-b7ba-7f9134886317","Type":"ContainerStarted","Data":"42a517bfc8e82339a112dc52b53a6a8db42fe065a5236ec7b042e1632be8dc52"}
Apr 20 12:16:48.539768 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:48.539710 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"899a908e-40e7-476b-b7ba-7f9134886317","Type":"ContainerStarted","Data":"c1bb24fb0f250abe1336bfb902786943f657dba972a1ee32f9fe404b59104c9f"}
Apr 20 12:16:48.558534 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:48.558483 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-79cc798455-tph24" podStartSLOduration=1.709293748 podStartE2EDuration="3.558466443s" podCreationTimestamp="2026-04-20 12:16:45 +0000 UTC" firstStartedPulling="2026-04-20 12:16:45.608517413 +0000 UTC m=+139.328531382" lastFinishedPulling="2026-04-20 12:16:47.457690089 +0000 UTC m=+141.177704077" observedRunningTime="2026-04-20 12:16:48.556641079 +0000 UTC m=+142.276655098" watchObservedRunningTime="2026-04-20 12:16:48.558466443 +0000 UTC m=+142.278480434"
Apr 20 12:16:48.583116 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:48.583062 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.583048137 podStartE2EDuration="2.583048137s" podCreationTimestamp="2026-04-20 12:16:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 12:16:48.581383615 +0000 UTC m=+142.301397650" watchObservedRunningTime="2026-04-20 12:16:48.583048137 +0000 UTC m=+142.303062127"
Apr 20 12:16:51.904073 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:51.904019 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 12:16:59.828340 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:16:59.828298 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-865b486975-pvr2h"]
Apr 20 12:17:24.847336 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:17:24.847267 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-865b486975-pvr2h" podUID="e011dacd-0632-4429-8ccc-5da8870c87f4" containerName="console" containerID="cri-o://bbd06af96ee89f4ea09c18ef8f4994a36200485bc2b3fc5c1633fe2a6a2daea4" gracePeriod=15
Apr 20 12:17:25.090352 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:17:25.090328 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-865b486975-pvr2h_e011dacd-0632-4429-8ccc-5da8870c87f4/console/0.log"
Apr 20 12:17:25.090519 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:17:25.090393 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-865b486975-pvr2h"
Apr 20 12:17:25.208453 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:17:25.208331 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e011dacd-0632-4429-8ccc-5da8870c87f4-trusted-ca-bundle\") pod \"e011dacd-0632-4429-8ccc-5da8870c87f4\" (UID: \"e011dacd-0632-4429-8ccc-5da8870c87f4\") "
Apr 20 12:17:25.208453 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:17:25.208385 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e011dacd-0632-4429-8ccc-5da8870c87f4-service-ca\") pod \"e011dacd-0632-4429-8ccc-5da8870c87f4\" (UID: \"e011dacd-0632-4429-8ccc-5da8870c87f4\") "
Apr 20 12:17:25.208676 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:17:25.208465 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdt5x\" (UniqueName: \"kubernetes.io/projected/e011dacd-0632-4429-8ccc-5da8870c87f4-kube-api-access-kdt5x\") pod \"e011dacd-0632-4429-8ccc-5da8870c87f4\" (UID: \"e011dacd-0632-4429-8ccc-5da8870c87f4\") "
Apr 20 12:17:25.208676 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:17:25.208532 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e011dacd-0632-4429-8ccc-5da8870c87f4-console-serving-cert\") pod \"e011dacd-0632-4429-8ccc-5da8870c87f4\" (UID: \"e011dacd-0632-4429-8ccc-5da8870c87f4\") "
Apr 20 12:17:25.208676 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:17:25.208582 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e011dacd-0632-4429-8ccc-5da8870c87f4-oauth-serving-cert\") pod \"e011dacd-0632-4429-8ccc-5da8870c87f4\" (UID: \"e011dacd-0632-4429-8ccc-5da8870c87f4\") "
Apr 20 12:17:25.208676 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:17:25.208640 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e011dacd-0632-4429-8ccc-5da8870c87f4-console-oauth-config\") pod \"e011dacd-0632-4429-8ccc-5da8870c87f4\" (UID: \"e011dacd-0632-4429-8ccc-5da8870c87f4\") "
Apr 20 12:17:25.208882 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:17:25.208680 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e011dacd-0632-4429-8ccc-5da8870c87f4-console-config\") pod \"e011dacd-0632-4429-8ccc-5da8870c87f4\" (UID: \"e011dacd-0632-4429-8ccc-5da8870c87f4\") "
Apr 20 12:17:25.208882 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:17:25.208837 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e011dacd-0632-4429-8ccc-5da8870c87f4-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e011dacd-0632-4429-8ccc-5da8870c87f4" (UID: "e011dacd-0632-4429-8ccc-5da8870c87f4"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 12:17:25.208882 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:17:25.208862 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e011dacd-0632-4429-8ccc-5da8870c87f4-service-ca" (OuterVolumeSpecName: "service-ca") pod "e011dacd-0632-4429-8ccc-5da8870c87f4" (UID: "e011dacd-0632-4429-8ccc-5da8870c87f4"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 12:17:25.209122 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:17:25.209093 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e011dacd-0632-4429-8ccc-5da8870c87f4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e011dacd-0632-4429-8ccc-5da8870c87f4" (UID: "e011dacd-0632-4429-8ccc-5da8870c87f4"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 12:17:25.209189 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:17:25.209103 2580 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e011dacd-0632-4429-8ccc-5da8870c87f4-trusted-ca-bundle\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\""
Apr 20 12:17:25.209189 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:17:25.209149 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e011dacd-0632-4429-8ccc-5da8870c87f4-console-config" (OuterVolumeSpecName: "console-config") pod "e011dacd-0632-4429-8ccc-5da8870c87f4" (UID: "e011dacd-0632-4429-8ccc-5da8870c87f4"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 12:17:25.209189 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:17:25.209154 2580 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e011dacd-0632-4429-8ccc-5da8870c87f4-service-ca\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\""
Apr 20 12:17:25.210882 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:17:25.210852 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e011dacd-0632-4429-8ccc-5da8870c87f4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e011dacd-0632-4429-8ccc-5da8870c87f4" (UID: "e011dacd-0632-4429-8ccc-5da8870c87f4"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 12:17:25.210997 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:17:25.210876 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e011dacd-0632-4429-8ccc-5da8870c87f4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e011dacd-0632-4429-8ccc-5da8870c87f4" (UID: "e011dacd-0632-4429-8ccc-5da8870c87f4"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 12:17:25.210997 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:17:25.210959 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e011dacd-0632-4429-8ccc-5da8870c87f4-kube-api-access-kdt5x" (OuterVolumeSpecName: "kube-api-access-kdt5x") pod "e011dacd-0632-4429-8ccc-5da8870c87f4" (UID: "e011dacd-0632-4429-8ccc-5da8870c87f4"). InnerVolumeSpecName "kube-api-access-kdt5x". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 12:17:25.310642 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:17:25.310591 2580 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e011dacd-0632-4429-8ccc-5da8870c87f4-console-serving-cert\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\""
Apr 20 12:17:25.310642 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:17:25.310635 2580 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e011dacd-0632-4429-8ccc-5da8870c87f4-oauth-serving-cert\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\""
Apr 20 12:17:25.310642 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:17:25.310646 2580 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e011dacd-0632-4429-8ccc-5da8870c87f4-console-oauth-config\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\""
Apr 20 12:17:25.310642 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:17:25.310655 2580 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e011dacd-0632-4429-8ccc-5da8870c87f4-console-config\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\""
Apr 20 12:17:25.310642 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:17:25.310664 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kdt5x\" (UniqueName: \"kubernetes.io/projected/e011dacd-0632-4429-8ccc-5da8870c87f4-kube-api-access-kdt5x\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\""
Apr 20 12:17:25.656201 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:17:25.656174 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-865b486975-pvr2h_e011dacd-0632-4429-8ccc-5da8870c87f4/console/0.log"
Apr 20 12:17:25.656369 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:17:25.656220 2580 generic.go:358] "Generic (PLEG): container finished" podID="e011dacd-0632-4429-8ccc-5da8870c87f4" containerID="bbd06af96ee89f4ea09c18ef8f4994a36200485bc2b3fc5c1633fe2a6a2daea4" exitCode=2
Apr 20 12:17:25.656369 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:17:25.656287 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-865b486975-pvr2h" event={"ID":"e011dacd-0632-4429-8ccc-5da8870c87f4","Type":"ContainerDied","Data":"bbd06af96ee89f4ea09c18ef8f4994a36200485bc2b3fc5c1633fe2a6a2daea4"}
Apr 20 12:17:25.656369 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:17:25.656315 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-865b486975-pvr2h" event={"ID":"e011dacd-0632-4429-8ccc-5da8870c87f4","Type":"ContainerDied","Data":"abb2a051b5af9572a2f32b9aaf8cd8f73ba0aea14e13b362752c0c5c520b4bb9"}
Apr 20 12:17:25.656369 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:17:25.656331 2580 scope.go:117] "RemoveContainer" containerID="bbd06af96ee89f4ea09c18ef8f4994a36200485bc2b3fc5c1633fe2a6a2daea4"
Apr 20 12:17:25.656369 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:17:25.656295 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-865b486975-pvr2h"
Apr 20 12:17:25.664690 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:17:25.664663 2580 scope.go:117] "RemoveContainer" containerID="bbd06af96ee89f4ea09c18ef8f4994a36200485bc2b3fc5c1633fe2a6a2daea4"
Apr 20 12:17:25.664990 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:17:25.664957 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbd06af96ee89f4ea09c18ef8f4994a36200485bc2b3fc5c1633fe2a6a2daea4\": container with ID starting with bbd06af96ee89f4ea09c18ef8f4994a36200485bc2b3fc5c1633fe2a6a2daea4 not found: ID does not exist" containerID="bbd06af96ee89f4ea09c18ef8f4994a36200485bc2b3fc5c1633fe2a6a2daea4"
Apr 20 12:17:25.665065 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:17:25.665004 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbd06af96ee89f4ea09c18ef8f4994a36200485bc2b3fc5c1633fe2a6a2daea4"} err="failed to get container status \"bbd06af96ee89f4ea09c18ef8f4994a36200485bc2b3fc5c1633fe2a6a2daea4\": rpc error: code = NotFound desc = could not find container \"bbd06af96ee89f4ea09c18ef8f4994a36200485bc2b3fc5c1633fe2a6a2daea4\": container with ID starting with bbd06af96ee89f4ea09c18ef8f4994a36200485bc2b3fc5c1633fe2a6a2daea4 not found: ID does not exist"
Apr 20 12:17:25.676593 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:17:25.676563 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-865b486975-pvr2h"]
Apr 20 12:17:25.685348 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:17:25.685319 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-865b486975-pvr2h"]
Apr 20 12:17:26.846609 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:17:26.846580 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e011dacd-0632-4429-8ccc-5da8870c87f4" path="/var/lib/kubelet/pods/e011dacd-0632-4429-8ccc-5da8870c87f4/volumes"
Apr 20 12:17:46.903591 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:17:46.903551 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 12:17:46.919350 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:17:46.919320 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 12:17:47.738695 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:17:47.738659 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 12:18:48.294532 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:18:48.294451 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-9zp5f"]
Apr 20 12:18:48.294934 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:18:48.294768 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e011dacd-0632-4429-8ccc-5da8870c87f4" containerName="console"
Apr 20 12:18:48.294934 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:18:48.294780 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="e011dacd-0632-4429-8ccc-5da8870c87f4" containerName="console"
Apr 20 12:18:48.294934 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:18:48.294864 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="e011dacd-0632-4429-8ccc-5da8870c87f4" containerName="console"
Apr 20 12:18:48.297546 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:18:48.297529 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9zp5f"
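Between 12:16:59 and 12:17:26 the console pod goes through the full teardown path: SyncLoop DELETE, Killing container (gracePeriod=15), ContainerDied with exitCode=2, UnmountVolume and TearDown per volume, Volume detached, RemoveContainer, SyncLoop REMOVE, and finally the orphaned-volumes cleanup. The NotFound errors right after RemoveContainer are the kubelet re-querying a container it has already deleted, not a real failure. A small helper, assuming the one-entry-per-line layout used here, to replay every event for one container ID:

    import sys

    def container_events(path, container_id):
        """Print every journal entry mentioning a container ID, in order."""
        with open(path) as f:
            for lineno, line in enumerate(f, 1):
                if container_id in line:  # substring match, so a prefix works
                    # Drop the journald prefix, keep the klog message.
                    msg = line.split("]: ", 1)[-1].rstrip()
                    print(f"{lineno:6} {msg[:140]}")

    if __name__ == "__main__":
        # e.g.: python trace.py kubelet.log bbd06af96ee8
        container_events(sys.argv[1], sys.argv[2])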
Apr 20 12:18:48.299761 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:18:48.299736 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 20 12:18:48.304965 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:18:48.304798 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9zp5f"]
Apr 20 12:18:48.366046 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:18:48.366009 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/41d4bf7c-87b7-4e06-b676-27b36ca55dd9-kubelet-config\") pod \"global-pull-secret-syncer-9zp5f\" (UID: \"41d4bf7c-87b7-4e06-b676-27b36ca55dd9\") " pod="kube-system/global-pull-secret-syncer-9zp5f"
Apr 20 12:18:48.366247 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:18:48.366057 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/41d4bf7c-87b7-4e06-b676-27b36ca55dd9-dbus\") pod \"global-pull-secret-syncer-9zp5f\" (UID: \"41d4bf7c-87b7-4e06-b676-27b36ca55dd9\") " pod="kube-system/global-pull-secret-syncer-9zp5f"
Apr 20 12:18:48.366247 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:18:48.366076 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/41d4bf7c-87b7-4e06-b676-27b36ca55dd9-original-pull-secret\") pod \"global-pull-secret-syncer-9zp5f\" (UID: \"41d4bf7c-87b7-4e06-b676-27b36ca55dd9\") " pod="kube-system/global-pull-secret-syncer-9zp5f"
Apr 20 12:18:48.466928 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:18:48.466884 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/41d4bf7c-87b7-4e06-b676-27b36ca55dd9-kubelet-config\") pod \"global-pull-secret-syncer-9zp5f\" (UID: \"41d4bf7c-87b7-4e06-b676-27b36ca55dd9\") " pod="kube-system/global-pull-secret-syncer-9zp5f"
Apr 20 12:18:48.467110 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:18:48.466937 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/41d4bf7c-87b7-4e06-b676-27b36ca55dd9-dbus\") pod \"global-pull-secret-syncer-9zp5f\" (UID: \"41d4bf7c-87b7-4e06-b676-27b36ca55dd9\") " pod="kube-system/global-pull-secret-syncer-9zp5f"
Apr 20 12:18:48.467110 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:18:48.466970 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/41d4bf7c-87b7-4e06-b676-27b36ca55dd9-original-pull-secret\") pod \"global-pull-secret-syncer-9zp5f\" (UID: \"41d4bf7c-87b7-4e06-b676-27b36ca55dd9\") " pod="kube-system/global-pull-secret-syncer-9zp5f"
Apr 20 12:18:48.467110 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:18:48.467022 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/41d4bf7c-87b7-4e06-b676-27b36ca55dd9-kubelet-config\") pod \"global-pull-secret-syncer-9zp5f\" (UID: \"41d4bf7c-87b7-4e06-b676-27b36ca55dd9\") " pod="kube-system/global-pull-secret-syncer-9zp5f"
Apr 20 12:18:48.467110 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:18:48.467082 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/41d4bf7c-87b7-4e06-b676-27b36ca55dd9-dbus\") pod \"global-pull-secret-syncer-9zp5f\" (UID: \"41d4bf7c-87b7-4e06-b676-27b36ca55dd9\") " pod="kube-system/global-pull-secret-syncer-9zp5f"
Apr 20 12:18:48.469263 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:18:48.469243 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/41d4bf7c-87b7-4e06-b676-27b36ca55dd9-original-pull-secret\") pod \"global-pull-secret-syncer-9zp5f\" (UID: \"41d4bf7c-87b7-4e06-b676-27b36ca55dd9\") " pod="kube-system/global-pull-secret-syncer-9zp5f"
Apr 20 12:18:48.607587 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:18:48.607479 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9zp5f"
Apr 20 12:18:48.730133 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:18:48.730098 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9zp5f"]
Apr 20 12:18:48.733563 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:18:48.733530 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41d4bf7c_87b7_4e06_b676_27b36ca55dd9.slice/crio-43134cd3435363fdb10868879bef9eabb9c2f0750112f309fbbabea6232dc19a WatchSource:0}: Error finding container 43134cd3435363fdb10868879bef9eabb9c2f0750112f309fbbabea6232dc19a: Status 404 returned error can't find the container with id 43134cd3435363fdb10868879bef9eabb9c2f0750112f309fbbabea6232dc19a
Apr 20 12:18:48.893308 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:18:48.893220 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9zp5f" event={"ID":"41d4bf7c-87b7-4e06-b676-27b36ca55dd9","Type":"ContainerStarted","Data":"43134cd3435363fdb10868879bef9eabb9c2f0750112f309fbbabea6232dc19a"}
Apr 20 12:18:52.906154 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:18:52.906114 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9zp5f" event={"ID":"41d4bf7c-87b7-4e06-b676-27b36ca55dd9","Type":"ContainerStarted","Data":"cc1aefd2423e6e96a7483f43860c902484c9b717d8066701ffe430e138687fc3"}
Apr 20 12:18:52.921776 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:18:52.921713 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-9zp5f" podStartSLOduration=1.158377629 podStartE2EDuration="4.921696758s" podCreationTimestamp="2026-04-20 12:18:48 +0000 UTC" firstStartedPulling="2026-04-20 12:18:48.735500534 +0000 UTC m=+262.455514503" lastFinishedPulling="2026-04-20 12:18:52.498819646 +0000 UTC m=+266.218833632" observedRunningTime="2026-04-20 12:18:52.919742189 +0000 UTC m=+266.639756194" watchObservedRunningTime="2026-04-20 12:18:52.921696758 +0000 UTC m=+266.641710748"
Apr 20 12:19:26.717575 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:19:26.717546 2580 kubelet.go:1628] "Image garbage collection succeeded"
Apr 20 12:20:11.119938 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:11.119898 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-w9krv"]
Apr 20 12:20:11.123341 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:11.123322 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-w9krv"
Apr 20 12:20:11.125741 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:11.125703 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 20 12:20:11.125915 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:11.125770 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 20 12:20:11.125915 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:11.125807 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-5shsq\""
Apr 20 12:20:11.129259 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:11.129230 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-w9krv"]
Apr 20 12:20:11.279471 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:11.279437 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/353333db-f87d-4a03-99c3-157814a035e6-bound-sa-token\") pod \"cert-manager-759f64656b-w9krv\" (UID: \"353333db-f87d-4a03-99c3-157814a035e6\") " pod="cert-manager/cert-manager-759f64656b-w9krv"
Apr 20 12:20:11.279644 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:11.279553 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6kgh\" (UniqueName: \"kubernetes.io/projected/353333db-f87d-4a03-99c3-157814a035e6-kube-api-access-h6kgh\") pod \"cert-manager-759f64656b-w9krv\" (UID: \"353333db-f87d-4a03-99c3-157814a035e6\") " pod="cert-manager/cert-manager-759f64656b-w9krv"
Apr 20 12:20:11.380856 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:11.380749 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h6kgh\" (UniqueName: \"kubernetes.io/projected/353333db-f87d-4a03-99c3-157814a035e6-kube-api-access-h6kgh\") pod \"cert-manager-759f64656b-w9krv\" (UID: \"353333db-f87d-4a03-99c3-157814a035e6\") " pod="cert-manager/cert-manager-759f64656b-w9krv"
Apr 20 12:20:11.381027 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:11.380898 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/353333db-f87d-4a03-99c3-157814a035e6-bound-sa-token\") pod \"cert-manager-759f64656b-w9krv\" (UID: \"353333db-f87d-4a03-99c3-157814a035e6\") " pod="cert-manager/cert-manager-759f64656b-w9krv"
Apr 20 12:20:11.388295 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:11.388269 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/353333db-f87d-4a03-99c3-157814a035e6-bound-sa-token\") pod \"cert-manager-759f64656b-w9krv\" (UID: \"353333db-f87d-4a03-99c3-157814a035e6\") " pod="cert-manager/cert-manager-759f64656b-w9krv"
Apr 20 12:20:11.388495 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:11.388476 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6kgh\" (UniqueName: \"kubernetes.io/projected/353333db-f87d-4a03-99c3-157814a035e6-kube-api-access-h6kgh\") pod \"cert-manager-759f64656b-w9krv\" (UID: \"353333db-f87d-4a03-99c3-157814a035e6\") " pod="cert-manager/cert-manager-759f64656b-w9krv"
Apr 20 12:20:11.444053 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:11.444010 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-w9krv"
Apr 20 12:20:11.564213 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:11.564180 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-w9krv"]
Apr 20 12:20:11.567172 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:20:11.567141 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod353333db_f87d_4a03_99c3_157814a035e6.slice/crio-2f632e1a5f9e67a6c429cbdae5c6e5aa59d7c80a1f25180a65ecc5229ac58557 WatchSource:0}: Error finding container 2f632e1a5f9e67a6c429cbdae5c6e5aa59d7c80a1f25180a65ecc5229ac58557: Status 404 returned error can't find the container with id 2f632e1a5f9e67a6c429cbdae5c6e5aa59d7c80a1f25180a65ecc5229ac58557
Apr 20 12:20:11.568978 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:11.568963 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 12:20:12.144468 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:12.144426 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-w9krv" event={"ID":"353333db-f87d-4a03-99c3-157814a035e6","Type":"ContainerStarted","Data":"2f632e1a5f9e67a6c429cbdae5c6e5aa59d7c80a1f25180a65ecc5229ac58557"}
Apr 20 12:20:15.156207 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:15.156108 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-w9krv" event={"ID":"353333db-f87d-4a03-99c3-157814a035e6","Type":"ContainerStarted","Data":"848fad4e09345103028d24b790d9c1319f43005338cec481da93d50c4c0d046e"}
Apr 20 12:20:15.173128 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:15.173064 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-w9krv" podStartSLOduration=0.898391498 podStartE2EDuration="4.173046664s" podCreationTimestamp="2026-04-20 12:20:11 +0000 UTC" firstStartedPulling="2026-04-20 12:20:11.569089257 +0000 UTC m=+345.289103226" lastFinishedPulling="2026-04-20 12:20:14.843744421 +0000 UTC m=+348.563758392" observedRunningTime="2026-04-20 12:20:15.1715954 +0000 UTC m=+348.891609391" watchObservedRunningTime="2026-04-20 12:20:15.173046664 +0000 UTC m=+348.893060654"
Apr 20 12:20:27.292010 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:27.291970 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-5hd7q"]
Apr 20 12:20:27.295497 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:27.295478 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-operator-747c5859c7-5hd7q"
Apr 20 12:20:27.297835 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:27.297811 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"openshift-service-ca.crt\""
Apr 20 12:20:27.298610 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:27.298591 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"kube-root-ca.crt\""
Apr 20 12:20:27.298706 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:27.298654 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"jobset-operator-dockercfg-qzx5c\""
Apr 20 12:20:27.304919 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:27.304897 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-5hd7q"]
Apr 20 12:20:27.319870 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:27.319839 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f860921a-209f-4315-abc7-2f35ba7024eb-tmp\") pod \"jobset-operator-747c5859c7-5hd7q\" (UID: \"f860921a-209f-4315-abc7-2f35ba7024eb\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-5hd7q"
Apr 20 12:20:27.319980 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:27.319878 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkx8l\" (UniqueName: \"kubernetes.io/projected/f860921a-209f-4315-abc7-2f35ba7024eb-kube-api-access-zkx8l\") pod \"jobset-operator-747c5859c7-5hd7q\" (UID: \"f860921a-209f-4315-abc7-2f35ba7024eb\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-5hd7q"
Apr 20 12:20:27.421159 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:27.421121 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f860921a-209f-4315-abc7-2f35ba7024eb-tmp\") pod \"jobset-operator-747c5859c7-5hd7q\" (UID: \"f860921a-209f-4315-abc7-2f35ba7024eb\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-5hd7q"
Apr 20 12:20:27.421159 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:27.421164 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zkx8l\" (UniqueName: \"kubernetes.io/projected/f860921a-209f-4315-abc7-2f35ba7024eb-kube-api-access-zkx8l\") pod \"jobset-operator-747c5859c7-5hd7q\" (UID: \"f860921a-209f-4315-abc7-2f35ba7024eb\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-5hd7q"
Apr 20 12:20:27.421619 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:27.421600 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f860921a-209f-4315-abc7-2f35ba7024eb-tmp\") pod \"jobset-operator-747c5859c7-5hd7q\" (UID: \"f860921a-209f-4315-abc7-2f35ba7024eb\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-5hd7q"
Apr 20 12:20:27.429214 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:27.429190 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkx8l\" (UniqueName: \"kubernetes.io/projected/f860921a-209f-4315-abc7-2f35ba7024eb-kube-api-access-zkx8l\") pod \"jobset-operator-747c5859c7-5hd7q\" (UID: \"f860921a-209f-4315-abc7-2f35ba7024eb\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-5hd7q"
Apr 20 12:20:27.605169 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:27.605054 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-operator-747c5859c7-5hd7q"
Apr 20 12:20:27.733084 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:27.733048 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-5hd7q"]
Apr 20 12:20:27.736184 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:20:27.736155 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf860921a_209f_4315_abc7_2f35ba7024eb.slice/crio-4afe9de8bc829b1f63ca24caf281f1108fbdaee53340ec70fe358dead39c185a WatchSource:0}: Error finding container 4afe9de8bc829b1f63ca24caf281f1108fbdaee53340ec70fe358dead39c185a: Status 404 returned error can't find the container with id 4afe9de8bc829b1f63ca24caf281f1108fbdaee53340ec70fe358dead39c185a
Apr 20 12:20:28.195712 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:28.195676 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-operator-747c5859c7-5hd7q" event={"ID":"f860921a-209f-4315-abc7-2f35ba7024eb","Type":"ContainerStarted","Data":"4afe9de8bc829b1f63ca24caf281f1108fbdaee53340ec70fe358dead39c185a"}
Apr 20 12:20:31.206747 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:31.206643 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-operator-747c5859c7-5hd7q" event={"ID":"f860921a-209f-4315-abc7-2f35ba7024eb","Type":"ContainerStarted","Data":"e00ba8ad7cad4aefc1112c2d95e0752a29f755054299ddf8dd84f6a3bfa3b827"}
Apr 20 12:20:31.222929 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:31.222875 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-jobset-operator/jobset-operator-747c5859c7-5hd7q" podStartSLOduration=1.086972533 podStartE2EDuration="4.222861311s" podCreationTimestamp="2026-04-20 12:20:27 +0000 UTC" firstStartedPulling="2026-04-20 12:20:27.737860664 +0000 UTC m=+361.457874632" lastFinishedPulling="2026-04-20 12:20:30.873749442 +0000 UTC m=+364.593763410" observedRunningTime="2026-04-20 12:20:31.221210414 +0000 UTC m=+364.941224398" watchObservedRunningTime="2026-04-20 12:20:31.222861311 +0000 UTC m=+364.942875301"
Apr 20 12:20:56.493691 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:56.493655 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-mq294"]
Apr 20 12:20:56.502711 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:56.502683 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-mq294"
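Each pod start in this section follows the same shape: SyncLoop ADD, "No sandbox for pod can be found", reflector caches populated, the volume verify/mount/setup phases, sandbox creation (with the benign watch-event 404), ContainerStarted events, then the startup-duration entry. The wall-clock span from ADD to that last entry can be read off the journald timestamps, about 3.9s for jobset-operator above (12:20:27.292 to 12:20:31.222); a sketch that computes it for every pod, assuming one entry per line:

    import re
    from datetime import datetime

    ADD = re.compile(r'"SyncLoop ADD" source="api" pods=\["([^"]+)"\]')
    DONE = re.compile(r'"Observed pod startup duration" pod="([^"]+)"')
    TS = re.compile(r"^Apr 20 (\d{2}:\d{2}:\d{2}\.\d{6})")

    def add_to_observed(path):
        """Wall-clock seconds from SyncLoop ADD to the startup-duration line."""
        added, result = {}, {}
        with open(path) as f:
            for line in f:
                ts = TS.match(line)
                if not ts:
                    continue  # truncated/continuation lines carry no timestamp
                t = datetime.strptime(ts.group(1), "%H:%M:%S.%f")
                if (m := ADD.search(line)):
                    added[m.group(1)] = t
                elif (m := DONE.search(line)) and m.group(1) in added:
                    result[m.group(1)] = (t - added[m.group(1)]).total_seconds()
        return result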
Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-mq294" Apr 20 12:20:56.505145 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:56.505117 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-webhook-cert\"" Apr 20 12:20:56.505362 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:56.505344 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-controller-manager-dockercfg-vrr5s\"" Apr 20 12:20:56.505536 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:56.505522 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kubeflow-trainer-config\"" Apr 20 12:20:56.506051 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:56.506028 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 20 12:20:56.506144 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:56.506058 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 20 12:20:56.506144 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:56.506099 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-mq294"] Apr 20 12:20:56.570130 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:56.570095 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9311bed3-a12b-4280-8dd9-4d1b12d716e8-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-mq294\" (UID: \"9311bed3-a12b-4280-8dd9-4d1b12d716e8\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-mq294" Apr 20 12:20:56.570322 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:56.570162 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbczc\" (UniqueName: \"kubernetes.io/projected/9311bed3-a12b-4280-8dd9-4d1b12d716e8-kube-api-access-fbczc\") pod \"kubeflow-trainer-controller-manager-55f5694779-mq294\" (UID: \"9311bed3-a12b-4280-8dd9-4d1b12d716e8\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-mq294" Apr 20 12:20:56.570322 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:56.570193 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/9311bed3-a12b-4280-8dd9-4d1b12d716e8-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-mq294\" (UID: \"9311bed3-a12b-4280-8dd9-4d1b12d716e8\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-mq294" Apr 20 12:20:56.671691 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:56.671647 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fbczc\" (UniqueName: \"kubernetes.io/projected/9311bed3-a12b-4280-8dd9-4d1b12d716e8-kube-api-access-fbczc\") pod \"kubeflow-trainer-controller-manager-55f5694779-mq294\" (UID: \"9311bed3-a12b-4280-8dd9-4d1b12d716e8\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-mq294" Apr 20 12:20:56.671882 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:56.671703 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubeflow-trainer-config\" (UniqueName: 
\"kubernetes.io/configmap/9311bed3-a12b-4280-8dd9-4d1b12d716e8-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-mq294\" (UID: \"9311bed3-a12b-4280-8dd9-4d1b12d716e8\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-mq294" Apr 20 12:20:56.671882 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:56.671772 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9311bed3-a12b-4280-8dd9-4d1b12d716e8-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-mq294\" (UID: \"9311bed3-a12b-4280-8dd9-4d1b12d716e8\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-mq294" Apr 20 12:20:56.672883 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:56.672854 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/9311bed3-a12b-4280-8dd9-4d1b12d716e8-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-mq294\" (UID: \"9311bed3-a12b-4280-8dd9-4d1b12d716e8\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-mq294" Apr 20 12:20:56.674164 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:56.674141 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9311bed3-a12b-4280-8dd9-4d1b12d716e8-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-mq294\" (UID: \"9311bed3-a12b-4280-8dd9-4d1b12d716e8\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-mq294" Apr 20 12:20:56.680661 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:56.680637 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbczc\" (UniqueName: \"kubernetes.io/projected/9311bed3-a12b-4280-8dd9-4d1b12d716e8-kube-api-access-fbczc\") pod \"kubeflow-trainer-controller-manager-55f5694779-mq294\" (UID: \"9311bed3-a12b-4280-8dd9-4d1b12d716e8\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-mq294" Apr 20 12:20:56.814093 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:56.814055 2580 util.go:30] "No sandbox for pod can be found. 
Apr 20 12:20:56.939716 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:56.939675 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-mq294"]
Apr 20 12:20:56.942244 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:20:56.942214 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9311bed3_a12b_4280_8dd9_4d1b12d716e8.slice/crio-e93dab30f4970fdf28212cba44f3028ee76420c72b49556c39166306cd967121 WatchSource:0}: Error finding container e93dab30f4970fdf28212cba44f3028ee76420c72b49556c39166306cd967121: Status 404 returned error can't find the container with id e93dab30f4970fdf28212cba44f3028ee76420c72b49556c39166306cd967121
Apr 20 12:20:57.283743 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:57.283701 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-mq294" event={"ID":"9311bed3-a12b-4280-8dd9-4d1b12d716e8","Type":"ContainerStarted","Data":"e93dab30f4970fdf28212cba44f3028ee76420c72b49556c39166306cd967121"}
Apr 20 12:20:59.291382 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:59.291342 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-mq294" event={"ID":"9311bed3-a12b-4280-8dd9-4d1b12d716e8","Type":"ContainerStarted","Data":"ddae2334da0c5d804d34722d95f30b7b00735f3b0f3c0ffb11c537d265037e4a"}
Apr 20 12:20:59.291795 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:59.291482 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-mq294"
Apr 20 12:20:59.309292 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:20:59.309228 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-mq294" podStartSLOduration=1.095750654 podStartE2EDuration="3.309205017s" podCreationTimestamp="2026-04-20 12:20:56 +0000 UTC" firstStartedPulling="2026-04-20 12:20:56.944055946 +0000 UTC m=+390.664069915" lastFinishedPulling="2026-04-20 12:20:59.157510293 +0000 UTC m=+392.877524278" observedRunningTime="2026-04-20 12:20:59.307385767 +0000 UTC m=+393.027399758" watchObservedRunningTime="2026-04-20 12:20:59.309205017 +0000 UTC m=+393.029219009"
Apr 20 12:21:15.300560 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:21:15.300525 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-mq294"
Apr 20 12:22:52.170475 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:22:52.170433 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-s576s/test-trainjob-p9rjp-node-0-0-5w6zv"]
Apr 20 12:22:52.172653 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:22:52.172636 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-s576s/test-trainjob-p9rjp-node-0-0-5w6zv"
Apr 20 12:22:52.174835 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:22:52.174803 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-s576s\"/\"openshift-service-ca.crt\""
Apr 20 12:22:52.175594 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:22:52.175575 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-s576s\"/\"default-dockercfg-z4vrn\""
Apr 20 12:22:52.175716 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:22:52.175602 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-s576s\"/\"kube-root-ca.crt\""
Apr 20 12:22:52.182191 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:22:52.182163 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-s576s/test-trainjob-p9rjp-node-0-0-5w6zv"]
Apr 20 12:22:52.298557 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:22:52.298511 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwb5b\" (UniqueName: \"kubernetes.io/projected/a5e47330-30ad-4ea6-a652-f6fabbbd4995-kube-api-access-gwb5b\") pod \"test-trainjob-p9rjp-node-0-0-5w6zv\" (UID: \"a5e47330-30ad-4ea6-a652-f6fabbbd4995\") " pod="test-ns-s576s/test-trainjob-p9rjp-node-0-0-5w6zv"
Apr 20 12:22:52.399645 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:22:52.399611 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwb5b\" (UniqueName: \"kubernetes.io/projected/a5e47330-30ad-4ea6-a652-f6fabbbd4995-kube-api-access-gwb5b\") pod \"test-trainjob-p9rjp-node-0-0-5w6zv\" (UID: \"a5e47330-30ad-4ea6-a652-f6fabbbd4995\") " pod="test-ns-s576s/test-trainjob-p9rjp-node-0-0-5w6zv"
Apr 20 12:22:52.407653 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:22:52.407622 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwb5b\" (UniqueName: \"kubernetes.io/projected/a5e47330-30ad-4ea6-a652-f6fabbbd4995-kube-api-access-gwb5b\") pod \"test-trainjob-p9rjp-node-0-0-5w6zv\" (UID: \"a5e47330-30ad-4ea6-a652-f6fabbbd4995\") " pod="test-ns-s576s/test-trainjob-p9rjp-node-0-0-5w6zv"
Apr 20 12:22:52.483595 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:22:52.483487 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-s576s/test-trainjob-p9rjp-node-0-0-5w6zv"
Apr 20 12:22:52.610902 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:22:52.610873 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-s576s/test-trainjob-p9rjp-node-0-0-5w6zv"]
Apr 20 12:22:52.613725 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:22:52.613678 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5e47330_30ad_4ea6_a652_f6fabbbd4995.slice/crio-4211680f21add2760fc8e7b772b41b534ed1e53ecb9644da06d43577390c90de WatchSource:0}: Error finding container 4211680f21add2760fc8e7b772b41b534ed1e53ecb9644da06d43577390c90de: Status 404 returned error can't find the container with id 4211680f21add2760fc8e7b772b41b534ed1e53ecb9644da06d43577390c90de
Apr 20 12:22:52.634414 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:22:52.634374 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-s576s/test-trainjob-p9rjp-node-0-0-5w6zv" event={"ID":"a5e47330-30ad-4ea6-a652-f6fabbbd4995","Type":"ContainerStarted","Data":"4211680f21add2760fc8e7b772b41b534ed1e53ecb9644da06d43577390c90de"}
Apr 20 12:27:00.501997 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:27:00.501949 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-s576s/test-trainjob-p9rjp-node-0-0-5w6zv" event={"ID":"a5e47330-30ad-4ea6-a652-f6fabbbd4995","Type":"ContainerStarted","Data":"ca47e80dc469c32bdd998d0d9ea09ba1df14f81acbf32c59fd31e871b157682b"}
Apr 20 12:27:00.527582 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:27:00.527515 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-s576s/test-trainjob-p9rjp-node-0-0-5w6zv" podStartSLOduration=1.631411881 podStartE2EDuration="4m8.52749482s" podCreationTimestamp="2026-04-20 12:22:52 +0000 UTC" firstStartedPulling="2026-04-20 12:22:52.615823766 +0000 UTC m=+506.335837735" lastFinishedPulling="2026-04-20 12:26:59.511906694 +0000 UTC m=+753.231920674" observedRunningTime="2026-04-20 12:27:00.526432919 +0000 UTC m=+754.246446910" watchObservedRunningTime="2026-04-20 12:27:00.52749482 +0000 UTC m=+754.247508811"
Apr 20 12:27:05.520500 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:27:05.520465 2580 generic.go:358] "Generic (PLEG): container finished" podID="a5e47330-30ad-4ea6-a652-f6fabbbd4995" containerID="ca47e80dc469c32bdd998d0d9ea09ba1df14f81acbf32c59fd31e871b157682b" exitCode=0
Apr 20 12:27:05.520975 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:27:05.520543 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-s576s/test-trainjob-p9rjp-node-0-0-5w6zv" event={"ID":"a5e47330-30ad-4ea6-a652-f6fabbbd4995","Type":"ContainerDied","Data":"ca47e80dc469c32bdd998d0d9ea09ba1df14f81acbf32c59fd31e871b157682b"}
Apr 20 12:27:06.872554 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:27:06.872528 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-s576s/test-trainjob-p9rjp-node-0-0-5w6zv"
Apr 20 12:27:06.979484 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:27:06.979443 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwb5b\" (UniqueName: \"kubernetes.io/projected/a5e47330-30ad-4ea6-a652-f6fabbbd4995-kube-api-access-gwb5b\") pod \"a5e47330-30ad-4ea6-a652-f6fabbbd4995\" (UID: \"a5e47330-30ad-4ea6-a652-f6fabbbd4995\") "
Apr 20 12:27:06.981708 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:27:06.981669 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5e47330-30ad-4ea6-a652-f6fabbbd4995-kube-api-access-gwb5b" (OuterVolumeSpecName: "kube-api-access-gwb5b") pod "a5e47330-30ad-4ea6-a652-f6fabbbd4995" (UID: "a5e47330-30ad-4ea6-a652-f6fabbbd4995"). InnerVolumeSpecName "kube-api-access-gwb5b". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 12:27:07.081217 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:27:07.081120 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gwb5b\" (UniqueName: \"kubernetes.io/projected/a5e47330-30ad-4ea6-a652-f6fabbbd4995-kube-api-access-gwb5b\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\""
Apr 20 12:27:07.528594 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:27:07.528553 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-s576s/test-trainjob-p9rjp-node-0-0-5w6zv" event={"ID":"a5e47330-30ad-4ea6-a652-f6fabbbd4995","Type":"ContainerDied","Data":"4211680f21add2760fc8e7b772b41b534ed1e53ecb9644da06d43577390c90de"}
Apr 20 12:27:07.528594 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:27:07.528571 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-s576s/test-trainjob-p9rjp-node-0-0-5w6zv"
Apr 20 12:27:07.528594 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:27:07.528588 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4211680f21add2760fc8e7b772b41b534ed1e53ecb9644da06d43577390c90de"
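The "Observed pod startup duration" records above are structured klog output, so the image-pull time for each pod can be recovered mechanically as lastFinishedPulling minus firstStartedPulling. A minimal Python sketch follows, assuming the journal above is saved as kubelet.log (the script and file name are illustrative, not part of the log):

    # pull_latency.py -- minimal sketch, assuming the journal is saved as kubelet.log.
    # Extracts per-pod image-pull time from "Observed pod startup duration" records:
    # pull time = lastFinishedPulling - firstStartedPulling.
    import re
    from datetime import datetime

    REC = re.compile(
        r'"Observed pod startup duration" pod="(?P<pod>[^"]+)".*?'
        r'firstStartedPulling="(?P<first>[^"]+)".*?lastFinishedPulling="(?P<last>[^"]+)"'
    )

    def parse_ts(s: str) -> datetime:
        # Timestamps look like "2026-04-20 12:20:27.737860664 +0000 UTC m=+361.457874632";
        # keep date and time (truncated to microseconds), drop the monotonic suffix.
        date, time, *_ = s.split()
        return datetime.strptime(f"{date} {time[:15]}", "%Y-%m-%d %H:%M:%S.%f")

    with open("kubelet.log") as f:
        for line in f:
            m = REC.search(line)
            if m:
                pull = parse_ts(m["last"]) - parse_ts(m["first"])
                print(f"{m['pod']}: pulled in {pull}")

Run over this excerpt, it would show the operator pods pulling in a few seconds while test-trainjob-p9rjp-node-0-0-5w6zv spent roughly four minutes pulling its image.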
Apr 20 12:27:08.221084 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:27:08.221046 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-c94cp/test-trainjob-vz8ww-node-0-0-8m848"]
Apr 20 12:27:08.221501 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:27:08.221376 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a5e47330-30ad-4ea6-a652-f6fabbbd4995" containerName="node"
Apr 20 12:27:08.221501 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:27:08.221386 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e47330-30ad-4ea6-a652-f6fabbbd4995" containerName="node"
Apr 20 12:27:08.221501 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:27:08.221453 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="a5e47330-30ad-4ea6-a652-f6fabbbd4995" containerName="node"
Apr 20 12:27:08.245319 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:27:08.245284 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-c94cp/test-trainjob-vz8ww-node-0-0-8m848"]
Apr 20 12:27:08.245485 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:27:08.245387 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-c94cp/test-trainjob-vz8ww-node-0-0-8m848"
Apr 20 12:27:08.247749 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:27:08.247692 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-c94cp\"/\"openshift-service-ca.crt\""
Apr 20 12:27:08.248617 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:27:08.248599 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-c94cp\"/\"default-dockercfg-mf42b\""
Apr 20 12:27:08.248713 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:27:08.248688 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-c94cp\"/\"kube-root-ca.crt\""
Apr 20 12:27:08.394238 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:27:08.394196 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwbbk\" (UniqueName: \"kubernetes.io/projected/c2fb40db-e8de-4ad6-b990-6224089ceb9b-kube-api-access-dwbbk\") pod \"test-trainjob-vz8ww-node-0-0-8m848\" (UID: \"c2fb40db-e8de-4ad6-b990-6224089ceb9b\") " pod="test-ns-c94cp/test-trainjob-vz8ww-node-0-0-8m848"
Apr 20 12:27:08.495233 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:27:08.495137 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dwbbk\" (UniqueName: \"kubernetes.io/projected/c2fb40db-e8de-4ad6-b990-6224089ceb9b-kube-api-access-dwbbk\") pod \"test-trainjob-vz8ww-node-0-0-8m848\" (UID: \"c2fb40db-e8de-4ad6-b990-6224089ceb9b\") " pod="test-ns-c94cp/test-trainjob-vz8ww-node-0-0-8m848"
Apr 20 12:27:08.503300 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:27:08.503269 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwbbk\" (UniqueName: \"kubernetes.io/projected/c2fb40db-e8de-4ad6-b990-6224089ceb9b-kube-api-access-dwbbk\") pod \"test-trainjob-vz8ww-node-0-0-8m848\" (UID: \"c2fb40db-e8de-4ad6-b990-6224089ceb9b\") " pod="test-ns-c94cp/test-trainjob-vz8ww-node-0-0-8m848"
Apr 20 12:27:08.554176 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:27:08.554138 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-c94cp/test-trainjob-vz8ww-node-0-0-8m848"
Apr 20 12:27:08.746868 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:27:08.746789 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-c94cp/test-trainjob-vz8ww-node-0-0-8m848"]
Apr 20 12:27:08.750723 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:27:08.750694 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2fb40db_e8de_4ad6_b990_6224089ceb9b.slice/crio-d9f06668e9d2e56543a6ece3e8693765331aad1b38f46748b410a4714aae626f WatchSource:0}: Error finding container d9f06668e9d2e56543a6ece3e8693765331aad1b38f46748b410a4714aae626f: Status 404 returned error can't find the container with id d9f06668e9d2e56543a6ece3e8693765331aad1b38f46748b410a4714aae626f
Apr 20 12:27:08.752499 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:27:08.752480 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 12:27:09.536633 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:27:09.536595 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-c94cp/test-trainjob-vz8ww-node-0-0-8m848" event={"ID":"c2fb40db-e8de-4ad6-b990-6224089ceb9b","Type":"ContainerStarted","Data":"d9f06668e9d2e56543a6ece3e8693765331aad1b38f46748b410a4714aae626f"}
Apr 20 12:31:06.390043 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:31:06.390008 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-c94cp/test-trainjob-vz8ww-node-0-0-8m848" event={"ID":"c2fb40db-e8de-4ad6-b990-6224089ceb9b","Type":"ContainerStarted","Data":"fec3f7029edfe8ce76d761e38fb9f96e528825e7e9b05a62712cd7dfc9a26998"}
Apr 20 12:31:06.413334 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:31:06.413279 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-c94cp/test-trainjob-vz8ww-node-0-0-8m848" podStartSLOduration=1.469578424 podStartE2EDuration="3m58.413265548s" podCreationTimestamp="2026-04-20 12:27:08 +0000 UTC" firstStartedPulling="2026-04-20 12:27:08.752603305 +0000 UTC m=+762.472617274" lastFinishedPulling="2026-04-20 12:31:05.69629043 +0000 UTC m=+999.416304398" observedRunningTime="2026-04-20 12:31:06.412218494 +0000 UTC m=+1000.132232480" watchObservedRunningTime="2026-04-20 12:31:06.413265548 +0000 UTC m=+1000.133279537"
Apr 20 12:31:12.413131 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:31:12.413098 2580 generic.go:358] "Generic (PLEG): container finished" podID="c2fb40db-e8de-4ad6-b990-6224089ceb9b" containerID="fec3f7029edfe8ce76d761e38fb9f96e528825e7e9b05a62712cd7dfc9a26998" exitCode=0
Apr 20 12:31:12.413574 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:31:12.413179 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-c94cp/test-trainjob-vz8ww-node-0-0-8m848" event={"ID":"c2fb40db-e8de-4ad6-b990-6224089ceb9b","Type":"ContainerDied","Data":"fec3f7029edfe8ce76d761e38fb9f96e528825e7e9b05a62712cd7dfc9a26998"}
Apr 20 12:31:13.546211 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:31:13.546189 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-c94cp/test-trainjob-vz8ww-node-0-0-8m848"
Apr 20 12:31:13.646191 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:31:13.646156 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwbbk\" (UniqueName: \"kubernetes.io/projected/c2fb40db-e8de-4ad6-b990-6224089ceb9b-kube-api-access-dwbbk\") pod \"c2fb40db-e8de-4ad6-b990-6224089ceb9b\" (UID: \"c2fb40db-e8de-4ad6-b990-6224089ceb9b\") "
Apr 20 12:31:13.648340 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:31:13.648315 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2fb40db-e8de-4ad6-b990-6224089ceb9b-kube-api-access-dwbbk" (OuterVolumeSpecName: "kube-api-access-dwbbk") pod "c2fb40db-e8de-4ad6-b990-6224089ceb9b" (UID: "c2fb40db-e8de-4ad6-b990-6224089ceb9b"). InnerVolumeSpecName "kube-api-access-dwbbk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 12:31:13.747770 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:31:13.747679 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dwbbk\" (UniqueName: \"kubernetes.io/projected/c2fb40db-e8de-4ad6-b990-6224089ceb9b-kube-api-access-dwbbk\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\""
Apr 20 12:31:14.421517 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:31:14.421486 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-c94cp/test-trainjob-vz8ww-node-0-0-8m848"
Apr 20 12:31:14.421517 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:31:14.421502 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-c94cp/test-trainjob-vz8ww-node-0-0-8m848" event={"ID":"c2fb40db-e8de-4ad6-b990-6224089ceb9b","Type":"ContainerDied","Data":"d9f06668e9d2e56543a6ece3e8693765331aad1b38f46748b410a4714aae626f"}
Apr 20 12:31:14.421714 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:31:14.421528 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9f06668e9d2e56543a6ece3e8693765331aad1b38f46748b410a4714aae626f"
Apr 20 12:31:15.270022 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:31:15.269987 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-lbqtk/test-trainjob-f86l8-node-0-0-54wwp"]
Apr 20 12:31:15.270438 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:31:15.270310 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2fb40db-e8de-4ad6-b990-6224089ceb9b" containerName="node"
Apr 20 12:31:15.270438 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:31:15.270322 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2fb40db-e8de-4ad6-b990-6224089ceb9b" containerName="node"
Apr 20 12:31:15.270438 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:31:15.270385 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="c2fb40db-e8de-4ad6-b990-6224089ceb9b" containerName="node"
Apr 20 12:31:15.488622 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:31:15.488581 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-lbqtk/test-trainjob-f86l8-node-0-0-54wwp"]
Apr 20 12:31:15.488798 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:31:15.488728 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-lbqtk/test-trainjob-f86l8-node-0-0-54wwp"
Apr 20 12:31:15.491288 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:31:15.491257 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-lbqtk\"/\"openshift-service-ca.crt\""
Apr 20 12:31:15.491288 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:31:15.491276 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-lbqtk\"/\"kube-root-ca.crt\""
Apr 20 12:31:15.491288 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:31:15.491285 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-lbqtk\"/\"default-dockercfg-nbvpf\""
Apr 20 12:31:15.561888 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:31:15.561797 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz5bq\" (UniqueName: \"kubernetes.io/projected/5c6aecff-fa78-4d01-a7df-0618470dcde3-kube-api-access-pz5bq\") pod \"test-trainjob-f86l8-node-0-0-54wwp\" (UID: \"5c6aecff-fa78-4d01-a7df-0618470dcde3\") " pod="test-ns-lbqtk/test-trainjob-f86l8-node-0-0-54wwp"
Apr 20 12:31:15.662328 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:31:15.662286 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pz5bq\" (UniqueName: \"kubernetes.io/projected/5c6aecff-fa78-4d01-a7df-0618470dcde3-kube-api-access-pz5bq\") pod \"test-trainjob-f86l8-node-0-0-54wwp\" (UID: \"5c6aecff-fa78-4d01-a7df-0618470dcde3\") " pod="test-ns-lbqtk/test-trainjob-f86l8-node-0-0-54wwp"
Apr 20 12:31:15.669861 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:31:15.669831 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz5bq\" (UniqueName: \"kubernetes.io/projected/5c6aecff-fa78-4d01-a7df-0618470dcde3-kube-api-access-pz5bq\") pod \"test-trainjob-f86l8-node-0-0-54wwp\" (UID: \"5c6aecff-fa78-4d01-a7df-0618470dcde3\") " pod="test-ns-lbqtk/test-trainjob-f86l8-node-0-0-54wwp"
Apr 20 12:31:15.797978 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:31:15.797939 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-lbqtk/test-trainjob-f86l8-node-0-0-54wwp"
Apr 20 12:31:15.918432 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:31:15.918386 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-lbqtk/test-trainjob-f86l8-node-0-0-54wwp"]
Apr 20 12:31:15.920619 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:31:15.920587 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c6aecff_fa78_4d01_a7df_0618470dcde3.slice/crio-50c534cd28650b5d7e8da464316955b17b9fb1774680b44bc764dd5746bd0ff2 WatchSource:0}: Error finding container 50c534cd28650b5d7e8da464316955b17b9fb1774680b44bc764dd5746bd0ff2: Status 404 returned error can't find the container with id 50c534cd28650b5d7e8da464316955b17b9fb1774680b44bc764dd5746bd0ff2
Apr 20 12:31:16.428991 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:31:16.428956 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-lbqtk/test-trainjob-f86l8-node-0-0-54wwp" event={"ID":"5c6aecff-fa78-4d01-a7df-0618470dcde3","Type":"ContainerStarted","Data":"50c534cd28650b5d7e8da464316955b17b9fb1774680b44bc764dd5746bd0ff2"}
Apr 20 12:32:31.717289 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:32:31.717246 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-lbqtk/test-trainjob-f86l8-node-0-0-54wwp" event={"ID":"5c6aecff-fa78-4d01-a7df-0618470dcde3","Type":"ContainerStarted","Data":"e8468a48ce7e1fe09b97c3ae31e54022783b2c9a8f834d2241dabc2eb28c97b2"}
Apr 20 12:32:31.732315 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:32:31.732253 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-lbqtk/test-trainjob-f86l8-node-0-0-54wwp" podStartSLOduration=1.4331921890000001 podStartE2EDuration="1m16.732234614s" podCreationTimestamp="2026-04-20 12:31:15 +0000 UTC" firstStartedPulling="2026-04-20 12:31:15.922638534 +0000 UTC m=+1009.642652503" lastFinishedPulling="2026-04-20 12:32:31.221680957 +0000 UTC m=+1084.941694928" observedRunningTime="2026-04-20 12:32:31.73136368 +0000 UTC m=+1085.451377669" watchObservedRunningTime="2026-04-20 12:32:31.732234614 +0000 UTC m=+1085.452248607"
Apr 20 12:32:34.733595 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:32:34.733561 2580 generic.go:358] "Generic (PLEG): container finished" podID="5c6aecff-fa78-4d01-a7df-0618470dcde3" containerID="e8468a48ce7e1fe09b97c3ae31e54022783b2c9a8f834d2241dabc2eb28c97b2" exitCode=0
Apr 20 12:32:34.733979 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:32:34.733639 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-lbqtk/test-trainjob-f86l8-node-0-0-54wwp" event={"ID":"5c6aecff-fa78-4d01-a7df-0618470dcde3","Type":"ContainerDied","Data":"e8468a48ce7e1fe09b97c3ae31e54022783b2c9a8f834d2241dabc2eb28c97b2"}
Apr 20 12:32:35.860088 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:32:35.860063 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-lbqtk/test-trainjob-f86l8-node-0-0-54wwp"
Apr 20 12:32:36.000470 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:32:36.000360 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pz5bq\" (UniqueName: \"kubernetes.io/projected/5c6aecff-fa78-4d01-a7df-0618470dcde3-kube-api-access-pz5bq\") pod \"5c6aecff-fa78-4d01-a7df-0618470dcde3\" (UID: \"5c6aecff-fa78-4d01-a7df-0618470dcde3\") "
Apr 20 12:32:36.002519 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:32:36.002493 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c6aecff-fa78-4d01-a7df-0618470dcde3-kube-api-access-pz5bq" (OuterVolumeSpecName: "kube-api-access-pz5bq") pod "5c6aecff-fa78-4d01-a7df-0618470dcde3" (UID: "5c6aecff-fa78-4d01-a7df-0618470dcde3"). InnerVolumeSpecName "kube-api-access-pz5bq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 12:32:36.101664 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:32:36.101623 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pz5bq\" (UniqueName: \"kubernetes.io/projected/5c6aecff-fa78-4d01-a7df-0618470dcde3-kube-api-access-pz5bq\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\""
Apr 20 12:32:36.742303 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:32:36.742263 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-lbqtk/test-trainjob-f86l8-node-0-0-54wwp" event={"ID":"5c6aecff-fa78-4d01-a7df-0618470dcde3","Type":"ContainerDied","Data":"50c534cd28650b5d7e8da464316955b17b9fb1774680b44bc764dd5746bd0ff2"}
Apr 20 12:32:36.742303 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:32:36.742298 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-lbqtk/test-trainjob-f86l8-node-0-0-54wwp"
Apr 20 12:32:36.742303 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:32:36.742302 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50c534cd28650b5d7e8da464316955b17b9fb1774680b44bc764dd5746bd0ff2"
Apr 20 12:32:37.587692 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:32:37.587656 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-bfxtn/test-trainjob-bmd59-node-0-0-hx4bg"]
Apr 20 12:32:37.588135 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:32:37.587989 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c6aecff-fa78-4d01-a7df-0618470dcde3" containerName="node"
Apr 20 12:32:37.588135 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:32:37.587999 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c6aecff-fa78-4d01-a7df-0618470dcde3" containerName="node"
Apr 20 12:32:37.588135 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:32:37.588055 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="5c6aecff-fa78-4d01-a7df-0618470dcde3" containerName="node"
Apr 20 12:32:37.654531 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:32:37.654495 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-bfxtn/test-trainjob-bmd59-node-0-0-hx4bg"]
Apr 20 12:32:37.654723 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:32:37.654613 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-bfxtn/test-trainjob-bmd59-node-0-0-hx4bg"
Apr 20 12:32:37.656941 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:32:37.656916 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-bfxtn\"/\"kube-root-ca.crt\""
Apr 20 12:32:37.657092 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:32:37.656955 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-bfxtn\"/\"openshift-service-ca.crt\""
Apr 20 12:32:37.657092 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:32:37.657044 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-bfxtn\"/\"default-dockercfg-2nzcw\""
Apr 20 12:32:37.817841 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:32:37.817806 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfxpp\" (UniqueName: \"kubernetes.io/projected/63cbeafd-b105-4704-962a-693b6d801050-kube-api-access-bfxpp\") pod \"test-trainjob-bmd59-node-0-0-hx4bg\" (UID: \"63cbeafd-b105-4704-962a-693b6d801050\") " pod="test-ns-bfxtn/test-trainjob-bmd59-node-0-0-hx4bg"
Apr 20 12:32:37.918912 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:32:37.918822 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bfxpp\" (UniqueName: \"kubernetes.io/projected/63cbeafd-b105-4704-962a-693b6d801050-kube-api-access-bfxpp\") pod \"test-trainjob-bmd59-node-0-0-hx4bg\" (UID: \"63cbeafd-b105-4704-962a-693b6d801050\") " pod="test-ns-bfxtn/test-trainjob-bmd59-node-0-0-hx4bg"
Apr 20 12:32:37.926885 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:32:37.926861 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfxpp\" (UniqueName: \"kubernetes.io/projected/63cbeafd-b105-4704-962a-693b6d801050-kube-api-access-bfxpp\") pod \"test-trainjob-bmd59-node-0-0-hx4bg\" (UID: \"63cbeafd-b105-4704-962a-693b6d801050\") " pod="test-ns-bfxtn/test-trainjob-bmd59-node-0-0-hx4bg"
Apr 20 12:32:37.963714 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:32:37.963671 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-bfxtn/test-trainjob-bmd59-node-0-0-hx4bg"
Apr 20 12:32:38.152656 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:32:38.152635 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-bfxtn/test-trainjob-bmd59-node-0-0-hx4bg"]
Apr 20 12:32:38.154791 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:32:38.154757 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63cbeafd_b105_4704_962a_693b6d801050.slice/crio-aefd0493b0c111f206cc9a4dcc0ca65777f7ab10aebc2aa3181cb21ce83e1ddd WatchSource:0}: Error finding container aefd0493b0c111f206cc9a4dcc0ca65777f7ab10aebc2aa3181cb21ce83e1ddd: Status 404 returned error can't find the container with id aefd0493b0c111f206cc9a4dcc0ca65777f7ab10aebc2aa3181cb21ce83e1ddd
Apr 20 12:32:38.156699 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:32:38.156683 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 12:32:38.750949 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:32:38.750910 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-bfxtn/test-trainjob-bmd59-node-0-0-hx4bg" event={"ID":"63cbeafd-b105-4704-962a-693b6d801050","Type":"ContainerStarted","Data":"aefd0493b0c111f206cc9a4dcc0ca65777f7ab10aebc2aa3181cb21ce83e1ddd"}
Apr 20 12:38:57.087853 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:38:57.087801 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-bfxtn/test-trainjob-bmd59-node-0-0-hx4bg" event={"ID":"63cbeafd-b105-4704-962a-693b6d801050","Type":"ContainerStarted","Data":"9e4033aab7673c1ed542cb12881c0cceb06d7f3f6f5aaf951e7b5a0eef9f921b"}
Apr 20 12:38:57.090330 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:38:57.090307 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-bfxtn\"/\"default-dockercfg-2nzcw\""
Apr 20 12:38:57.111895 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:38:57.111849 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-bfxtn/test-trainjob-bmd59-node-0-0-hx4bg" podStartSLOduration=1.8866691709999999 podStartE2EDuration="6m20.111835879s" podCreationTimestamp="2026-04-20 12:32:37 +0000 UTC" firstStartedPulling="2026-04-20 12:32:38.156814551 +0000 UTC m=+1091.876828519" lastFinishedPulling="2026-04-20 12:38:56.381981255 +0000 UTC m=+1470.101995227" observedRunningTime="2026-04-20 12:38:57.1103879 +0000 UTC m=+1470.830401890" watchObservedRunningTime="2026-04-20 12:38:57.111835879 +0000 UTC m=+1470.831849869"
Apr 20 12:38:57.187365 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:38:57.187331 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-bfxtn\"/\"kube-root-ca.crt\""
Apr 20 12:38:57.197207 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:38:57.197191 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-bfxtn\"/\"openshift-service-ca.crt\""
Apr 20 12:39:00.828484 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:39:00.828456 2580 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63cbeafd_b105_4704_962a_693b6d801050.slice/crio-conmon-9e4033aab7673c1ed542cb12881c0cceb06d7f3f6f5aaf951e7b5a0eef9f921b.scope\": RecentStats: unable to find data in memory cache]"
Apr 20 12:39:01.104846 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:39:01.104779 2580 generic.go:358] "Generic (PLEG): container finished" podID="63cbeafd-b105-4704-962a-693b6d801050" containerID="9e4033aab7673c1ed542cb12881c0cceb06d7f3f6f5aaf951e7b5a0eef9f921b" exitCode=0
Apr 20 12:39:01.104969 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:39:01.104852 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-bfxtn/test-trainjob-bmd59-node-0-0-hx4bg" event={"ID":"63cbeafd-b105-4704-962a-693b6d801050","Type":"ContainerDied","Data":"9e4033aab7673c1ed542cb12881c0cceb06d7f3f6f5aaf951e7b5a0eef9f921b"}
Apr 20 12:39:02.262215 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:39:02.262197 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-bfxtn/test-trainjob-bmd59-node-0-0-hx4bg"
Apr 20 12:39:02.426128 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:39:02.426064 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfxpp\" (UniqueName: \"kubernetes.io/projected/63cbeafd-b105-4704-962a-693b6d801050-kube-api-access-bfxpp\") pod \"63cbeafd-b105-4704-962a-693b6d801050\" (UID: \"63cbeafd-b105-4704-962a-693b6d801050\") "
Apr 20 12:39:02.428104 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:39:02.428076 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63cbeafd-b105-4704-962a-693b6d801050-kube-api-access-bfxpp" (OuterVolumeSpecName: "kube-api-access-bfxpp") pod "63cbeafd-b105-4704-962a-693b6d801050" (UID: "63cbeafd-b105-4704-962a-693b6d801050"). InnerVolumeSpecName "kube-api-access-bfxpp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 12:39:02.527615 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:39:02.527592 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bfxpp\" (UniqueName: \"kubernetes.io/projected/63cbeafd-b105-4704-962a-693b6d801050-kube-api-access-bfxpp\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\""
Apr 20 12:39:03.111894 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:39:03.111825 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-bfxtn/test-trainjob-bmd59-node-0-0-hx4bg"
Apr 20 12:39:03.111894 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:39:03.111840 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-bfxtn/test-trainjob-bmd59-node-0-0-hx4bg" event={"ID":"63cbeafd-b105-4704-962a-693b6d801050","Type":"ContainerDied","Data":"aefd0493b0c111f206cc9a4dcc0ca65777f7ab10aebc2aa3181cb21ce83e1ddd"}
Apr 20 12:39:03.111894 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:39:03.111877 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aefd0493b0c111f206cc9a4dcc0ca65777f7ab10aebc2aa3181cb21ce83e1ddd"
Apr 20 12:39:03.338162 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:39:03.338129 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-mlwm8/test-trainjob-zp52s-node-0-0-wsrcr"]
Apr 20 12:39:03.338647 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:39:03.338629 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="63cbeafd-b105-4704-962a-693b6d801050" containerName="node"
Apr 20 12:39:03.338715 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:39:03.338649 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="63cbeafd-b105-4704-962a-693b6d801050" containerName="node"
Apr 20 12:39:03.338771 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:39:03.338747 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="63cbeafd-b105-4704-962a-693b6d801050" containerName="node"
Apr 20 12:39:03.370568 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:39:03.370507 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-mlwm8/test-trainjob-zp52s-node-0-0-wsrcr"]
Apr 20 12:39:03.370707 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:39:03.370617 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-mlwm8/test-trainjob-zp52s-node-0-0-wsrcr"
Apr 20 12:39:03.373073 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:39:03.373046 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-mlwm8\"/\"default-dockercfg-fwhng\""
Apr 20 12:39:03.373210 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:39:03.373101 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-mlwm8\"/\"kube-root-ca.crt\""
Apr 20 12:39:03.373266 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:39:03.373199 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-mlwm8\"/\"openshift-service-ca.crt\""
Apr 20 12:39:03.536593 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:39:03.536565 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrtgc\" (UniqueName: \"kubernetes.io/projected/80af771a-fc72-4aa0-91f9-1058cbaca702-kube-api-access-hrtgc\") pod \"test-trainjob-zp52s-node-0-0-wsrcr\" (UID: \"80af771a-fc72-4aa0-91f9-1058cbaca702\") " pod="test-ns-mlwm8/test-trainjob-zp52s-node-0-0-wsrcr"
Apr 20 12:39:03.637253 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:39:03.637195 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hrtgc\" (UniqueName: \"kubernetes.io/projected/80af771a-fc72-4aa0-91f9-1058cbaca702-kube-api-access-hrtgc\") pod \"test-trainjob-zp52s-node-0-0-wsrcr\" (UID: \"80af771a-fc72-4aa0-91f9-1058cbaca702\") " pod="test-ns-mlwm8/test-trainjob-zp52s-node-0-0-wsrcr"
Apr 20 12:39:03.645456 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:39:03.645429 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrtgc\" (UniqueName: \"kubernetes.io/projected/80af771a-fc72-4aa0-91f9-1058cbaca702-kube-api-access-hrtgc\") pod \"test-trainjob-zp52s-node-0-0-wsrcr\" (UID: \"80af771a-fc72-4aa0-91f9-1058cbaca702\") " pod="test-ns-mlwm8/test-trainjob-zp52s-node-0-0-wsrcr"
Apr 20 12:39:03.679508 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:39:03.679487 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-mlwm8/test-trainjob-zp52s-node-0-0-wsrcr"
Apr 20 12:39:03.790938 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:39:03.790914 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-mlwm8/test-trainjob-zp52s-node-0-0-wsrcr"]
Apr 20 12:39:03.793124 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:39:03.793098 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80af771a_fc72_4aa0_91f9_1058cbaca702.slice/crio-a8bc2a7ea86e212a70227c80693503fac625713a8edd7cb5e3d53e9006a073da WatchSource:0}: Error finding container a8bc2a7ea86e212a70227c80693503fac625713a8edd7cb5e3d53e9006a073da: Status 404 returned error can't find the container with id a8bc2a7ea86e212a70227c80693503fac625713a8edd7cb5e3d53e9006a073da
Apr 20 12:39:03.795142 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:39:03.795127 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 12:39:04.116170 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:39:04.116139 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-mlwm8/test-trainjob-zp52s-node-0-0-wsrcr" event={"ID":"80af771a-fc72-4aa0-91f9-1058cbaca702","Type":"ContainerStarted","Data":"a8bc2a7ea86e212a70227c80693503fac625713a8edd7cb5e3d53e9006a073da"}
Apr 20 12:44:51.986572 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:44:51.986541 2580 eviction_manager.go:376] "Eviction manager: attempting to reclaim" resourceName="ephemeral-storage"
Apr 20 12:44:51.987113 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:44:51.986608 2580 container_gc.go:86] "Attempting to delete unused containers"
Apr 20 12:44:51.988082 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:44:51.988061 2580 scope.go:117] "RemoveContainer" containerID="e8468a48ce7e1fe09b97c3ae31e54022783b2c9a8f834d2241dabc2eb28c97b2"
Apr 20 12:44:53.704197 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:44:53.704168 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-91.ec2.internal" event="NodeHasDiskPressure"
Apr 20 12:45:55.256573 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:45:55.256473 2580 log.go:32] "ListImages with filter from image service failed" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" filter="nil"
Apr 20 12:45:55.256573 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:45:55.256533 2580 kuberuntime_image.go:104] "Failed to list images" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL"
Apr 20 12:45:55.256573 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:45:55.256544 2580 image_gc_manager.go:230] "Failed to update image list" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL"
Apr 20 12:46:51.989582 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:46:51.989541 2580 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" containerID="e8468a48ce7e1fe09b97c3ae31e54022783b2c9a8f834d2241dabc2eb28c97b2"
Apr 20 12:46:51.989582 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:46:51.989590 2580 kuberuntime_gc.go:150] "Failed to remove container" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" containerID="e8468a48ce7e1fe09b97c3ae31e54022783b2c9a8f834d2241dabc2eb28c97b2"
Apr 20 12:46:51.990136 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:46:51.989606 2580 scope.go:117] "RemoveContainer" containerID="fec3f7029edfe8ce76d761e38fb9f96e528825e7e9b05a62712cd7dfc9a26998"
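From 12:44 onward the node is under ephemeral-storage pressure and CRI calls such as ListImages and RemoveContainer start failing with DeadlineExceeded. A rough Python tally of which calls are affected, under the same illustrative kubelet.log assumption as above:

    # cri_timeouts.py -- rough sketch; counts CRI calls failing with DeadlineExceeded.
    import re
    from collections import Counter

    # Matches e.g. "ListImages with filter from image service failed" and
    # "RemoveContainer from runtime service failed"; other DeadlineExceeded
    # lines (kuberuntime wrappers, GC retries) are lumped under "(other)".
    CALL = re.compile(r'"(\w+) (?:with filter )?from (?:runtime|image) service failed"')

    counts = Counter()
    with open("kubelet.log") as f:  # file name is an assumption
        for line in f:
            if "DeadlineExceeded" in line:
                m = CALL.search(line)
                counts[m.group(1) if m else "(other)"] += 1

    for call, n in counts.most_common():
        print(f"{call}: {n}")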
Apr 20 12:48:04.962905 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:48:04.962856 2580 log.go:32] "ListImages with filter from image service failed" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" filter="nil"
Apr 20 12:48:04.962905 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:48:04.962907 2580 kuberuntime_image.go:104] "Failed to list images" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded"
Apr 20 12:48:04.963510 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:48:04.962920 2580 image_gc_manager.go:222] "Failed to monitor images" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded"
Apr 20 12:48:04.967104 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:48:04.967068 2580 log.go:32] "ListImages with filter from image service failed" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" filter="nil"
Apr 20 12:48:04.967192 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:48:04.967127 2580 kuberuntime_image.go:104] "Failed to list images" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded"
Apr 20 12:48:04.967192 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:48:04.967143 2580 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded"
Apr 20 12:48:25.258210 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:48:25.258168 2580 log.go:32] "ListImages with filter from image service failed" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" filter="nil"
Apr 20 12:48:25.258210 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:48:25.258215 2580 kuberuntime_image.go:104] "Failed to list images" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded"
Apr 20 12:48:25.258714 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:48:25.258228 2580 image_gc_manager.go:230] "Failed to update image list" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded"
Apr 20 12:48:51.990721 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:48:51.990682 2580 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" containerID="fec3f7029edfe8ce76d761e38fb9f96e528825e7e9b05a62712cd7dfc9a26998"
Apr 20 12:48:51.991153 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:48:51.990729 2580 kuberuntime_gc.go:150] "Failed to remove container" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" containerID="fec3f7029edfe8ce76d761e38fb9f96e528825e7e9b05a62712cd7dfc9a26998"
Apr 20 12:48:51.991153 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:48:51.990747 2580 scope.go:117] "RemoveContainer" containerID="ca47e80dc469c32bdd998d0d9ea09ba1df14f81acbf32c59fd31e871b157682b"
Apr 20 12:48:54.444149 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:48:54.444127 2580 image_gc_manager.go:447] "Attempting to delete unused images"
Apr 20 12:48:54.489260 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:48:54.489234 2580 image_gc_manager.go:514] "Removing image to free bytes" imageID="5151a6030289f6d1ef2c984ebd3e465632a3bf64de79db6f7b3d6e2e638b0557" size=1065600018 runtimeHandler=""
Apr 20 12:48:54.583156 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:48:54.583129 2580 image_gc_manager.go:514] "Removing image to free bytes" imageID="ac4be6c7a52584c773ae754a4ccfb9fb1db440f4c9d858ad0f78765a85625b4b" size=1065006420 runtimeHandler=""
Apr 20 12:48:55.035545 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:48:55.035512 2580 image_gc_manager.go:514] "Removing image to free bytes" imageID="8cfae5f12a3d5e8f5711d1531d223358c13a3d4b36be844d8c6890efdfa09339" size=622989096 runtimeHandler=""
Apr 20 12:48:55.090264 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:48:55.090234 2580 image_gc_manager.go:514] "Removing image to free bytes" imageID="bd2f0c6a473dfa650b536cfe1992446bf45305b3ace698398143f161694113a5" size=20806872103 runtimeHandler=""
Apr 20 12:48:55.144713 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:48:55.144675 2580 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: copying system image from manifest list: writing blob: adding layer with blob \"sha256:56a6a09b03e81131ca690210efc814701da032525ada07591ffe5d6c5d5a4906\"/\"\"/\"sha256:0832a7269a80e5ed5e1c5c749a8d30a6b248ce9b970d81f61a4f93a3b72673f9\": unpacking failed (error: exit status 1; output: write /opt/rocm-6.4.3/lib/rocblas/library/TensileLibrary_Type_SS_Contraction_l_Ailk_Bljk_Cijk_Dijk_XPU_gfx942.co: no space left on device); artifact err: provided artifact is a container image" image="quay.io/opendatahub/odh-training-rocm64-torch29-py312@sha256:8a053c8ee3a4c326b745b2516a291c6b8a6e92defc5406ac2e9590bb742153f6"
Apr 20 12:48:55.144910 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:48:55.144861 2580 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:node,Image:quay.io/opendatahub/odh-training-rocm64-torch29-py312@sha256:8a053c8ee3a4c326b745b2516a291c6b8a6e92defc5406ac2e9590bb742153f6,Command:[python -c import torch; print(f'PyTorch version: {torch.__version__}'); print('Training completed successfully')],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:,HostPort:0,ContainerPort:29500,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:PET_NNODES,Value:1,ValueFrom:nil,},EnvVar{Name:PET_NPROC_PER_NODE,Value:1,ValueFrom:nil,},EnvVar{Name:PET_NODE_RANK,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.annotations['batch.kubernetes.io/job-completion-index'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:PET_MASTER_ADDR,Value:test-trainjob-zp52s-node-0-0.test-trainjob-zp52s,ValueFrom:nil,},EnvVar{Name:PET_MASTER_PORT,Value:29500,ValueFrom:nil,},EnvVar{Name:JOB_COMPLETION_INDEX,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.labels['batch.kubernetes.io/job-completion-index'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hrtgc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000710000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-trainjob-zp52s-node-0-0-wsrcr_test-ns-mlwm8(80af771a-fc72-4aa0-91f9-1058cbaca702): ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: writing blob: adding layer with blob \"sha256:56a6a09b03e81131ca690210efc814701da032525ada07591ffe5d6c5d5a4906\"/\"\"/\"sha256:0832a7269a80e5ed5e1c5c749a8d30a6b248ce9b970d81f61a4f93a3b72673f9\": unpacking failed (error: exit status 1; output: write /opt/rocm-6.4.3/lib/rocblas/library/TensileLibrary_Type_SS_Contraction_l_Ailk_Bljk_Cijk_Dijk_XPU_gfx942.co: no space left on device); artifact err: provided artifact is a container image" logger="UnhandledError"
Apr 20 12:48:55.146033 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:48:55.146010 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: copying system image from manifest list: writing blob: adding layer with blob \\\"sha256:56a6a09b03e81131ca690210efc814701da032525ada07591ffe5d6c5d5a4906\\\"/\\\"\\\"/\\\"sha256:0832a7269a80e5ed5e1c5c749a8d30a6b248ce9b970d81f61a4f93a3b72673f9\\\": unpacking failed (error: exit status 1; output: write /opt/rocm-6.4.3/lib/rocblas/library/TensileLibrary_Type_SS_Contraction_l_Ailk_Bljk_Cijk_Dijk_XPU_gfx942.co: no space left on device); artifact err: provided artifact is a container image\"" pod="test-ns-mlwm8/test-trainjob-zp52s-node-0-0-wsrcr" podUID="80af771a-fc72-4aa0-91f9-1058cbaca702"
Apr 20 12:48:55.197373 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:48:55.197322 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-mlwm8\"/\"default-dockercfg-fwhng\""
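The actual root cause of the ErrImagePull above is buried deep in the nested CRI error: the layer unpack hit "no space left on device" while writing a ROCm library file. A small Python sketch (same illustrative kubelet.log assumption) that surfaces disk-full pull failures and the images they affected:

    # pull_failures.py -- rough one-pass scan: flag image pulls that failed because
    # the node ran out of disk, as in the ErrImagePull record above.
    import re

    IMG = re.compile(r'image="([^"]+)"')

    seen = set()
    with open("kubelet.log") as f:  # file name is an assumption
        for line in f:
            if "PullImage from image service failed" in line and "no space left on device" in line:
                m = IMG.search(line)
                image = m.group(1) if m else "(unknown image)"
                if image not in seen:
                    seen.add(image)
                    print(f"pull failed, disk full: {image}")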
Apr 20 12:48:55.207383 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:48:55.207369 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-mlwm8\"/\"kube-root-ca.crt\""
Apr 20 12:48:55.217879 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:48:55.217858 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-mlwm8\"/\"openshift-service-ca.crt\""
Apr 20 12:48:58.765796 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:48:58.765304 2580 image_gc_manager.go:514] "Removing image to free bytes" imageID="ad110250a85fcdba558f7f776c90e8eeba85487d69852b32b99f6e3e85c4336a" size=23201654703 runtimeHandler=""
Apr 20 12:48:58.765796 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:48:58.765581 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 12:48:58.765796 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:48:58.765779 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-training-rocm64-torch29-py312@sha256:8a053c8ee3a4c326b745b2516a291c6b8a6e92defc5406ac2e9590bb742153f6\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: writing blob: adding layer with blob \\\"sha256:56a6a09b03e81131ca690210efc814701da032525ada07591ffe5d6c5d5a4906\\\"/\\\"\\\"/\\\"sha256:0832a7269a80e5ed5e1c5c749a8d30a6b248ce9b970d81f61a4f93a3b72673f9\\\": unpacking failed (error: exit status 1; output: write /opt/rocm-6.4.3/lib/rocblas/library/TensileLibrary_Type_SS_Contraction_l_Ailk_Bljk_Cijk_Dijk_XPU_gfx942.co: no space left on device); artifact err: provided artifact is a container image\"" pod="test-ns-mlwm8/test-trainjob-zp52s-node-0-0-wsrcr" podUID="80af771a-fc72-4aa0-91f9-1058cbaca702"
Apr 20 12:49:02.584652 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:49:02.584616 2580 image_gc_manager.go:514] "Removing image to free bytes" imageID="038c73cf9d35e89709b4f826c0ceb8dc783a3aa366d3139240c1a1da0ec1e546" size=7588072914 runtimeHandler=""
Apr 20 12:49:05.791228 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:49:05.791198 2580 eviction_manager.go:383] "Eviction manager: able to reduce resource pressure without evicting pods." resourceName="ephemeral-storage"
Apr 20 12:49:59.203106 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:49:59.203074 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-91.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 12:53:04.998523 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:53:04.998483 2580 kubelet.go:1628] "Image garbage collection succeeded"
Apr 20 12:56:05.784499 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:56:05.784458 2580 log.go:32] "ListImages with filter from image service failed" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" filter="nil"
Apr 20 12:56:05.784499 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:56:05.784499 2580 kuberuntime_image.go:104] "Failed to list images" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded"
Apr 20 12:56:05.847998 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:56:05.784512 2580 image_gc_manager.go:230] "Failed to update image list" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded"
Apr 20 12:58:03.052247 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:03.052217 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-mlwm8/test-trainjob-zp52s-node-0-0-wsrcr" event={"ID":"80af771a-fc72-4aa0-91f9-1058cbaca702","Type":"ContainerStarted","Data":"d0d528390d44d720ee522419667b62bb5ae57e73959c7bed6303e027a2d8b360"}
Apr 20 12:58:03.054721 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:03.054706 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-mlwm8\"/\"default-dockercfg-fwhng\""
Apr 20 12:58:03.079020 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:03.078984 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-mlwm8/test-trainjob-zp52s-node-0-0-wsrcr" podStartSLOduration=1.447174125 podStartE2EDuration="19m0.078973062s" podCreationTimestamp="2026-04-20 12:39:03 +0000 UTC" firstStartedPulling="2026-04-20 12:39:03.795244642 +0000 UTC m=+1477.515258611" lastFinishedPulling="2026-04-20 12:58:02.427043578 +0000 UTC m=+2616.147057548" observedRunningTime="2026-04-20 12:58:03.077385222 +0000 UTC m=+2616.797399212" watchObservedRunningTime="2026-04-20 12:58:03.078973062 +0000 UTC m=+2616.798987052"
Apr 20 12:58:03.202979 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:03.202955 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-mlwm8\"/\"kube-root-ca.crt\""
Apr 20 12:58:03.213482 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:03.213464 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-mlwm8\"/\"openshift-service-ca.crt\""
Apr 20 12:58:24.125508 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:24.125452 2580 generic.go:358] "Generic (PLEG): container finished" podID="80af771a-fc72-4aa0-91f9-1058cbaca702" containerID="d0d528390d44d720ee522419667b62bb5ae57e73959c7bed6303e027a2d8b360" exitCode=0
Apr 20 12:58:24.125818 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:24.125521 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-mlwm8/test-trainjob-zp52s-node-0-0-wsrcr" event={"ID":"80af771a-fc72-4aa0-91f9-1058cbaca702","Type":"ContainerDied","Data":"d0d528390d44d720ee522419667b62bb5ae57e73959c7bed6303e027a2d8b360"}
Apr 20 12:58:25.350680 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:25.350658 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-mlwm8/test-trainjob-zp52s-node-0-0-wsrcr"
Apr 20 12:58:25.358825 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:25.358805 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrtgc\" (UniqueName: \"kubernetes.io/projected/80af771a-fc72-4aa0-91f9-1058cbaca702-kube-api-access-hrtgc\") pod \"80af771a-fc72-4aa0-91f9-1058cbaca702\" (UID: \"80af771a-fc72-4aa0-91f9-1058cbaca702\") "
Apr 20 12:58:25.360816 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:25.360794 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80af771a-fc72-4aa0-91f9-1058cbaca702-kube-api-access-hrtgc" (OuterVolumeSpecName: "kube-api-access-hrtgc") pod "80af771a-fc72-4aa0-91f9-1058cbaca702" (UID: "80af771a-fc72-4aa0-91f9-1058cbaca702"). InnerVolumeSpecName "kube-api-access-hrtgc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 12:58:25.459299 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:25.459247 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hrtgc\" (UniqueName: \"kubernetes.io/projected/80af771a-fc72-4aa0-91f9-1058cbaca702-kube-api-access-hrtgc\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\""
Apr 20 12:58:26.133263 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:26.133239 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-mlwm8/test-trainjob-zp52s-node-0-0-wsrcr"
Apr 20 12:58:26.133263 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:26.133256 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-mlwm8/test-trainjob-zp52s-node-0-0-wsrcr" event={"ID":"80af771a-fc72-4aa0-91f9-1058cbaca702","Type":"ContainerDied","Data":"a8bc2a7ea86e212a70227c80693503fac625713a8edd7cb5e3d53e9006a073da"}
Apr 20 12:58:26.133438 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:26.133280 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8bc2a7ea86e212a70227c80693503fac625713a8edd7cb5e3d53e9006a073da"
Apr 20 12:58:26.307824 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:26.307800 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/test-ns-mlwm8_test-trainjob-zp52s-node-0-0-wsrcr_80af771a-fc72-4aa0-91f9-1058cbaca702/node/0.log"
Apr 20 12:58:26.402905 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:26.402851 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/test-ns-bfxtn_test-trainjob-bmd59-node-0-0-hx4bg_63cbeafd-b105-4704-962a-693b6d801050/node/0.log"
Apr 20 12:58:26.495071 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:58:26.495046 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8468a48ce7e1fe09b97c3ae31e54022783b2c9a8f834d2241dabc2eb28c97b2\": container with ID starting with e8468a48ce7e1fe09b97c3ae31e54022783b2c9a8f834d2241dabc2eb28c97b2 not found: ID does not exist" containerID="e8468a48ce7e1fe09b97c3ae31e54022783b2c9a8f834d2241dabc2eb28c97b2"
Apr 20 12:58:26.595683 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:58:26.595656 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fec3f7029edfe8ce76d761e38fb9f96e528825e7e9b05a62712cd7dfc9a26998\": container with ID starting with fec3f7029edfe8ce76d761e38fb9f96e528825e7e9b05a62712cd7dfc9a26998 not found: ID does not exist" containerID="fec3f7029edfe8ce76d761e38fb9f96e528825e7e9b05a62712cd7dfc9a26998"
Apr 20 12:58:27.091322 ip-10-0-137-91
Apr 20 12:58:27.091322 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:58:27.091291 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca47e80dc469c32bdd998d0d9ea09ba1df14f81acbf32c59fd31e871b157682b\": container with ID starting with ca47e80dc469c32bdd998d0d9ea09ba1df14f81acbf32c59fd31e871b157682b not found: ID does not exist" containerID="ca47e80dc469c32bdd998d0d9ea09ba1df14f81acbf32c59fd31e871b157682b"
Apr 20 12:58:28.724039 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:28.724003 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-r5p9p/must-gather-cm798"]
Apr 20 12:58:28.724450 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:28.724360 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80af771a-fc72-4aa0-91f9-1058cbaca702" containerName="node"
Apr 20 12:58:28.724450 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:28.724371 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="80af771a-fc72-4aa0-91f9-1058cbaca702" containerName="node"
Apr 20 12:58:28.724532 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:28.724456 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="80af771a-fc72-4aa0-91f9-1058cbaca702" containerName="node"
Apr 20 12:58:28.750090 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:28.750055 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-r5p9p/must-gather-cm798"]
Apr 20 12:58:28.750090 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:28.750066 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r5p9p/must-gather-cm798"
Apr 20 12:58:28.752470 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:28.752446 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-r5p9p\"/\"openshift-service-ca.crt\""
Apr 20 12:58:28.752583 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:28.752518 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-r5p9p\"/\"kube-root-ca.crt\""
Apr 20 12:58:28.753441 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:28.753420 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-r5p9p\"/\"default-dockercfg-96hzl\""
Apr 20 12:58:28.781138 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:28.781118 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7dfb68e1-0e92-4e14-af6d-030c5423151e-must-gather-output\") pod \"must-gather-cm798\" (UID: \"7dfb68e1-0e92-4e14-af6d-030c5423151e\") " pod="openshift-must-gather-r5p9p/must-gather-cm798"
Apr 20 12:58:28.781232 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:28.781143 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qrj6\" (UniqueName: \"kubernetes.io/projected/7dfb68e1-0e92-4e14-af6d-030c5423151e-kube-api-access-4qrj6\") pod \"must-gather-cm798\" (UID: \"7dfb68e1-0e92-4e14-af6d-030c5423151e\") " pod="openshift-must-gather-r5p9p/must-gather-cm798"
Apr 20 12:58:28.881920 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:28.881897 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7dfb68e1-0e92-4e14-af6d-030c5423151e-must-gather-output\") pod \"must-gather-cm798\" (UID: \"7dfb68e1-0e92-4e14-af6d-030c5423151e\") " pod="openshift-must-gather-r5p9p/must-gather-cm798"
Apr 20 12:58:28.882015 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:28.881924 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qrj6\" (UniqueName: \"kubernetes.io/projected/7dfb68e1-0e92-4e14-af6d-030c5423151e-kube-api-access-4qrj6\") pod \"must-gather-cm798\" (UID: \"7dfb68e1-0e92-4e14-af6d-030c5423151e\") " pod="openshift-must-gather-r5p9p/must-gather-cm798"
Apr 20 12:58:28.882339 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:28.882322 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7dfb68e1-0e92-4e14-af6d-030c5423151e-must-gather-output\") pod \"must-gather-cm798\" (UID: \"7dfb68e1-0e92-4e14-af6d-030c5423151e\") " pod="openshift-must-gather-r5p9p/must-gather-cm798"
Apr 20 12:58:28.890644 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:28.890617 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qrj6\" (UniqueName: \"kubernetes.io/projected/7dfb68e1-0e92-4e14-af6d-030c5423151e-kube-api-access-4qrj6\") pod \"must-gather-cm798\" (UID: \"7dfb68e1-0e92-4e14-af6d-030c5423151e\") " pod="openshift-must-gather-r5p9p/must-gather-cm798"
Apr 20 12:58:29.058878 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:29.058855 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r5p9p/must-gather-cm798"
Apr 20 12:58:29.174644 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:29.174612 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-r5p9p/must-gather-cm798"]
Apr 20 12:58:29.176120 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:58:29.176094 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dfb68e1_0e92_4e14_af6d_030c5423151e.slice/crio-ab81668c4d77268cbf67b1d9d6d7092b6ba3f995b945d4c47108fbf952e6e162 WatchSource:0}: Error finding container ab81668c4d77268cbf67b1d9d6d7092b6ba3f995b945d4c47108fbf952e6e162: Status 404 returned error can't find the container with id ab81668c4d77268cbf67b1d9d6d7092b6ba3f995b945d4c47108fbf952e6e162
Apr 20 12:58:29.177641 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:29.177623 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 12:58:30.148828 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:30.148782 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r5p9p/must-gather-cm798" event={"ID":"7dfb68e1-0e92-4e14-af6d-030c5423151e","Type":"ContainerStarted","Data":"ab81668c4d77268cbf67b1d9d6d7092b6ba3f995b945d4c47108fbf952e6e162"}
Apr 20 12:58:31.340744 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:31.340709 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-mlwm8/test-trainjob-zp52s-node-0-0-wsrcr"]
Apr 20 12:58:31.345041 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:31.345006 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-mlwm8/test-trainjob-zp52s-node-0-0-wsrcr"]
Apr 20 12:58:31.441535 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:31.441494 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-bfxtn/test-trainjob-bmd59-node-0-0-hx4bg"]
Apr 20 12:58:31.444987 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:31.444934 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-bfxtn/test-trainjob-bmd59-node-0-0-hx4bg"]
Apr 20 12:58:31.541931 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:31.541900 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-lbqtk/test-trainjob-f86l8-node-0-0-54wwp"]
Apr 20 12:58:31.558800 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:31.558772 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-lbqtk/test-trainjob-f86l8-node-0-0-54wwp"]
Apr 20 12:58:31.710343 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:31.710262 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-c94cp/test-trainjob-vz8ww-node-0-0-8m848"]
Apr 20 12:58:31.713589 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:31.713560 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-c94cp/test-trainjob-vz8ww-node-0-0-8m848"]
Apr 20 12:58:32.310221 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:32.310187 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-s576s/test-trainjob-p9rjp-node-0-0-5w6zv"]
Apr 20 12:58:32.314527 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:32.314487 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-s576s/test-trainjob-p9rjp-node-0-0-5w6zv"]
Apr 20 12:58:32.846286 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:32.846255 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c6aecff-fa78-4d01-a7df-0618470dcde3" path="/var/lib/kubelet/pods/5c6aecff-fa78-4d01-a7df-0618470dcde3/volumes"
Apr 20 12:58:32.846702 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:32.846684 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63cbeafd-b105-4704-962a-693b6d801050" path="/var/lib/kubelet/pods/63cbeafd-b105-4704-962a-693b6d801050/volumes"
Apr 20 12:58:32.846973 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:32.846961 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80af771a-fc72-4aa0-91f9-1058cbaca702" path="/var/lib/kubelet/pods/80af771a-fc72-4aa0-91f9-1058cbaca702/volumes"
Apr 20 12:58:32.847227 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:32.847217 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5e47330-30ad-4ea6-a652-f6fabbbd4995" path="/var/lib/kubelet/pods/a5e47330-30ad-4ea6-a652-f6fabbbd4995/volumes"
Apr 20 12:58:32.847522 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:32.847510 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2fb40db-e8de-4ad6-b990-6224089ceb9b" path="/var/lib/kubelet/pods/c2fb40db-e8de-4ad6-b990-6224089ceb9b/volumes"
Apr 20 12:58:36.174267 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:36.174237 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r5p9p/must-gather-cm798" event={"ID":"7dfb68e1-0e92-4e14-af6d-030c5423151e","Type":"ContainerStarted","Data":"3d4a9385c1577249c0cba95a3c311db835c0053150f7984d347ea4ab10ad3014"}
Apr 20 12:58:36.174777 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:36.174273 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r5p9p/must-gather-cm798" event={"ID":"7dfb68e1-0e92-4e14-af6d-030c5423151e","Type":"ContainerStarted","Data":"755769c4b95267244b74784da76493ab90cedcd60d8b08ac17bca600b8f47808"}
Apr 20 12:58:36.194658 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:36.194606 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-r5p9p/must-gather-cm798" podStartSLOduration=2.306616104 podStartE2EDuration="8.194587922s" podCreationTimestamp="2026-04-20 12:58:28 +0000 UTC" firstStartedPulling="2026-04-20 12:58:29.177743209 +0000 UTC m=+2642.897757177" lastFinishedPulling="2026-04-20 12:58:35.065715026 +0000 UTC m=+2648.785728995" observedRunningTime="2026-04-20 12:58:36.192451144 +0000 UTC m=+2649.912465133" watchObservedRunningTime="2026-04-20 12:58:36.194587922 +0000 UTC m=+2649.914601918"
Apr 20 12:58:45.294418 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:45.294365 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-mq294_9311bed3-a12b-4280-8dd9-4d1b12d716e8/manager/0.log"
Apr 20 12:58:45.778195 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:45.778167 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-mq294_9311bed3-a12b-4280-8dd9-4d1b12d716e8/manager/0.log"
Apr 20 12:58:46.627552 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:58:46.627505 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-mq294_9311bed3-a12b-4280-8dd9-4d1b12d716e8/manager/0.log"
Apr 20 12:59:24.341887 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:24.341852 2580 generic.go:358] "Generic (PLEG): container finished" podID="7dfb68e1-0e92-4e14-af6d-030c5423151e" containerID="755769c4b95267244b74784da76493ab90cedcd60d8b08ac17bca600b8f47808" exitCode=0
Apr 20 12:59:24.342386 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:24.341934 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r5p9p/must-gather-cm798" event={"ID":"7dfb68e1-0e92-4e14-af6d-030c5423151e","Type":"ContainerDied","Data":"755769c4b95267244b74784da76493ab90cedcd60d8b08ac17bca600b8f47808"}
Apr 20 12:59:24.342386 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:24.342362 2580 scope.go:117] "RemoveContainer" containerID="755769c4b95267244b74784da76493ab90cedcd60d8b08ac17bca600b8f47808"
Apr 20 12:59:24.634126 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:24.634056 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-r5p9p_must-gather-cm798_7dfb68e1-0e92-4e14-af6d-030c5423151e/gather/0.log"
Apr 20 12:59:26.943526 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:26.943498 2580 scope.go:117] "RemoveContainer" containerID="9e4033aab7673c1ed542cb12881c0cceb06d7f3f6f5aaf951e7b5a0eef9f921b"
Apr 20 12:59:27.783755 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:27.783730 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-9zp5f_41d4bf7c-87b7-4e06-b676-27b36ca55dd9/global-pull-secret-syncer/0.log"
Apr 20 12:59:27.883787 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:27.883759 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-8hww4_021e270f-fe7a-402c-a482-41d496fec5fb/konnectivity-agent/0.log"
Apr 20 12:59:28.009430 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:28.009390 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-91.ec2.internal_4e4cd406deda512383d472584c7956de/haproxy/0.log"
Apr 20 12:59:29.950111 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:29.950078 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-r5p9p/must-gather-cm798"]
containerID="cri-o://3d4a9385c1577249c0cba95a3c311db835c0053150f7984d347ea4ab10ad3014" gracePeriod=2 Apr 20 12:59:29.954389 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:29.953982 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-r5p9p/must-gather-cm798"] Apr 20 12:59:30.177552 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:30.177531 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-r5p9p_must-gather-cm798_7dfb68e1-0e92-4e14-af6d-030c5423151e/copy/0.log" Apr 20 12:59:30.177873 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:30.177858 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r5p9p/must-gather-cm798" Apr 20 12:59:30.180529 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:30.180506 2580 status_manager.go:895] "Failed to get status for pod" podUID="7dfb68e1-0e92-4e14-af6d-030c5423151e" pod="openshift-must-gather-r5p9p/must-gather-cm798" err="pods \"must-gather-cm798\" is forbidden: User \"system:node:ip-10-0-137-91.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-r5p9p\": no relationship found between node 'ip-10-0-137-91.ec2.internal' and this object" Apr 20 12:59:30.226603 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:30.226550 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qrj6\" (UniqueName: \"kubernetes.io/projected/7dfb68e1-0e92-4e14-af6d-030c5423151e-kube-api-access-4qrj6\") pod \"7dfb68e1-0e92-4e14-af6d-030c5423151e\" (UID: \"7dfb68e1-0e92-4e14-af6d-030c5423151e\") " Apr 20 12:59:30.226693 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:30.226623 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7dfb68e1-0e92-4e14-af6d-030c5423151e-must-gather-output\") pod \"7dfb68e1-0e92-4e14-af6d-030c5423151e\" (UID: \"7dfb68e1-0e92-4e14-af6d-030c5423151e\") " Apr 20 12:59:30.228534 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:30.228511 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dfb68e1-0e92-4e14-af6d-030c5423151e-kube-api-access-4qrj6" (OuterVolumeSpecName: "kube-api-access-4qrj6") pod "7dfb68e1-0e92-4e14-af6d-030c5423151e" (UID: "7dfb68e1-0e92-4e14-af6d-030c5423151e"). InnerVolumeSpecName "kube-api-access-4qrj6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 12:59:30.228608 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:30.228530 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dfb68e1-0e92-4e14-af6d-030c5423151e-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "7dfb68e1-0e92-4e14-af6d-030c5423151e" (UID: "7dfb68e1-0e92-4e14-af6d-030c5423151e"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 12:59:30.327912 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:30.327887 2580 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7dfb68e1-0e92-4e14-af6d-030c5423151e-must-gather-output\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 12:59:30.327912 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:30.327913 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4qrj6\" (UniqueName: \"kubernetes.io/projected/7dfb68e1-0e92-4e14-af6d-030c5423151e-kube-api-access-4qrj6\") on node \"ip-10-0-137-91.ec2.internal\" DevicePath \"\"" Apr 20 12:59:30.360056 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:30.360038 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-r5p9p_must-gather-cm798_7dfb68e1-0e92-4e14-af6d-030c5423151e/copy/0.log" Apr 20 12:59:30.360351 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:30.360328 2580 generic.go:358] "Generic (PLEG): container finished" podID="7dfb68e1-0e92-4e14-af6d-030c5423151e" containerID="3d4a9385c1577249c0cba95a3c311db835c0053150f7984d347ea4ab10ad3014" exitCode=143 Apr 20 12:59:30.360438 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:30.360376 2580 scope.go:117] "RemoveContainer" containerID="3d4a9385c1577249c0cba95a3c311db835c0053150f7984d347ea4ab10ad3014" Apr 20 12:59:30.360502 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:30.360376 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r5p9p/must-gather-cm798" Apr 20 12:59:30.362494 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:30.362462 2580 status_manager.go:895] "Failed to get status for pod" podUID="7dfb68e1-0e92-4e14-af6d-030c5423151e" pod="openshift-must-gather-r5p9p/must-gather-cm798" err="pods \"must-gather-cm798\" is forbidden: User \"system:node:ip-10-0-137-91.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-r5p9p\": no relationship found between node 'ip-10-0-137-91.ec2.internal' and this object" Apr 20 12:59:30.367293 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:30.367277 2580 scope.go:117] "RemoveContainer" containerID="755769c4b95267244b74784da76493ab90cedcd60d8b08ac17bca600b8f47808" Apr 20 12:59:30.369946 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:30.369920 2580 status_manager.go:895] "Failed to get status for pod" podUID="7dfb68e1-0e92-4e14-af6d-030c5423151e" pod="openshift-must-gather-r5p9p/must-gather-cm798" err="pods \"must-gather-cm798\" is forbidden: User \"system:node:ip-10-0-137-91.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-r5p9p\": no relationship found between node 'ip-10-0-137-91.ec2.internal' and this object" Apr 20 12:59:30.378585 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:30.378563 2580 scope.go:117] "RemoveContainer" containerID="3d4a9385c1577249c0cba95a3c311db835c0053150f7984d347ea4ab10ad3014" Apr 20 12:59:30.378843 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:59:30.378828 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d4a9385c1577249c0cba95a3c311db835c0053150f7984d347ea4ab10ad3014\": container with ID starting with 3d4a9385c1577249c0cba95a3c311db835c0053150f7984d347ea4ab10ad3014 not found: ID does not exist" containerID="3d4a9385c1577249c0cba95a3c311db835c0053150f7984d347ea4ab10ad3014" Apr 20 12:59:30.378904 
ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:30.378850 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d4a9385c1577249c0cba95a3c311db835c0053150f7984d347ea4ab10ad3014"} err="failed to get container status \"3d4a9385c1577249c0cba95a3c311db835c0053150f7984d347ea4ab10ad3014\": rpc error: code = NotFound desc = could not find container \"3d4a9385c1577249c0cba95a3c311db835c0053150f7984d347ea4ab10ad3014\": container with ID starting with 3d4a9385c1577249c0cba95a3c311db835c0053150f7984d347ea4ab10ad3014 not found: ID does not exist" Apr 20 12:59:30.378904 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:30.378865 2580 scope.go:117] "RemoveContainer" containerID="755769c4b95267244b74784da76493ab90cedcd60d8b08ac17bca600b8f47808" Apr 20 12:59:30.379095 ip-10-0-137-91 kubenswrapper[2580]: E0420 12:59:30.379075 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"755769c4b95267244b74784da76493ab90cedcd60d8b08ac17bca600b8f47808\": container with ID starting with 755769c4b95267244b74784da76493ab90cedcd60d8b08ac17bca600b8f47808 not found: ID does not exist" containerID="755769c4b95267244b74784da76493ab90cedcd60d8b08ac17bca600b8f47808" Apr 20 12:59:30.379148 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:30.379104 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"755769c4b95267244b74784da76493ab90cedcd60d8b08ac17bca600b8f47808"} err="failed to get container status \"755769c4b95267244b74784da76493ab90cedcd60d8b08ac17bca600b8f47808\": rpc error: code = NotFound desc = could not find container \"755769c4b95267244b74784da76493ab90cedcd60d8b08ac17bca600b8f47808\": container with ID starting with 755769c4b95267244b74784da76493ab90cedcd60d8b08ac17bca600b8f47808 not found: ID does not exist" Apr 20 12:59:30.846830 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:30.846790 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dfb68e1-0e92-4e14-af6d-030c5423151e" path="/var/lib/kubelet/pods/7dfb68e1-0e92-4e14-af6d-030c5423151e/volumes" Apr 20 12:59:31.113629 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:31.113561 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a49f1ae4-b930-4e59-87b0-edf8e27bcbd1/alertmanager/0.log" Apr 20 12:59:31.134578 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:31.134554 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a49f1ae4-b930-4e59-87b0-edf8e27bcbd1/config-reloader/0.log" Apr 20 12:59:31.158282 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:31.158258 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a49f1ae4-b930-4e59-87b0-edf8e27bcbd1/kube-rbac-proxy-web/0.log" Apr 20 12:59:31.181291 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:31.181267 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a49f1ae4-b930-4e59-87b0-edf8e27bcbd1/kube-rbac-proxy/0.log" Apr 20 12:59:31.200933 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:31.200917 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a49f1ae4-b930-4e59-87b0-edf8e27bcbd1/kube-rbac-proxy-metric/0.log" Apr 20 12:59:31.221301 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:31.221281 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a49f1ae4-b930-4e59-87b0-edf8e27bcbd1/prom-label-proxy/0.log" Apr 20 12:59:31.245695 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:31.245677 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a49f1ae4-b930-4e59-87b0-edf8e27bcbd1/init-config-reloader/0.log" Apr 20 12:59:31.310088 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:31.310068 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-q8rbm_09b97c75-b838-4128-bb6f-d6a54d1cc11e/kube-state-metrics/0.log" Apr 20 12:59:31.330500 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:31.330483 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-q8rbm_09b97c75-b838-4128-bb6f-d6a54d1cc11e/kube-rbac-proxy-main/0.log" Apr 20 12:59:31.354123 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:31.354094 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-q8rbm_09b97c75-b838-4128-bb6f-d6a54d1cc11e/kube-rbac-proxy-self/0.log" Apr 20 12:59:31.382717 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:31.382661 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-7b75467d8b-xqm69_6aeb557f-5fc7-4469-90cb-cfdbee9a0458/metrics-server/0.log" Apr 20 12:59:31.409250 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:31.409231 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-6vjwl_6d32df24-8beb-401c-9a8a-8999fce03374/monitoring-plugin/0.log" Apr 20 12:59:31.518288 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:31.518258 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4rfp8_9c983cd8-5455-48d0-a6b5-0f8277cb2ca9/node-exporter/0.log" Apr 20 12:59:31.538145 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:31.538130 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4rfp8_9c983cd8-5455-48d0-a6b5-0f8277cb2ca9/kube-rbac-proxy/0.log" Apr 20 12:59:31.560195 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:31.560177 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4rfp8_9c983cd8-5455-48d0-a6b5-0f8277cb2ca9/init-textfile/0.log" Apr 20 12:59:31.655229 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:31.655144 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-2p7tn_3f8898a4-1988-4c88-81d7-7b424aaf64e7/kube-rbac-proxy-main/0.log" Apr 20 12:59:31.674965 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:31.674942 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-2p7tn_3f8898a4-1988-4c88-81d7-7b424aaf64e7/kube-rbac-proxy-self/0.log" Apr 20 12:59:31.698076 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:31.698058 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-2p7tn_3f8898a4-1988-4c88-81d7-7b424aaf64e7/openshift-state-metrics/0.log" Apr 20 12:59:31.739085 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:31.739062 2580 log.go:25] "Incomplete line in log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_899a908e-40e7-476b-b7ba-7f9134886317/prometheus/0.log" line="2026-04-20T12:48:45.156138893+00:00 stderr F 
time=2026-04-20T12:48:45.156Z level=WARN source=group.go:570 msg=\"Rule sample appending failed\"" Apr 20 12:59:31.739164 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:31.739104 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_899a908e-40e7-476b-b7ba-7f9134886317/prometheus/0.log" Apr 20 12:59:31.757895 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:31.757874 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_899a908e-40e7-476b-b7ba-7f9134886317/config-reloader/0.log" Apr 20 12:59:31.777291 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:31.777273 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_899a908e-40e7-476b-b7ba-7f9134886317/thanos-sidecar/0.log" Apr 20 12:59:31.796083 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:31.796052 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_899a908e-40e7-476b-b7ba-7f9134886317/kube-rbac-proxy-web/0.log" Apr 20 12:59:31.816249 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:31.816234 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_899a908e-40e7-476b-b7ba-7f9134886317/kube-rbac-proxy/0.log" Apr 20 12:59:31.838412 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:31.838379 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_899a908e-40e7-476b-b7ba-7f9134886317/kube-rbac-proxy-thanos/0.log" Apr 20 12:59:31.861390 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:31.861358 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_899a908e-40e7-476b-b7ba-7f9134886317/init-config-reloader/0.log" Apr 20 12:59:31.964549 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:31.964526 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-79cc798455-tph24_22e5ac9a-8222-4837-ae48-0b9ca98383b1/telemeter-client/0.log" Apr 20 12:59:31.983751 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:31.983725 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-79cc798455-tph24_22e5ac9a-8222-4837-ae48-0b9ca98383b1/reload/0.log" Apr 20 12:59:32.005549 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:32.005512 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-79cc798455-tph24_22e5ac9a-8222-4837-ae48-0b9ca98383b1/kube-rbac-proxy/0.log" Apr 20 12:59:32.046908 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:32.046884 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-579895dbf5-cfrwz_9748a2da-a0df-48e0-8d2e-bc1ef97f4fed/thanos-query/0.log" Apr 20 12:59:32.072949 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:32.072930 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-579895dbf5-cfrwz_9748a2da-a0df-48e0-8d2e-bc1ef97f4fed/kube-rbac-proxy-web/0.log" Apr 20 12:59:32.129704 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:32.129688 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-579895dbf5-cfrwz_9748a2da-a0df-48e0-8d2e-bc1ef97f4fed/kube-rbac-proxy/0.log" Apr 20 12:59:32.157034 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:32.157008 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-579895dbf5-cfrwz_9748a2da-a0df-48e0-8d2e-bc1ef97f4fed/prom-label-proxy/0.log" Apr 20 12:59:32.180182 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:32.180126 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-579895dbf5-cfrwz_9748a2da-a0df-48e0-8d2e-bc1ef97f4fed/kube-rbac-proxy-rules/0.log" Apr 20 12:59:32.206291 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:32.206264 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-579895dbf5-cfrwz_9748a2da-a0df-48e0-8d2e-bc1ef97f4fed/kube-rbac-proxy-metrics/0.log" Apr 20 12:59:34.376286 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:34.376259 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-chpds/perf-node-gather-daemonset-m7kgd"] Apr 20 12:59:34.376656 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:34.376594 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7dfb68e1-0e92-4e14-af6d-030c5423151e" containerName="gather" Apr 20 12:59:34.376656 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:34.376606 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dfb68e1-0e92-4e14-af6d-030c5423151e" containerName="gather" Apr 20 12:59:34.376656 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:34.376613 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7dfb68e1-0e92-4e14-af6d-030c5423151e" containerName="copy" Apr 20 12:59:34.376656 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:34.376618 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dfb68e1-0e92-4e14-af6d-030c5423151e" containerName="copy" Apr 20 12:59:34.376773 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:34.376663 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="7dfb68e1-0e92-4e14-af6d-030c5423151e" containerName="copy" Apr 20 12:59:34.376773 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:34.376672 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="7dfb68e1-0e92-4e14-af6d-030c5423151e" containerName="gather" Apr 20 12:59:34.379218 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:34.379196 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-chpds/perf-node-gather-daemonset-m7kgd" Apr 20 12:59:34.381435 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:34.381415 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-chpds\"/\"openshift-service-ca.crt\"" Apr 20 12:59:34.382356 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:34.382341 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-chpds\"/\"default-dockercfg-pv8sl\"" Apr 20 12:59:34.382423 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:34.382378 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-chpds\"/\"kube-root-ca.crt\"" Apr 20 12:59:34.389338 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:34.389318 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-chpds/perf-node-gather-daemonset-m7kgd"] Apr 20 12:59:34.459026 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:34.458998 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b1205f85-d539-470c-a21d-8553264ffdb4-podres\") pod \"perf-node-gather-daemonset-m7kgd\" (UID: \"b1205f85-d539-470c-a21d-8553264ffdb4\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-m7kgd" Apr 20 12:59:34.459155 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:34.459056 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b1205f85-d539-470c-a21d-8553264ffdb4-lib-modules\") pod \"perf-node-gather-daemonset-m7kgd\" (UID: \"b1205f85-d539-470c-a21d-8553264ffdb4\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-m7kgd" Apr 20 12:59:34.459155 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:34.459080 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b1205f85-d539-470c-a21d-8553264ffdb4-proc\") pod \"perf-node-gather-daemonset-m7kgd\" (UID: \"b1205f85-d539-470c-a21d-8553264ffdb4\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-m7kgd" Apr 20 12:59:34.459155 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:34.459103 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b1205f85-d539-470c-a21d-8553264ffdb4-sys\") pod \"perf-node-gather-daemonset-m7kgd\" (UID: \"b1205f85-d539-470c-a21d-8553264ffdb4\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-m7kgd" Apr 20 12:59:34.459155 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:34.459120 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlgqf\" (UniqueName: \"kubernetes.io/projected/b1205f85-d539-470c-a21d-8553264ffdb4-kube-api-access-qlgqf\") pod \"perf-node-gather-daemonset-m7kgd\" (UID: \"b1205f85-d539-470c-a21d-8553264ffdb4\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-m7kgd" Apr 20 12:59:34.559622 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:34.559600 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b1205f85-d539-470c-a21d-8553264ffdb4-lib-modules\") pod \"perf-node-gather-daemonset-m7kgd\" (UID: \"b1205f85-d539-470c-a21d-8553264ffdb4\") " 
pod="openshift-must-gather-chpds/perf-node-gather-daemonset-m7kgd" Apr 20 12:59:34.559723 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:34.559626 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b1205f85-d539-470c-a21d-8553264ffdb4-proc\") pod \"perf-node-gather-daemonset-m7kgd\" (UID: \"b1205f85-d539-470c-a21d-8553264ffdb4\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-m7kgd" Apr 20 12:59:34.559723 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:34.559645 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b1205f85-d539-470c-a21d-8553264ffdb4-sys\") pod \"perf-node-gather-daemonset-m7kgd\" (UID: \"b1205f85-d539-470c-a21d-8553264ffdb4\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-m7kgd" Apr 20 12:59:34.559723 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:34.559664 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qlgqf\" (UniqueName: \"kubernetes.io/projected/b1205f85-d539-470c-a21d-8553264ffdb4-kube-api-access-qlgqf\") pod \"perf-node-gather-daemonset-m7kgd\" (UID: \"b1205f85-d539-470c-a21d-8553264ffdb4\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-m7kgd" Apr 20 12:59:34.559824 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:34.559723 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b1205f85-d539-470c-a21d-8553264ffdb4-podres\") pod \"perf-node-gather-daemonset-m7kgd\" (UID: \"b1205f85-d539-470c-a21d-8553264ffdb4\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-m7kgd" Apr 20 12:59:34.559824 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:34.559731 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b1205f85-d539-470c-a21d-8553264ffdb4-sys\") pod \"perf-node-gather-daemonset-m7kgd\" (UID: \"b1205f85-d539-470c-a21d-8553264ffdb4\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-m7kgd" Apr 20 12:59:34.559824 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:34.559725 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b1205f85-d539-470c-a21d-8553264ffdb4-proc\") pod \"perf-node-gather-daemonset-m7kgd\" (UID: \"b1205f85-d539-470c-a21d-8553264ffdb4\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-m7kgd" Apr 20 12:59:34.559824 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:34.559794 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b1205f85-d539-470c-a21d-8553264ffdb4-lib-modules\") pod \"perf-node-gather-daemonset-m7kgd\" (UID: \"b1205f85-d539-470c-a21d-8553264ffdb4\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-m7kgd" Apr 20 12:59:34.559938 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:34.559826 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b1205f85-d539-470c-a21d-8553264ffdb4-podres\") pod \"perf-node-gather-daemonset-m7kgd\" (UID: \"b1205f85-d539-470c-a21d-8553264ffdb4\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-m7kgd" Apr 20 12:59:34.567104 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:34.567086 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qlgqf\" (UniqueName: \"kubernetes.io/projected/b1205f85-d539-470c-a21d-8553264ffdb4-kube-api-access-qlgqf\") pod \"perf-node-gather-daemonset-m7kgd\" (UID: \"b1205f85-d539-470c-a21d-8553264ffdb4\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-m7kgd" Apr 20 12:59:34.689624 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:34.689539 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-chpds/perf-node-gather-daemonset-m7kgd" Apr 20 12:59:34.807499 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:34.807473 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-chpds/perf-node-gather-daemonset-m7kgd"] Apr 20 12:59:34.809820 ip-10-0-137-91 kubenswrapper[2580]: W0420 12:59:34.809788 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb1205f85_d539_470c_a21d_8553264ffdb4.slice/crio-1044d0f1d00adccc9decf6fa1db3e010ed1688b094663d1947302803d4f955c2 WatchSource:0}: Error finding container 1044d0f1d00adccc9decf6fa1db3e010ed1688b094663d1947302803d4f955c2: Status 404 returned error can't find the container with id 1044d0f1d00adccc9decf6fa1db3e010ed1688b094663d1947302803d4f955c2 Apr 20 12:59:35.021766 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:35.021737 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-kffw5_0b7a32b8-6e66-44fe-b5c2-72348a1935b3/dns/0.log" Apr 20 12:59:35.040658 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:35.040636 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-kffw5_0b7a32b8-6e66-44fe-b5c2-72348a1935b3/kube-rbac-proxy/0.log" Apr 20 12:59:35.152115 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:35.152088 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zwm76_900f999a-6b3b-4648-b015-7ca045ba8dcd/dns-node-resolver/0.log" Apr 20 12:59:35.376252 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:35.376170 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-chpds/perf-node-gather-daemonset-m7kgd" event={"ID":"b1205f85-d539-470c-a21d-8553264ffdb4","Type":"ContainerStarted","Data":"d894e056974bff13ed356e002447a8c70e7696a8e8011a8de7628b015bec2de6"} Apr 20 12:59:35.376252 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:35.376206 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-chpds/perf-node-gather-daemonset-m7kgd" event={"ID":"b1205f85-d539-470c-a21d-8553264ffdb4","Type":"ContainerStarted","Data":"1044d0f1d00adccc9decf6fa1db3e010ed1688b094663d1947302803d4f955c2"} Apr 20 12:59:35.376252 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:35.376236 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-chpds/perf-node-gather-daemonset-m7kgd" Apr 20 12:59:35.393634 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:35.393590 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-chpds/perf-node-gather-daemonset-m7kgd" podStartSLOduration=1.393577662 podStartE2EDuration="1.393577662s" podCreationTimestamp="2026-04-20 12:59:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 12:59:35.392196288 +0000 UTC m=+2709.112210278" watchObservedRunningTime="2026-04-20 12:59:35.393577662 +0000 UTC m=+2709.113591687" Apr 20 12:59:35.537873 ip-10-0-137-91 kubenswrapper[2580]: I0420 
12:59:35.537847 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-p26xh_1ab58f2d-5007-4797-8a83-489889f35e06/node-ca/0.log" Apr 20 12:59:36.524230 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:36.524194 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-mlr8x_24ad10ec-2768-4402-b312-c7462cdbf063/serve-healthcheck-canary/0.log" Apr 20 12:59:37.029930 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:37.029904 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kxx9j_05263fd7-7f8e-489b-bd5a-837b7e49f5bb/kube-rbac-proxy/0.log" Apr 20 12:59:37.048780 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:37.048758 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kxx9j_05263fd7-7f8e-489b-bd5a-837b7e49f5bb/exporter/0.log" Apr 20 12:59:37.067888 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:37.067867 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kxx9j_05263fd7-7f8e-489b-bd5a-837b7e49f5bb/extractor/0.log" Apr 20 12:59:38.667822 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:38.667793 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-jobset-operator_jobset-operator-747c5859c7-5hd7q_f860921a-209f-4315-abc7-2f35ba7024eb/jobset-operator/0.log" Apr 20 12:59:41.388193 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:41.388169 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-chpds/perf-node-gather-daemonset-m7kgd" Apr 20 12:59:43.066002 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:43.065976 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-26xn7_e3f51f42-77bf-412b-970c-03006a2ef077/kube-multus/0.log" Apr 20 12:59:43.252047 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:43.252024 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j99mh_0279eea6-b7aa-4e72-bec0-5aa87266cc8b/kube-multus-additional-cni-plugins/0.log" Apr 20 12:59:43.273471 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:43.273449 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j99mh_0279eea6-b7aa-4e72-bec0-5aa87266cc8b/egress-router-binary-copy/0.log" Apr 20 12:59:43.300426 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:43.300363 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j99mh_0279eea6-b7aa-4e72-bec0-5aa87266cc8b/cni-plugins/0.log" Apr 20 12:59:43.321665 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:43.321645 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j99mh_0279eea6-b7aa-4e72-bec0-5aa87266cc8b/bond-cni-plugin/0.log" Apr 20 12:59:43.346300 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:43.346283 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j99mh_0279eea6-b7aa-4e72-bec0-5aa87266cc8b/routeoverride-cni/0.log" Apr 20 12:59:43.372888 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:43.372865 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j99mh_0279eea6-b7aa-4e72-bec0-5aa87266cc8b/whereabouts-cni-bincopy/0.log" Apr 20 12:59:43.403582 ip-10-0-137-91 kubenswrapper[2580]: I0420 
12:59:43.403564 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j99mh_0279eea6-b7aa-4e72-bec0-5aa87266cc8b/whereabouts-cni/0.log" Apr 20 12:59:43.784108 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:43.784082 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-nm452_430e704c-5d70-4df6-baaa-2296216f1239/network-metrics-daemon/0.log" Apr 20 12:59:43.809948 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:43.809921 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-nm452_430e704c-5d70-4df6-baaa-2296216f1239/kube-rbac-proxy/0.log" Apr 20 12:59:44.523569 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:44.523547 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dwphh_2da468d5-2794-41ba-8344-4246be8732d7/ovn-controller/0.log" Apr 20 12:59:44.551170 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:44.551146 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dwphh_2da468d5-2794-41ba-8344-4246be8732d7/ovn-acl-logging/0.log" Apr 20 12:59:44.568081 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:44.568060 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dwphh_2da468d5-2794-41ba-8344-4246be8732d7/kube-rbac-proxy-node/0.log" Apr 20 12:59:44.588342 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:44.588317 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dwphh_2da468d5-2794-41ba-8344-4246be8732d7/kube-rbac-proxy-ovn-metrics/0.log" Apr 20 12:59:44.607386 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:44.607364 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dwphh_2da468d5-2794-41ba-8344-4246be8732d7/northd/0.log" Apr 20 12:59:44.624372 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:44.624350 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dwphh_2da468d5-2794-41ba-8344-4246be8732d7/nbdb/0.log" Apr 20 12:59:44.643700 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:44.643682 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dwphh_2da468d5-2794-41ba-8344-4246be8732d7/sbdb/0.log" Apr 20 12:59:44.731053 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:44.731032 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dwphh_2da468d5-2794-41ba-8344-4246be8732d7/ovnkube-controller/0.log" Apr 20 12:59:46.213694 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:46.213664 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-cz8d9_deac4f19-5105-40df-bd7a-9d7c576cd705/network-check-target-container/0.log" Apr 20 12:59:47.159615 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:47.159590 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-cf8lw_261e5e12-8ebc-4a49-9b69-be511d818e12/iptables-alerter/0.log" Apr 20 12:59:47.831360 ip-10-0-137-91 kubenswrapper[2580]: I0420 12:59:47.831332 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-csctt_f8dc897f-a714-4520-8393-707949cd3be7/tuned/0.log"