Apr 16 20:27:36.613606 ip-10-0-139-150 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 20:27:37.049127 ip-10-0-139-150 kubenswrapper[2565]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:27:37.049127 ip-10-0-139-150 kubenswrapper[2565]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 20:27:37.049127 ip-10-0-139-150 kubenswrapper[2565]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:27:37.049127 ip-10-0-139-150 kubenswrapper[2565]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 20:27:37.049127 ip-10-0-139-150 kubenswrapper[2565]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:27:37.049849 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.049756    2565 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 20:27:37.055645 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055612    2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:27:37.055743 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055725    2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:27:37.055743 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055731    2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:27:37.055743 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055734    2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:27:37.055743 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055737    2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:27:37.055743 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055741    2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:27:37.055743 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055744    2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:27:37.055902 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055750    2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:27:37.055902 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055753    2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:27:37.055902 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055755    2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:27:37.055902 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055760    2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:27:37.055902 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055765    2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:27:37.055902 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055768    2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:27:37.055902 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055771    2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:27:37.055902 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055774    2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:27:37.055902 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055777    2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:27:37.055902 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055779    2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:27:37.055902 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055782    2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:27:37.055902 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055785    2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:27:37.055902 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055787    2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:27:37.055902 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055790    2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:27:37.055902 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055793    2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:27:37.055902 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055796    2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:27:37.055902 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055798    2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:27:37.055902 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055801    2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:27:37.055902 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055804    2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:27:37.056458 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055808    2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:27:37.056458 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055811    2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:27:37.056458 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055814    2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:27:37.056458 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055817    2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:27:37.056458 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055819    2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:27:37.056458 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055822    2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:27:37.056458 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055825    2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:27:37.056458 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055828    2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:27:37.056458 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055830    2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:27:37.056458 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055833    2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:27:37.056458 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055835    2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:27:37.056458 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055839    2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:27:37.056458 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055841    2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:27:37.056458 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055844    2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:27:37.056458 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055848    2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:27:37.056458 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055853    2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:27:37.056458 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055856    2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:27:37.056458 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055859    2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:27:37.056458 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055862    2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:27:37.056458 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055864    2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:27:37.056950 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055867    2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:27:37.056950 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055869    2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:27:37.056950 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055882    2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:27:37.056950 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055887    2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:27:37.056950 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055889    2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:27:37.056950 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055892    2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:27:37.056950 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055895    2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:27:37.056950 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055897    2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:27:37.056950 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055900    2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:27:37.056950 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055903    2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:27:37.056950 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055905    2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:27:37.056950 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055908    2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:27:37.056950 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055911    2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:27:37.056950 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055914    2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:27:37.056950 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055917    2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:27:37.056950 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055919    2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:27:37.056950 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055922    2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:27:37.056950 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055924    2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:27:37.056950 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055927    2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:27:37.056950 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055929    2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:27:37.057456 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055932    2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:27:37.057456 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055934    2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:27:37.057456 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055937    2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:27:37.057456 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055940    2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:27:37.057456 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055942    2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:27:37.057456 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055945    2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:27:37.057456 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055948    2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:27:37.057456 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055951    2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:27:37.057456 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055953    2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:27:37.057456 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055956    2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:27:37.057456 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055958    2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:27:37.057456 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055961    2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:27:37.057456 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055967    2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:27:37.057456 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055971    2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:27:37.057456 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055975    2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:27:37.057456 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055978    2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:27:37.057456 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055981    2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:27:37.057456 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055983    2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:27:37.057456 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055986    2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:27:37.057913 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.055989    2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:27:37.057913 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056413    2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:27:37.057913 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056419    2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:27:37.057913 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056423    2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:27:37.057913 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056427    2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:27:37.057913 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056431    2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:27:37.057913 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056433    2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:27:37.057913 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056436    2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:27:37.057913 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056439    2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:27:37.057913 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056442    2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:27:37.057913 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056444    2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:27:37.057913 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056447    2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:27:37.057913 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056451    2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:27:37.057913 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056453    2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:27:37.057913 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056456    2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:27:37.057913 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056460    2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:27:37.057913 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056463    2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:27:37.057913 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056466    2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:27:37.057913 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056468    2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:27:37.057913 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056471    2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:27:37.058420 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056474    2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:27:37.058420 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056477    2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:27:37.058420 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056480    2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:27:37.058420 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056482    2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:27:37.058420 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056485    2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:27:37.058420 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056489    2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:27:37.058420 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056491    2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:27:37.058420 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056494    2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:27:37.058420 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056496    2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:27:37.058420 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056499    2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:27:37.058420 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056501    2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:27:37.058420 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056504    2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:27:37.058420 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056506    2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:27:37.058420 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056510    2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:27:37.058420 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056512    2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:27:37.058420 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056515    2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:27:37.058420 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056518    2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:27:37.058420 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056520    2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:27:37.058420 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056523    2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:27:37.058420 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056525    2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:27:37.058921 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056528    2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:27:37.058921 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056530    2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:27:37.058921 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056533    2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:27:37.058921 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056535    2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:27:37.058921 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056538    2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:27:37.058921 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056540    2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:27:37.058921 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056543    2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:27:37.058921 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056545    2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:27:37.058921 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056548    2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:27:37.058921 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056550    2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:27:37.058921 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056553    2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:27:37.058921 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056556    2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:27:37.058921 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056558    2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:27:37.058921 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056561    2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:27:37.058921 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056563    2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:27:37.058921 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056566    2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:27:37.058921 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056569    2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:27:37.058921 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056571    2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:27:37.058921 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056574    2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:27:37.058921 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056577    2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:27:37.059430 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056579    2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:27:37.059430 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056582    2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:27:37.059430 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056585    2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:27:37.059430 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056588    2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:27:37.059430 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056590    2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:27:37.059430 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056593    2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:27:37.059430 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056595    2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:27:37.059430 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056598    2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:27:37.059430 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056600    2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:27:37.059430 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056603    2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:27:37.059430 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056605    2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:27:37.059430 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056608    2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:27:37.059430 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056610    2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:27:37.059430 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056613    2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:27:37.059430 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056615    2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:27:37.059430 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056618    2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:27:37.059430 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056620    2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:27:37.059430 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056624    2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:27:37.059430 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056626    2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:27:37.059904 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056630    2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:27:37.059904 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056633    2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:27:37.059904 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056637    2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:27:37.059904 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056639    2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:27:37.059904 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056643    2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:27:37.059904 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056645    2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:27:37.059904 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056648    2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:27:37.059904 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.056651    2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:27:37.059904 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056721    2565 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 20:27:37.059904 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056728    2565 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 20:27:37.059904 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056734    2565 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 20:27:37.059904 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056739    2565 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 20:27:37.059904 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056744    2565 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 20:27:37.059904 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056747    2565 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 20:27:37.059904 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056752    2565 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 20:27:37.059904 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056757    2565 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 20:27:37.059904 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056760    2565 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 20:27:37.059904 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056764    2565 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 20:27:37.059904 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056767    2565 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 20:27:37.059904 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056770    2565 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 20:27:37.059904 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056774    2565 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 20:27:37.060426 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056777    2565 flags.go:64] FLAG: --cgroup-root=""
Apr 16 20:27:37.060426 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056779    2565 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 20:27:37.060426 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056782    2565 flags.go:64] FLAG: --client-ca-file=""
Apr 16 20:27:37.060426 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056785    2565 flags.go:64] FLAG: --cloud-config=""
Apr 16 20:27:37.060426 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056788    2565 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 20:27:37.060426 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056791    2565 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 20:27:37.060426 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056796    2565 flags.go:64] FLAG: --cluster-domain=""
Apr 16 20:27:37.060426 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056798    2565 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 20:27:37.060426 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056802    2565 flags.go:64] FLAG: --config-dir=""
Apr 16 20:27:37.060426 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056804    2565 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 20:27:37.060426 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056808    2565 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 20:27:37.060426 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056812    2565 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 20:27:37.060426 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056814    2565 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 20:27:37.060426 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056818    2565 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 20:27:37.060426 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056821    2565 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 20:27:37.060426 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056824    2565 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 20:27:37.060426 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056827    2565 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 20:27:37.060426 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056830    2565 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 20:27:37.060426 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056833    2565 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 20:27:37.060426 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056836    2565 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 20:27:37.060426 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056841    2565 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 20:27:37.060426 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056844    2565 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 20:27:37.060426 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056847    2565 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 20:27:37.060426 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056850    2565 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 20:27:37.060426 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056853    2565 flags.go:64] FLAG: --enable-server="true"
Apr 16 20:27:37.061052 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056856    2565 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 20:27:37.061052 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056861    2565 flags.go:64] FLAG: --event-burst="100"
Apr 16 20:27:37.061052 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056864    2565 flags.go:64] FLAG: --event-qps="50"
Apr 16 20:27:37.061052 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056867    2565 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 20:27:37.061052 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056869    2565 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 20:27:37.061052 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056872    2565 flags.go:64] FLAG: --eviction-hard=""
Apr 16 20:27:37.061052 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056877    2565 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 20:27:37.061052 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056880    2565 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 20:27:37.061052 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056883    2565 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 20:27:37.061052 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056885    2565 flags.go:64] FLAG: --eviction-soft=""
Apr 16 20:27:37.061052 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056888    2565 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 20:27:37.061052 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056891    2565 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 20:27:37.061052 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056894    2565 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 20:27:37.061052 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056897    2565 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 20:27:37.061052 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056900    2565 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 20:27:37.061052 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056903    2565 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 20:27:37.061052 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056906    2565 flags.go:64] FLAG: --feature-gates=""
Apr 16 20:27:37.061052 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056909    2565 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 20:27:37.061052 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056913    2565 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 20:27:37.061052 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056915    2565 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 20:27:37.061052 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056919    2565 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 20:27:37.061052 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056922    2565 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 20:27:37.061052 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056925    2565 flags.go:64] FLAG: --help="false"
Apr 16 20:27:37.061052 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056928    2565 flags.go:64] FLAG: --hostname-override="ip-10-0-139-150.ec2.internal"
Apr 16 20:27:37.061052
ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056931 2565 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 20:27:37.061666 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056934 2565 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 20:27:37.061666 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056937 2565 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 20:27:37.061666 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056941 2565 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 20:27:37.061666 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056944 2565 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 20:27:37.061666 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056947 2565 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 20:27:37.061666 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056950 2565 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 20:27:37.061666 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056953 2565 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 20:27:37.061666 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056959 2565 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 20:27:37.061666 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056963 2565 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 20:27:37.061666 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056966 2565 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 20:27:37.061666 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056969 2565 flags.go:64] FLAG: --kube-reserved="" Apr 16 20:27:37.061666 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056972 2565 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 20:27:37.061666 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056975 2565 flags.go:64] FLAG: 
--kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 20:27:37.061666 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056978 2565 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 20:27:37.061666 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056981 2565 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 20:27:37.061666 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056984 2565 flags.go:64] FLAG: --lock-file="" Apr 16 20:27:37.061666 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056987 2565 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 20:27:37.061666 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056990 2565 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 20:27:37.061666 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056993 2565 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 20:27:37.061666 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.056998 2565 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 20:27:37.061666 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057001 2565 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 20:27:37.061666 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057004 2565 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 20:27:37.061666 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057007 2565 flags.go:64] FLAG: --logging-format="text" Apr 16 20:27:37.062269 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057010 2565 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 20:27:37.062269 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057014 2565 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 20:27:37.062269 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057016 2565 flags.go:64] FLAG: --manifest-url="" Apr 16 20:27:37.062269 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057019 2565 flags.go:64] FLAG: --manifest-url-header="" Apr 16 20:27:37.062269 ip-10-0-139-150 
kubenswrapper[2565]: I0416 20:27:37.057023 2565 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 20:27:37.062269 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057026 2565 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 20:27:37.062269 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057031 2565 flags.go:64] FLAG: --max-pods="110" Apr 16 20:27:37.062269 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057034 2565 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 20:27:37.062269 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057037 2565 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 20:27:37.062269 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057040 2565 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 20:27:37.062269 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057046 2565 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 20:27:37.062269 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057049 2565 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 20:27:37.062269 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057052 2565 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 20:27:37.062269 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057055 2565 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 20:27:37.062269 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057063 2565 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 20:27:37.062269 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057066 2565 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 20:27:37.062269 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057069 2565 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 20:27:37.062269 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057073 2565 flags.go:64] FLAG: --pod-cidr="" Apr 16 20:27:37.062269 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057076 2565 
flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 20:27:37.062269 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057081 2565 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 20:27:37.062269 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057084 2565 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 20:27:37.062269 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057087 2565 flags.go:64] FLAG: --pods-per-core="0" Apr 16 20:27:37.062269 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057090 2565 flags.go:64] FLAG: --port="10250" Apr 16 20:27:37.062269 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057093 2565 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 20:27:37.062870 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057097 2565 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-05eb78152ba38f651" Apr 16 20:27:37.062870 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057100 2565 flags.go:64] FLAG: --qos-reserved="" Apr 16 20:27:37.062870 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057103 2565 flags.go:64] FLAG: --read-only-port="10255" Apr 16 20:27:37.062870 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057106 2565 flags.go:64] FLAG: --register-node="true" Apr 16 20:27:37.062870 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057109 2565 flags.go:64] FLAG: --register-schedulable="true" Apr 16 20:27:37.062870 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057112 2565 flags.go:64] FLAG: --register-with-taints="" Apr 16 20:27:37.062870 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057116 2565 flags.go:64] FLAG: --registry-burst="10" Apr 16 20:27:37.062870 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057119 2565 flags.go:64] FLAG: --registry-qps="5" Apr 16 20:27:37.062870 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057121 2565 flags.go:64] FLAG: --reserved-cpus="" 
Apr 16 20:27:37.062870 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057127 2565 flags.go:64] FLAG: --reserved-memory="" Apr 16 20:27:37.062870 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057131 2565 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 20:27:37.062870 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057135 2565 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 20:27:37.062870 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057138 2565 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 20:27:37.062870 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057141 2565 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 20:27:37.062870 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057144 2565 flags.go:64] FLAG: --runonce="false" Apr 16 20:27:37.062870 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057147 2565 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 20:27:37.062870 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057150 2565 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 20:27:37.062870 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057153 2565 flags.go:64] FLAG: --seccomp-default="false" Apr 16 20:27:37.062870 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057158 2565 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 20:27:37.062870 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057161 2565 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 20:27:37.062870 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057164 2565 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 20:27:37.062870 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057167 2565 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 20:27:37.062870 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057170 2565 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 20:27:37.062870 ip-10-0-139-150 kubenswrapper[2565]: I0416 
20:27:37.057173 2565 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 20:27:37.062870 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057176 2565 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 20:27:37.062870 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057179 2565 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 20:27:37.063516 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057183 2565 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 20:27:37.063516 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057186 2565 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 20:27:37.063516 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057189 2565 flags.go:64] FLAG: --system-cgroups="" Apr 16 20:27:37.063516 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057191 2565 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 20:27:37.063516 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057196 2565 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 20:27:37.063516 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057199 2565 flags.go:64] FLAG: --tls-cert-file="" Apr 16 20:27:37.063516 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057202 2565 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 20:27:37.063516 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057207 2565 flags.go:64] FLAG: --tls-min-version="" Apr 16 20:27:37.063516 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057210 2565 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 20:27:37.063516 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057212 2565 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 20:27:37.063516 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057215 2565 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 20:27:37.063516 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057218 2565 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 
20:27:37.063516 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057221 2565 flags.go:64] FLAG: --v="2" Apr 16 20:27:37.063516 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057225 2565 flags.go:64] FLAG: --version="false" Apr 16 20:27:37.063516 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057229 2565 flags.go:64] FLAG: --vmodule="" Apr 16 20:27:37.063516 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057235 2565 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 20:27:37.063516 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.057238 2565 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 20:27:37.063516 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057361 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 20:27:37.063516 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057366 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 20:27:37.063516 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057369 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 20:27:37.063516 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057372 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 20:27:37.063516 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057375 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 20:27:37.063516 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057378 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 20:27:37.063516 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057381 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 20:27:37.064162 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057385 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 20:27:37.064162 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057388 
2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 20:27:37.064162 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057391 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 20:27:37.064162 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057394 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 20:27:37.064162 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057397 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 20:27:37.064162 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057401 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 20:27:37.064162 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057405 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 20:27:37.064162 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057408 2565 feature_gate.go:328] unrecognized feature gate: Example Apr 16 20:27:37.064162 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057411 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 20:27:37.064162 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057414 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 20:27:37.064162 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057417 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 20:27:37.064162 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057419 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 20:27:37.064162 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057422 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 20:27:37.064162 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057424 2565 feature_gate.go:328] unrecognized feature gate: 
MachineAPIMigration Apr 16 20:27:37.064162 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057427 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 20:27:37.064162 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057430 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 20:27:37.064162 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057432 2565 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 20:27:37.064162 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057434 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 20:27:37.064162 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057437 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 20:27:37.064663 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057440 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 20:27:37.064663 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057442 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 20:27:37.064663 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057445 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 20:27:37.064663 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057447 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 20:27:37.064663 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057451 2565 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 20:27:37.064663 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057454 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 20:27:37.064663 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057456 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 20:27:37.064663 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057459 2565 
feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 20:27:37.064663 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057462 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 20:27:37.064663 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057465 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 20:27:37.064663 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057468 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 20:27:37.064663 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057471 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 20:27:37.064663 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057473 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 20:27:37.064663 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057477 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 20:27:37.064663 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057480 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 20:27:37.064663 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057482 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 20:27:37.064663 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057485 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 20:27:37.064663 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057487 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 20:27:37.064663 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057490 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 20:27:37.064663 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057493 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 20:27:37.065165 
ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057495 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 20:27:37.065165 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057498 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 20:27:37.065165 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057500 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 20:27:37.065165 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057503 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 20:27:37.065165 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057505 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 20:27:37.065165 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057508 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 20:27:37.065165 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057510 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 20:27:37.065165 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057513 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 20:27:37.065165 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057515 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 20:27:37.065165 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057518 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 20:27:37.065165 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057520 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 20:27:37.065165 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057523 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 20:27:37.065165 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057525 2565 feature_gate.go:328] unrecognized 
feature gate: NetworkSegmentation Apr 16 20:27:37.065165 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057528 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 20:27:37.065165 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057531 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 20:27:37.065165 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057533 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 20:27:37.065165 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057537 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 20:27:37.065165 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057539 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 20:27:37.065165 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057543 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 20:27:37.065637 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057546 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 20:27:37.065637 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057550 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 20:27:37.065637 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057552 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 20:27:37.065637 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057555 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 20:27:37.065637 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057558 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 20:27:37.065637 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057560 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 20:27:37.065637 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057564 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 20:27:37.065637 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057566 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 20:27:37.065637 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057569 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 20:27:37.065637 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057572 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 20:27:37.065637 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057574 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 20:27:37.065637 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057577 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 20:27:37.065637 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057579 
2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 20:27:37.065637 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057582 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 20:27:37.065637 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057584 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 20:27:37.065637 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057587 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 20:27:37.065637 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057589 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 20:27:37.065637 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057592 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 20:27:37.065637 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057595 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 20:27:37.066106 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057597 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 20:27:37.066106 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.057600 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 20:27:37.066106 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.058460 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 20:27:37.066220 ip-10-0-139-150 
kubenswrapper[2565]: I0416 20:27:37.066200 2565 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 20:27:37.066251 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.066221 2565 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 20:27:37.066295 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066269 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 20:27:37.066295 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066291 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 20:27:37.066295 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066295 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 20:27:37.066379 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066299 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 20:27:37.066379 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066304 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 20:27:37.066379 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066308 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 20:27:37.066379 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066310 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 20:27:37.066379 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066313 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 20:27:37.066379 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066316 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 20:27:37.066379 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066319 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 20:27:37.066379 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066322 2565 feature_gate.go:328] 
unrecognized feature gate: KMSEncryptionProvider Apr 16 20:27:37.066379 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066324 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 20:27:37.066379 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066327 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 20:27:37.066379 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066330 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 20:27:37.066379 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066332 2565 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 20:27:37.066379 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066336 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 20:27:37.066379 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066338 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 20:27:37.066379 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066341 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 20:27:37.066379 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066343 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 20:27:37.066379 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066346 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 20:27:37.066379 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066349 2565 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 20:27:37.066379 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066351 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 20:27:37.066379 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066354 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 20:27:37.066861 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066356 
2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 20:27:37.066861 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066359 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 20:27:37.066861 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066362 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 20:27:37.066861 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066365 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 20:27:37.066861 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066368 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 20:27:37.066861 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066370 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 20:27:37.066861 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066373 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 20:27:37.066861 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066376 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 20:27:37.066861 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066379 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 20:27:37.066861 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066381 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 20:27:37.066861 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066384 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 20:27:37.066861 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066387 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 20:27:37.066861 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066390 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 20:27:37.066861 
ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066393 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 20:27:37.066861 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066396 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 20:27:37.066861 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066400 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 20:27:37.066861 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066403 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 20:27:37.066861 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066406 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 20:27:37.066861 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066409 2565 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 20:27:37.066861 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066412 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 20:27:37.067443 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066414 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 20:27:37.067443 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066417 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 20:27:37.067443 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066419 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 20:27:37.067443 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066422 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 20:27:37.067443 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066424 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 20:27:37.067443 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066427 2565 
feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 20:27:37.067443 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066429 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 20:27:37.067443 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066432 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 20:27:37.067443 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066434 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 20:27:37.067443 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066437 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 20:27:37.067443 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066439 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 20:27:37.067443 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066442 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 20:27:37.067443 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066444 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 20:27:37.067443 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066448 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 20:27:37.067443 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066451 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 20:27:37.067443 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066454 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 20:27:37.067443 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066456 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 20:27:37.067443 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066459 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI 
Apr 16 20:27:37.067443 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066462 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 20:27:37.067923 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066464 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 20:27:37.067923 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066467 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 20:27:37.067923 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066470 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 20:27:37.067923 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066472 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 20:27:37.067923 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066475 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 20:27:37.067923 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066477 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 20:27:37.067923 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066480 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 20:27:37.067923 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066483 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 20:27:37.067923 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066485 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 20:27:37.067923 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066488 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 20:27:37.067923 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066490 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 20:27:37.067923 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066492 2565 feature_gate.go:328] 
unrecognized feature gate: MultiDiskSetup Apr 16 20:27:37.067923 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066495 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 20:27:37.067923 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066498 2565 feature_gate.go:328] unrecognized feature gate: Example Apr 16 20:27:37.067923 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066500 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 20:27:37.067923 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066503 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 20:27:37.067923 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066505 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 20:27:37.067923 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066508 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 20:27:37.067923 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066510 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 20:27:37.067923 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066513 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 20:27:37.068420 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066516 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 20:27:37.068420 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066518 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 20:27:37.068420 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066521 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 20:27:37.068420 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066524 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 20:27:37.068420 
ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.066529 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 20:27:37.068420 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066623 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 20:27:37.068420 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066628 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 20:27:37.068420 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066632 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 20:27:37.068420 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066635 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 20:27:37.068420 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066639 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 20:27:37.068420 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066641 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 20:27:37.068420 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066644 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 20:27:37.068420 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066647 2565 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 20:27:37.068420 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066649 2565 feature_gate.go:328] unrecognized feature gate: 
InsightsOnDemandDataGather Apr 16 20:27:37.068420 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066652 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 20:27:37.068801 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066655 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 20:27:37.068801 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066657 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 20:27:37.068801 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066660 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 20:27:37.068801 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066662 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 20:27:37.068801 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066665 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 20:27:37.068801 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066668 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 20:27:37.068801 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066671 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 20:27:37.068801 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066673 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 20:27:37.068801 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066676 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 20:27:37.068801 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066678 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 20:27:37.068801 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066680 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 20:27:37.068801 ip-10-0-139-150 
kubenswrapper[2565]: W0416 20:27:37.066683 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 20:27:37.068801 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066686 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 20:27:37.068801 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066690 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 20:27:37.068801 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066693 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 20:27:37.068801 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066695 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 20:27:37.068801 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066698 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 20:27:37.068801 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066700 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 20:27:37.068801 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066703 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 20:27:37.068801 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066705 2565 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 20:27:37.069289 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066707 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 20:27:37.069289 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066710 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 20:27:37.069289 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066712 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 20:27:37.069289 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066715 2565 feature_gate.go:328] unrecognized feature 
gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 20:27:37.069289 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066718 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 20:27:37.069289 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066720 2565 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 20:27:37.069289 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066723 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 20:27:37.069289 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066726 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 20:27:37.069289 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066728 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 20:27:37.069289 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066731 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 20:27:37.069289 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066733 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 20:27:37.069289 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066736 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 20:27:37.069289 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066739 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 20:27:37.069289 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066741 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 20:27:37.069289 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066744 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 20:27:37.069289 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066746 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 20:27:37.069289 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066749 2565 
feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 20:27:37.069289 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066752 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 20:27:37.069289 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066754 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 20:27:37.069289 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066757 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 20:27:37.069767 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066759 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 20:27:37.069767 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066762 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 20:27:37.069767 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066764 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 20:27:37.069767 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066767 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 20:27:37.069767 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066769 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 20:27:37.069767 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066772 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 20:27:37.069767 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066774 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 20:27:37.069767 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066777 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 20:27:37.069767 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066779 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 20:27:37.069767 
ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066782 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 20:27:37.069767 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066784 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 20:27:37.069767 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066787 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 20:27:37.069767 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066789 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 20:27:37.069767 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066791 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 20:27:37.069767 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066794 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 20:27:37.069767 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066797 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 20:27:37.069767 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066799 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 20:27:37.069767 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066802 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 20:27:37.069767 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066804 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 20:27:37.069767 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066806 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 20:27:37.070253 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066809 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 20:27:37.070253 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066812 2565 feature_gate.go:328] 
unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 20:27:37.070253 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066814 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 20:27:37.070253 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066817 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 20:27:37.070253 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066819 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 20:27:37.070253 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066822 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 20:27:37.070253 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066824 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 20:27:37.070253 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066828 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 20:27:37.070253 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066832 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 20:27:37.070253 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066836 2565 feature_gate.go:328] unrecognized feature gate: Example Apr 16 20:27:37.070253 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066841 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 20:27:37.070253 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066843 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 20:27:37.070253 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066846 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 20:27:37.070253 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066848 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 20:27:37.070253 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066851 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 20:27:37.070253 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:37.066853 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 20:27:37.070651 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.066858 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 20:27:37.070651 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.067594 2565 server.go:962] "Client rotation is on, will bootstrap in background" Apr 
16 20:27:37.071260 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.071244 2565 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 20:27:37.072203 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.072191 2565 server.go:1019] "Starting client certificate rotation" Apr 16 20:27:37.072321 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.072303 2565 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 20:27:37.072359 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.072341 2565 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 20:27:37.094637 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.094619 2565 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 20:27:37.096476 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.096462 2565 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 20:27:37.110669 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.110648 2565 log.go:25] "Validated CRI v1 runtime API" Apr 16 20:27:37.116106 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.116090 2565 log.go:25] "Validated CRI v1 image API" Apr 16 20:27:37.118696 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.118681 2565 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 20:27:37.124122 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.124098 2565 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 9a627242-0d5a-4edd-906f-605afca2a599:/dev/nvme0n1p4 a4a6470d-9bf0-430f-9067-333082141028:/dev/nvme0n1p3] Apr 16 20:27:37.124176 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.124122 2565 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot 
major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 20:27:37.128251 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.128230 2565 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 20:27:37.129846 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.129700 2565 manager.go:217] Machine: {Timestamp:2026-04-16 20:27:37.12786583 +0000 UTC m=+0.395608273 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3077631 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec288844d47d25bb87f06604c1632072 SystemUUID:ec288844-d47d-25bb-87f0-6604c1632072 BootID:1b376850-5eee-496d-a4f1-cbc1132761d8 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 
Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:73:d5:fa:be:5f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:73:d5:fa:be:5f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:9a:32:ac:15:68:94 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 20:27:37.129846 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.129837 2565 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 16 20:27:37.129976 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.129926 2565 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 20:27:37.130939 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.130914 2565 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 20:27:37.131112 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.130941 2565 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-139-150.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 20:27:37.131196 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.131125 2565 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 20:27:37.131196 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.131139 2565 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 20:27:37.131196 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.131157 2565 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 20:27:37.131970 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.131958 2565 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 20:27:37.133326 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.133314 2565 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 20:27:37.133454 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.133443 2565 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 20:27:37.136349 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.136337 2565 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 20:27:37.136419 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.136362 2565 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 20:27:37.136419 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.136378 2565 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 20:27:37.136419 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.136393 2565 kubelet.go:397] "Adding apiserver pod source"
Apr 16 20:27:37.136419 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.136406 2565 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 20:27:37.137526 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.137513 2565 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 20:27:37.137590 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.137535 2565 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 20:27:37.140587 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.140565 2565 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 20:27:37.141899 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.141886 2565 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 20:27:37.143664 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.143652 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 20:27:37.143705 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.143670 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 20:27:37.143705 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.143676 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 20:27:37.143705 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.143682 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 20:27:37.143705 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.143687 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 20:27:37.143705 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.143693 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 20:27:37.143705 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.143699 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 20:27:37.143705 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.143704 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 20:27:37.143895 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.143713 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 20:27:37.143895 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.143719 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 20:27:37.143895 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.143729 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 20:27:37.143895 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.143737 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 20:27:37.144942 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.144932 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 20:27:37.144942 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.144942 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 20:27:37.145418 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.145399 2565 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-tzs27"
Apr 16 20:27:37.148448 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.148436 2565 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 20:27:37.148496 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.148475 2565 server.go:1295] "Started kubelet"
Apr 16 20:27:37.148621 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.148587 2565 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 20:27:37.148985 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.148943 2565 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 20:27:37.149050 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.149010 2565 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 20:27:37.149254 ip-10-0-139-150 systemd[1]: Started Kubernetes Kubelet.
Apr 16 20:27:37.150087 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.150071 2565 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 20:27:37.150390 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.150351 2565 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-tzs27"
Apr 16 20:27:37.150678 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.150663 2565 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 20:27:37.154807 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:37.154755 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-139-150.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 20:27:37.155582 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.155557 2565 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-139-150.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 20:27:37.155698 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:37.155631 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 20:27:37.159664 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.159644 2565 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 20:27:37.160310 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.160294 2565 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 20:27:37.160660 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:37.160637 2565 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 20:27:37.160919 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.160903 2565 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 20:27:37.160978 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.160922 2565 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 20:27:37.160978 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.160906 2565 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 20:27:37.160978 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.160950 2565 factory.go:55] Registering systemd factory
Apr 16 20:27:37.160978 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.160905 2565 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 20:27:37.160978 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.160959 2565 factory.go:223] Registration of the systemd container factory successfully
Apr 16 20:27:37.161199 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.161006 2565 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 20:27:37.161199 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.161015 2565 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 20:27:37.161199 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:37.161122 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-150.ec2.internal\" not found"
Apr 16 20:27:37.161345 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.161201 2565 factory.go:153] Registering CRI-O factory
Apr 16 20:27:37.161345 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.161213 2565 factory.go:223] Registration of the crio container factory successfully
Apr 16 20:27:37.161345 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.161233 2565 factory.go:103] Registering Raw factory
Apr 16 20:27:37.161345 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.161242 2565 manager.go:1196] Started watching for new ooms in manager
Apr 16 20:27:37.161668 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.161651 2565 manager.go:319] Starting recovery of all containers
Apr 16 20:27:37.163406 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.163387 2565 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 20:27:37.165310 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:37.165264 2565 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-139-150.ec2.internal\" not found" node="ip-10-0-139-150.ec2.internal"
Apr 16 20:27:37.172864 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.172725 2565 manager.go:324] Recovery completed
Apr 16 20:27:37.177329 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.177317 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:27:37.180764 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.180750 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-150.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:27:37.180823 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.180776 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-150.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:27:37.180823 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.180792 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-150.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:27:37.181249 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.181238 2565 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 20:27:37.181249 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.181248 2565 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 20:27:37.181351 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.181263 2565 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 20:27:37.183502 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.183491 2565 policy_none.go:49] "None policy: Start"
Apr 16 20:27:37.183546 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.183506 2565 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 20:27:37.183546 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.183515 2565 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 20:27:37.220872 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.220856 2565 manager.go:341] "Starting Device Plugin manager"
Apr 16 20:27:37.237796 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:37.220897 2565 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 20:27:37.237796 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.220910 2565 server.go:85] "Starting device plugin registration server"
Apr 16 20:27:37.237796 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.221167 2565 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 20:27:37.237796 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.221178 2565 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 20:27:37.237796 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.221349 2565 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 20:27:37.237796 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.221423 2565 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 20:27:37.237796 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.221432 2565 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 20:27:37.237796 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:37.222333 2565 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 20:27:37.237796 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:37.222371 2565 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-150.ec2.internal\" not found"
Apr 16 20:27:37.305962 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.305889 2565 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 20:27:37.307097 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.307080 2565 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 20:27:37.307180 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.307105 2565 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 20:27:37.307180 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.307128 2565 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 20:27:37.307180 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.307137 2565 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 20:27:37.307308 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:37.307177 2565 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 20:27:37.309329 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.309307 2565 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 20:27:37.321982 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.321966 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:27:37.322904 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.322889 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-150.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:27:37.322971 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.322917 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-150.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:27:37.322971 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.322928 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-150.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:27:37.322971 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.322950 2565 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-150.ec2.internal"
Apr 16 20:27:37.330808 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.330795 2565 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-150.ec2.internal"
Apr 16 20:27:37.330851 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:37.330816 2565 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-139-150.ec2.internal\": node \"ip-10-0-139-150.ec2.internal\" not found"
Apr 16 20:27:37.347866 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:37.347847 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-150.ec2.internal\" not found"
Apr 16 20:27:37.407770 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.407750 2565 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-150.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-139-150.ec2.internal"]
Apr 16 20:27:37.407822 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.407810 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:27:37.408607 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.408591 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-150.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:27:37.408669 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.408616 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-150.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:27:37.408669 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.408625 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-150.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:27:37.409820 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.409807 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:27:37.409957 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.409943 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-150.ec2.internal"
Apr 16 20:27:37.409997 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.409971 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:27:37.410477 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.410459 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-150.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:27:37.410477 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.410469 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-150.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:27:37.410589 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.410489 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-150.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:27:37.410589 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.410491 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-150.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:27:37.410589 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.410515 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-150.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:27:37.410589 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.410502 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-150.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:27:37.411819 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.411801 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-150.ec2.internal"
Apr 16 20:27:37.411912 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.411824 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:27:37.412941 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.412927 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-150.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:27:37.413019 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.412961 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-150.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:27:37.413019 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.412978 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-150.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:27:37.439766 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:37.439744 2565 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-150.ec2.internal\" not found" node="ip-10-0-139-150.ec2.internal"
Apr 16 20:27:37.444217 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:37.444201 2565 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-150.ec2.internal\" not found" node="ip-10-0-139-150.ec2.internal"
Apr 16 20:27:37.448248 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:37.448234 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-150.ec2.internal\" not found"
Apr 16 20:27:37.548592 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:37.548565 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-150.ec2.internal\" not found"
Apr 16 20:27:37.562074 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.562019 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/15450d4a1cffd30a1009fca9edec82d3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-150.ec2.internal\" (UID: \"15450d4a1cffd30a1009fca9edec82d3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-150.ec2.internal"
Apr 16 20:27:37.562074 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.562043 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/15450d4a1cffd30a1009fca9edec82d3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-150.ec2.internal\" (UID: \"15450d4a1cffd30a1009fca9edec82d3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-150.ec2.internal"
Apr 16 20:27:37.562074 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.562061 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/babf34b6062f196401e1c6b676b7c7db-config\") pod \"kube-apiserver-proxy-ip-10-0-139-150.ec2.internal\" (UID: \"babf34b6062f196401e1c6b676b7c7db\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-150.ec2.internal"
Apr 16 20:27:37.649253 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:37.649218 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-150.ec2.internal\" not found"
Apr 16 20:27:37.662606 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.662580 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/babf34b6062f196401e1c6b676b7c7db-config\") pod \"kube-apiserver-proxy-ip-10-0-139-150.ec2.internal\" (UID: \"babf34b6062f196401e1c6b676b7c7db\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-150.ec2.internal"
Apr 16 20:27:37.662606 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.662593 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/babf34b6062f196401e1c6b676b7c7db-config\") pod \"kube-apiserver-proxy-ip-10-0-139-150.ec2.internal\" (UID: \"babf34b6062f196401e1c6b676b7c7db\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-150.ec2.internal"
Apr 16 20:27:37.662702 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.662635 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/15450d4a1cffd30a1009fca9edec82d3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-150.ec2.internal\" (UID: \"15450d4a1cffd30a1009fca9edec82d3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-150.ec2.internal"
Apr 16 20:27:37.662702 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.662657 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/15450d4a1cffd30a1009fca9edec82d3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-150.ec2.internal\" (UID: \"15450d4a1cffd30a1009fca9edec82d3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-150.ec2.internal"
Apr 16 20:27:37.662702 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.662680 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/15450d4a1cffd30a1009fca9edec82d3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-150.ec2.internal\" (UID: \"15450d4a1cffd30a1009fca9edec82d3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-150.ec2.internal"
Apr 16 20:27:37.662792 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.662705 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/15450d4a1cffd30a1009fca9edec82d3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-150.ec2.internal\" (UID: \"15450d4a1cffd30a1009fca9edec82d3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-150.ec2.internal"
Apr 16 20:27:37.741731 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.741701 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-150.ec2.internal"
Apr 16 20:27:37.746214 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.746194 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-150.ec2.internal"
Apr 16 20:27:37.749803 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:37.749781 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-150.ec2.internal\" not found"
Apr 16 20:27:37.849862 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:37.849837 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-150.ec2.internal\" not found"
Apr 16 20:27:37.950334 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:37.950310 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-150.ec2.internal\" not found"
Apr 16 20:27:37.992903 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:37.992885 2565 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 20:27:38.060805 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.060780 2565 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-150.ec2.internal"
Apr 16 20:27:38.071798 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.071778 2565 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 20:27:38.071930 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.071913 2565 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 20:27:38.072027 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.071932 2565 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 20:27:38.072027 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.071936 2565 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 20:27:38.072027 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:38.071933 2565 request.go:1360] "Unexpected error when reading response body" err="read tcp 10.0.139.150:38464->100.49.226.102:6443: use of closed network connection"
Apr 16 20:27:38.072027 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:38.072010 2565 kubelet.go:3342] "Failed creating a mirror pod" err="unexpected error when reading response body. Please retry. Original error: read tcp 10.0.139.150:38464->100.49.226.102:6443: use of closed network connection" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-150.ec2.internal"
Apr 16 20:27:38.072027 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.072027 2565 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-150.ec2.internal"
Apr 16 20:27:38.088021 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.087997 2565 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 20:27:38.137122 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.136822 2565 apiserver.go:52] "Watching apiserver"
Apr 16 20:27:38.144161 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.144138 2565 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 20:27:38.144621 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.144595 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-k9tth","openshift-ovn-kubernetes/ovnkube-node-59rwg","openshift-cluster-node-tuning-operator/tuned-vjq6x","openshift-dns/node-resolver-7qfs5","openshift-image-registry/node-ca-n2tj2","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-150.ec2.internal","openshift-multus/multus-additional-cni-plugins-hthg8","openshift-multus/multus-cb7jp","openshift-network-operator/iptables-alerter-9pkk2","kube-system/konnectivity-agent-z6p8j","kube-system/kube-apiserver-proxy-ip-10-0-139-150.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzp6r","openshift-multus/network-metrics-daemon-jzdqd"]
Apr 16 20:27:38.146366 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.146347 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hthg8"
Apr 16 20:27:38.147987 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.147961 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-59rwg"
Apr 16 20:27:38.148583 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.148559 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 20:27:38.148583 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.148564 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 20:27:38.148714 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.148655 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 20:27:38.148714 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.148691 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 20:27:38.148984 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.148966 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 20:27:38.149069 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.149002 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-t4m24\""
Apr 16 20:27:38.149360 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.149346 2565 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-vjq6x" Apr 16 20:27:38.150076 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.150063 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 20:27:38.150345 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.150327 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 20:27:38.150551 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.150537 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 20:27:38.153420 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.151143 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7qfs5" Apr 16 20:27:38.153420 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.151668 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 20:27:38.153420 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.152142 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-q49nx\"" Apr 16 20:27:38.153420 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.152433 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 20:27:38.153420 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.152448 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 20:27:38.153420 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.152465 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:27:38.153420 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.152864 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 20:27:38.153420 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.153256 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-4hfkv\"" Apr 16 20:27:38.154385 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.154366 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 20:27:38.154920 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.154895 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 20:22:37 +0000 UTC" deadline="2027-10-28 04:05:43.594243661 +0000 UTC" Apr 16 20:27:38.154920 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.154919 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13423h38m5.439327607s" Apr 16 20:27:38.155205 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.155178 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 20:27:38.155255 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.155221 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-mxqm9\"" Apr 16 20:27:38.155510 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.155492 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-n2tj2" Apr 16 20:27:38.155596 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.155580 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9tth" Apr 16 20:27:38.155735 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:38.155659 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k9tth" podUID="ea0bcc7a-3a29-4d83-8da0-05ec471a2764" Apr 16 20:27:38.157108 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.157091 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-cb7jp" Apr 16 20:27:38.157521 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.157506 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 20:27:38.157820 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.157801 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 20:27:38.157903 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.157864 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 20:27:38.157966 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.157919 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-6lc9h\"" Apr 16 20:27:38.158423 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.158410 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-9pkk2" Apr 16 20:27:38.159202 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.159185 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 20:27:38.159360 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.159329 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-54nkv\"" Apr 16 20:27:38.159495 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.159476 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-z6p8j" Apr 16 20:27:38.159744 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.159716 2565 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 20:27:38.160466 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.160452 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 20:27:38.160551 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.160513 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 20:27:38.160786 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.160768 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:27:38.160877 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.160841 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-wqkxr\"" Apr 16 20:27:38.161055 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.161036 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzp6r" Apr 16 20:27:38.161580 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.161519 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-fm5nh\"" Apr 16 20:27:38.161788 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.161770 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 20:27:38.161947 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.161935 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 20:27:38.162290 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.162251 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jzdqd" Apr 16 20:27:38.162379 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:38.162326 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jzdqd" podUID="90b993f2-207d-4894-bbdf-e2219dbf690b" Apr 16 20:27:38.163230 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.163212 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 20:27:38.163402 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.163228 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 20:27:38.163402 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.163376 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-7ncss\"" Apr 16 20:27:38.163512 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.163475 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 20:27:38.164968 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.164952 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmc9w\" (UniqueName: \"kubernetes.io/projected/dd8e84a9-042c-4346-8ef7-68dcca064683-kube-api-access-xmc9w\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp" Apr 16 20:27:38.165071 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.164981 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/334e7790-0d97-4116-a3ac-9e7dab8e33e3-host-slash\") pod \"iptables-alerter-9pkk2\" (UID: \"334e7790-0d97-4116-a3ac-9e7dab8e33e3\") " pod="openshift-network-operator/iptables-alerter-9pkk2" Apr 16 20:27:38.165071 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.165013 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-run-ovn\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:27:38.165071 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.165040 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcxt8\" (UniqueName: \"kubernetes.io/projected/97e2c184-5abb-438f-8f9b-2df48f93e465-kube-api-access-wcxt8\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:27:38.165182 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.165075 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/07a95f45-4b58-43ee-b613-5effa3c9ce71-etc-sysctl-d\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x" Apr 16 20:27:38.165182 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.165126 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/07a95f45-4b58-43ee-b613-5effa3c9ce71-etc-sysctl-conf\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x" Apr 16 20:27:38.165182 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.165153 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/07a95f45-4b58-43ee-b613-5effa3c9ce71-run\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x" Apr 16 20:27:38.165344 
ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.165180 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-system-cni-dir\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp" Apr 16 20:27:38.165344 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.165209 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-multus-socket-dir-parent\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp" Apr 16 20:27:38.165344 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.165233 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/22753460-b103-4969-8aca-1ea39040795b-hosts-file\") pod \"node-resolver-7qfs5\" (UID: \"22753460-b103-4969-8aca-1ea39040795b\") " pod="openshift-dns/node-resolver-7qfs5" Apr 16 20:27:38.165344 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.165259 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-host-cni-bin\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:27:38.165344 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.165306 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/07a95f45-4b58-43ee-b613-5effa3c9ce71-var-lib-kubelet\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " 
pod="openshift-cluster-node-tuning-operator/tuned-vjq6x" Apr 16 20:27:38.165344 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.165323 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/07a95f45-4b58-43ee-b613-5effa3c9ce71-tmp\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x" Apr 16 20:27:38.165344 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.165338 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-host-run-netns\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp" Apr 16 20:27:38.165660 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.165360 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-host-var-lib-cni-bin\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp" Apr 16 20:27:38.165660 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.165389 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/334e7790-0d97-4116-a3ac-9e7dab8e33e3-iptables-alerter-script\") pod \"iptables-alerter-9pkk2\" (UID: \"334e7790-0d97-4116-a3ac-9e7dab8e33e3\") " pod="openshift-network-operator/iptables-alerter-9pkk2" Apr 16 20:27:38.165660 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.165406 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/d95a170f-35f2-4732-bbb5-f8e2f6768efc-konnectivity-ca\") pod \"konnectivity-agent-z6p8j\" (UID: \"d95a170f-35f2-4732-bbb5-f8e2f6768efc\") " pod="kube-system/konnectivity-agent-z6p8j" Apr 16 20:27:38.165660 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.165425 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-node-log\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:27:38.165660 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.165475 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/97e2c184-5abb-438f-8f9b-2df48f93e465-ovn-node-metrics-cert\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:27:38.165660 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.165515 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wv5m\" (UniqueName: \"kubernetes.io/projected/1aac658f-e4b3-4b53-a125-6cae725d6fcd-kube-api-access-7wv5m\") pod \"multus-additional-cni-plugins-hthg8\" (UID: \"1aac658f-e4b3-4b53-a125-6cae725d6fcd\") " pod="openshift-multus/multus-additional-cni-plugins-hthg8" Apr 16 20:27:38.165660 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.165550 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n7xl\" (UniqueName: \"kubernetes.io/projected/22753460-b103-4969-8aca-1ea39040795b-kube-api-access-4n7xl\") pod \"node-resolver-7qfs5\" (UID: \"22753460-b103-4969-8aca-1ea39040795b\") " pod="openshift-dns/node-resolver-7qfs5" Apr 16 20:27:38.165660 ip-10-0-139-150 kubenswrapper[2565]: 
I0416 20:27:38.165600 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/73afd102-e8ce-40ea-847a-3f73f708f41d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rzp6r\" (UID: \"73afd102-e8ce-40ea-847a-3f73f708f41d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzp6r" Apr 16 20:27:38.165660 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.165643 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-host-run-ovn-kubernetes\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:27:38.166034 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.165671 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/07a95f45-4b58-43ee-b613-5effa3c9ce71-etc-sysconfig\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x" Apr 16 20:27:38.166034 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.165691 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-cnibin\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp" Apr 16 20:27:38.166034 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.165707 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1aac658f-e4b3-4b53-a125-6cae725d6fcd-os-release\") pod \"multus-additional-cni-plugins-hthg8\" (UID: 
\"1aac658f-e4b3-4b53-a125-6cae725d6fcd\") " pod="openshift-multus/multus-additional-cni-plugins-hthg8" Apr 16 20:27:38.166034 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.165723 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-host-run-netns\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:27:38.166034 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.165744 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07a95f45-4b58-43ee-b613-5effa3c9ce71-etc-kubernetes\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x" Apr 16 20:27:38.166034 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.165762 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07a95f45-4b58-43ee-b613-5effa3c9ce71-host\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x" Apr 16 20:27:38.166034 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.165777 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90b993f2-207d-4894-bbdf-e2219dbf690b-metrics-certs\") pod \"network-metrics-daemon-jzdqd\" (UID: \"90b993f2-207d-4894-bbdf-e2219dbf690b\") " pod="openshift-multus/network-metrics-daemon-jzdqd" Apr 16 20:27:38.166034 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.165804 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkwlx\" (UniqueName: 
\"kubernetes.io/projected/90b993f2-207d-4894-bbdf-e2219dbf690b-kube-api-access-kkwlx\") pod \"network-metrics-daemon-jzdqd\" (UID: \"90b993f2-207d-4894-bbdf-e2219dbf690b\") " pod="openshift-multus/network-metrics-daemon-jzdqd" Apr 16 20:27:38.166034 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.165832 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/73afd102-e8ce-40ea-847a-3f73f708f41d-device-dir\") pod \"aws-ebs-csi-driver-node-rzp6r\" (UID: \"73afd102-e8ce-40ea-847a-3f73f708f41d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzp6r" Apr 16 20:27:38.166034 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.165857 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-systemd-units\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:27:38.166034 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.165872 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-host-slash\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:27:38.166034 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.165886 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-etc-openvswitch\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:27:38.166034 ip-10-0-139-150 kubenswrapper[2565]: I0416 
20:27:38.165922 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/07a95f45-4b58-43ee-b613-5effa3c9ce71-etc-modprobe-d\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x" Apr 16 20:27:38.166034 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.165937 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07a95f45-4b58-43ee-b613-5effa3c9ce71-sys\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x" Apr 16 20:27:38.166034 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.165951 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1aac658f-e4b3-4b53-a125-6cae725d6fcd-system-cni-dir\") pod \"multus-additional-cni-plugins-hthg8\" (UID: \"1aac658f-e4b3-4b53-a125-6cae725d6fcd\") " pod="openshift-multus/multus-additional-cni-plugins-hthg8" Apr 16 20:27:38.166034 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.165991 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1aac658f-e4b3-4b53-a125-6cae725d6fcd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hthg8\" (UID: \"1aac658f-e4b3-4b53-a125-6cae725d6fcd\") " pod="openshift-multus/multus-additional-cni-plugins-hthg8" Apr 16 20:27:38.166672 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.166007 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/73afd102-e8ce-40ea-847a-3f73f708f41d-socket-dir\") pod \"aws-ebs-csi-driver-node-rzp6r\" (UID: 
\"73afd102-e8ce-40ea-847a-3f73f708f41d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzp6r" Apr 16 20:27:38.166672 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.166026 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv9k4\" (UniqueName: \"kubernetes.io/projected/73afd102-e8ce-40ea-847a-3f73f708f41d-kube-api-access-fv9k4\") pod \"aws-ebs-csi-driver-node-rzp6r\" (UID: \"73afd102-e8ce-40ea-847a-3f73f708f41d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzp6r" Apr 16 20:27:38.166672 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.166060 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-run-openvswitch\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:27:38.166672 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.166074 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-multus-conf-dir\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp" Apr 16 20:27:38.166672 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.166089 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-etc-kubernetes\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp" Apr 16 20:27:38.166672 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.166121 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/73afd102-e8ce-40ea-847a-3f73f708f41d-etc-selinux\") pod \"aws-ebs-csi-driver-node-rzp6r\" (UID: \"73afd102-e8ce-40ea-847a-3f73f708f41d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzp6r" Apr 16 20:27:38.166672 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.166174 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-host-cni-netd\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:27:38.166672 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.166209 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:27:38.166672 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.166235 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/97e2c184-5abb-438f-8f9b-2df48f93e465-env-overrides\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:27:38.166672 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.166291 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-os-release\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp" Apr 16 20:27:38.166672 
ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.166330 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/72204d28-677e-4d89-a353-b087ce28c38f-serviceca\") pod \"node-ca-n2tj2\" (UID: \"72204d28-677e-4d89-a353-b087ce28c38f\") " pod="openshift-image-registry/node-ca-n2tj2" Apr 16 20:27:38.166672 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.166356 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn78s\" (UniqueName: \"kubernetes.io/projected/72204d28-677e-4d89-a353-b087ce28c38f-kube-api-access-xn78s\") pod \"node-ca-n2tj2\" (UID: \"72204d28-677e-4d89-a353-b087ce28c38f\") " pod="openshift-image-registry/node-ca-n2tj2" Apr 16 20:27:38.166672 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.166380 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/73afd102-e8ce-40ea-847a-3f73f708f41d-sys-fs\") pod \"aws-ebs-csi-driver-node-rzp6r\" (UID: \"73afd102-e8ce-40ea-847a-3f73f708f41d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzp6r" Apr 16 20:27:38.166672 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.166403 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-run-systemd\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:27:38.166672 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.166430 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-var-lib-openvswitch\") pod \"ovnkube-node-59rwg\" (UID: 
\"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:27:38.166672 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.166460 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d95a170f-35f2-4732-bbb5-f8e2f6768efc-agent-certs\") pod \"konnectivity-agent-z6p8j\" (UID: \"d95a170f-35f2-4732-bbb5-f8e2f6768efc\") " pod="kube-system/konnectivity-agent-z6p8j" Apr 16 20:27:38.167322 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.166488 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/73afd102-e8ce-40ea-847a-3f73f708f41d-registration-dir\") pod \"aws-ebs-csi-driver-node-rzp6r\" (UID: \"73afd102-e8ce-40ea-847a-3f73f708f41d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzp6r" Apr 16 20:27:38.167322 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.166521 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dd8e84a9-042c-4346-8ef7-68dcca064683-cni-binary-copy\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp" Apr 16 20:27:38.167322 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.166545 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-host-run-multus-certs\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp" Apr 16 20:27:38.167322 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.166579 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/1aac658f-e4b3-4b53-a125-6cae725d6fcd-cni-binary-copy\") pod \"multus-additional-cni-plugins-hthg8\" (UID: \"1aac658f-e4b3-4b53-a125-6cae725d6fcd\") " pod="openshift-multus/multus-additional-cni-plugins-hthg8" Apr 16 20:27:38.167322 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.166605 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1aac658f-e4b3-4b53-a125-6cae725d6fcd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hthg8\" (UID: \"1aac658f-e4b3-4b53-a125-6cae725d6fcd\") " pod="openshift-multus/multus-additional-cni-plugins-hthg8" Apr 16 20:27:38.167322 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.166638 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tfsh\" (UniqueName: \"kubernetes.io/projected/334e7790-0d97-4116-a3ac-9e7dab8e33e3-kube-api-access-9tfsh\") pod \"iptables-alerter-9pkk2\" (UID: \"334e7790-0d97-4116-a3ac-9e7dab8e33e3\") " pod="openshift-network-operator/iptables-alerter-9pkk2" Apr 16 20:27:38.167322 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.166676 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-host-kubelet\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:27:38.167322 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.166700 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/07a95f45-4b58-43ee-b613-5effa3c9ce71-lib-modules\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x" Apr 16 20:27:38.167322 
ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.166724 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q6qj\" (UniqueName: \"kubernetes.io/projected/07a95f45-4b58-43ee-b613-5effa3c9ce71-kube-api-access-6q6qj\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x" Apr 16 20:27:38.167322 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.166776 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-host-run-k8s-cni-cncf-io\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp" Apr 16 20:27:38.167322 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.166802 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1aac658f-e4b3-4b53-a125-6cae725d6fcd-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hthg8\" (UID: \"1aac658f-e4b3-4b53-a125-6cae725d6fcd\") " pod="openshift-multus/multus-additional-cni-plugins-hthg8" Apr 16 20:27:38.167322 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.166835 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-log-socket\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:27:38.167322 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.166870 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/97e2c184-5abb-438f-8f9b-2df48f93e465-ovnkube-config\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:27:38.167322 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.166893 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-multus-cni-dir\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp" Apr 16 20:27:38.167322 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.166916 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-hostroot\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp" Apr 16 20:27:38.167322 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.166939 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/07a95f45-4b58-43ee-b613-5effa3c9ce71-etc-tuned\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x" Apr 16 20:27:38.167917 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.166964 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rn7c\" (UniqueName: \"kubernetes.io/projected/ea0bcc7a-3a29-4d83-8da0-05ec471a2764-kube-api-access-4rn7c\") pod \"network-check-target-k9tth\" (UID: \"ea0bcc7a-3a29-4d83-8da0-05ec471a2764\") " pod="openshift-network-diagnostics/network-check-target-k9tth" Apr 16 20:27:38.167917 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.166999 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-host-var-lib-kubelet\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp" Apr 16 20:27:38.167917 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.167036 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/72204d28-677e-4d89-a353-b087ce28c38f-host\") pod \"node-ca-n2tj2\" (UID: \"72204d28-677e-4d89-a353-b087ce28c38f\") " pod="openshift-image-registry/node-ca-n2tj2" Apr 16 20:27:38.167917 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.167058 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/dd8e84a9-042c-4346-8ef7-68dcca064683-multus-daemon-config\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp" Apr 16 20:27:38.167917 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.167083 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1aac658f-e4b3-4b53-a125-6cae725d6fcd-cnibin\") pod \"multus-additional-cni-plugins-hthg8\" (UID: \"1aac658f-e4b3-4b53-a125-6cae725d6fcd\") " pod="openshift-multus/multus-additional-cni-plugins-hthg8" Apr 16 20:27:38.167917 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.167105 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/22753460-b103-4969-8aca-1ea39040795b-tmp-dir\") pod \"node-resolver-7qfs5\" (UID: \"22753460-b103-4969-8aca-1ea39040795b\") " pod="openshift-dns/node-resolver-7qfs5" Apr 16 20:27:38.167917 ip-10-0-139-150 kubenswrapper[2565]: 
I0416 20:27:38.167127 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/97e2c184-5abb-438f-8f9b-2df48f93e465-ovnkube-script-lib\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:27:38.167917 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.167157 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/07a95f45-4b58-43ee-b613-5effa3c9ce71-etc-systemd\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x" Apr 16 20:27:38.167917 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.167171 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-host-var-lib-cni-multus\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp" Apr 16 20:27:38.173609 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.173588 2565 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 20:27:38.191392 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.191374 2565 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-gfs9h" Apr 16 20:27:38.198730 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.198712 2565 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-gfs9h" Apr 16 20:27:38.245179 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:38.245153 2565 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15450d4a1cffd30a1009fca9edec82d3.slice/crio-a9c1cbef932dd6b901aede88b39057a369a289c7de59002a2a84fee5df66f61a WatchSource:0}: Error finding container a9c1cbef932dd6b901aede88b39057a369a289c7de59002a2a84fee5df66f61a: Status 404 returned error can't find the container with id a9c1cbef932dd6b901aede88b39057a369a289c7de59002a2a84fee5df66f61a Apr 16 20:27:38.245473 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:38.245446 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbabf34b6062f196401e1c6b676b7c7db.slice/crio-6243ca774f2abfe9958a60644c87dedb4b11fed5309bc39630e91c924d75289c WatchSource:0}: Error finding container 6243ca774f2abfe9958a60644c87dedb4b11fed5309bc39630e91c924d75289c: Status 404 returned error can't find the container with id 6243ca774f2abfe9958a60644c87dedb4b11fed5309bc39630e91c924d75289c Apr 16 20:27:38.249980 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.249957 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:27:38.261809 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.261784 2565 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 20:27:38.267908 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.267890 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xmc9w\" (UniqueName: \"kubernetes.io/projected/dd8e84a9-042c-4346-8ef7-68dcca064683-kube-api-access-xmc9w\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp" Apr 16 20:27:38.267993 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.267916 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/334e7790-0d97-4116-a3ac-9e7dab8e33e3-host-slash\") pod \"iptables-alerter-9pkk2\" (UID: \"334e7790-0d97-4116-a3ac-9e7dab8e33e3\") " pod="openshift-network-operator/iptables-alerter-9pkk2" Apr 16 20:27:38.267993 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.267931 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-run-ovn\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:27:38.267993 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.267953 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wcxt8\" (UniqueName: \"kubernetes.io/projected/97e2c184-5abb-438f-8f9b-2df48f93e465-kube-api-access-wcxt8\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:27:38.267993 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.267976 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/07a95f45-4b58-43ee-b613-5effa3c9ce71-etc-sysctl-d\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x" Apr 16 20:27:38.268178 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.267998 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/07a95f45-4b58-43ee-b613-5effa3c9ce71-etc-sysctl-conf\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x" Apr 16 20:27:38.268178 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268008 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/334e7790-0d97-4116-a3ac-9e7dab8e33e3-host-slash\") pod \"iptables-alerter-9pkk2\" (UID: \"334e7790-0d97-4116-a3ac-9e7dab8e33e3\") " pod="openshift-network-operator/iptables-alerter-9pkk2" Apr 16 20:27:38.268178 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268020 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/07a95f45-4b58-43ee-b613-5effa3c9ce71-run\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x" Apr 16 20:27:38.268178 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268010 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-run-ovn\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:27:38.268178 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268042 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-system-cni-dir\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp" Apr 16 20:27:38.268178 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268068 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-multus-socket-dir-parent\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp" Apr 16 20:27:38.268178 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268091 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/22753460-b103-4969-8aca-1ea39040795b-hosts-file\") pod \"node-resolver-7qfs5\" (UID: \"22753460-b103-4969-8aca-1ea39040795b\") " pod="openshift-dns/node-resolver-7qfs5" Apr 16 20:27:38.268178 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268116 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-host-cni-bin\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:27:38.268178 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268118 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-system-cni-dir\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp" Apr 16 20:27:38.268178 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268139 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/07a95f45-4b58-43ee-b613-5effa3c9ce71-var-lib-kubelet\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x" Apr 16 20:27:38.268178 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268141 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/07a95f45-4b58-43ee-b613-5effa3c9ce71-etc-sysctl-conf\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x" Apr 16 20:27:38.268178 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268153 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/07a95f45-4b58-43ee-b613-5effa3c9ce71-etc-sysctl-d\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x" Apr 16 20:27:38.268178 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268112 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/07a95f45-4b58-43ee-b613-5effa3c9ce71-run\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x" Apr 16 20:27:38.268178 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268169 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/22753460-b103-4969-8aca-1ea39040795b-hosts-file\") pod \"node-resolver-7qfs5\" (UID: \"22753460-b103-4969-8aca-1ea39040795b\") " pod="openshift-dns/node-resolver-7qfs5" Apr 16 20:27:38.268178 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268174 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-multus-socket-dir-parent\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp" Apr 16 20:27:38.268178 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268175 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-host-cni-bin\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:27:38.268887 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268206 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/07a95f45-4b58-43ee-b613-5effa3c9ce71-tmp\") 
pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x" Apr 16 20:27:38.268887 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268218 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/07a95f45-4b58-43ee-b613-5effa3c9ce71-var-lib-kubelet\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x" Apr 16 20:27:38.268887 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268224 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-host-run-netns\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp" Apr 16 20:27:38.268887 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268250 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-host-var-lib-cni-bin\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp" Apr 16 20:27:38.268887 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268269 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-host-run-netns\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp" Apr 16 20:27:38.268887 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268288 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/334e7790-0d97-4116-a3ac-9e7dab8e33e3-iptables-alerter-script\") pod 
\"iptables-alerter-9pkk2\" (UID: \"334e7790-0d97-4116-a3ac-9e7dab8e33e3\") " pod="openshift-network-operator/iptables-alerter-9pkk2" Apr 16 20:27:38.268887 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268307 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-host-var-lib-cni-bin\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp" Apr 16 20:27:38.268887 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268313 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d95a170f-35f2-4732-bbb5-f8e2f6768efc-konnectivity-ca\") pod \"konnectivity-agent-z6p8j\" (UID: \"d95a170f-35f2-4732-bbb5-f8e2f6768efc\") " pod="kube-system/konnectivity-agent-z6p8j" Apr 16 20:27:38.268887 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268337 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-node-log\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:27:38.268887 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268354 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/97e2c184-5abb-438f-8f9b-2df48f93e465-ovn-node-metrics-cert\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:27:38.268887 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268370 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7wv5m\" (UniqueName: 
\"kubernetes.io/projected/1aac658f-e4b3-4b53-a125-6cae725d6fcd-kube-api-access-7wv5m\") pod \"multus-additional-cni-plugins-hthg8\" (UID: \"1aac658f-e4b3-4b53-a125-6cae725d6fcd\") " pod="openshift-multus/multus-additional-cni-plugins-hthg8"
Apr 16 20:27:38.268887 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268382 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-node-log\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg"
Apr 16 20:27:38.268887 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268393 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4n7xl\" (UniqueName: \"kubernetes.io/projected/22753460-b103-4969-8aca-1ea39040795b-kube-api-access-4n7xl\") pod \"node-resolver-7qfs5\" (UID: \"22753460-b103-4969-8aca-1ea39040795b\") " pod="openshift-dns/node-resolver-7qfs5"
Apr 16 20:27:38.268887 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268418 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/73afd102-e8ce-40ea-847a-3f73f708f41d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rzp6r\" (UID: \"73afd102-e8ce-40ea-847a-3f73f708f41d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzp6r"
Apr 16 20:27:38.268887 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268435 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-host-run-ovn-kubernetes\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg"
Apr 16 20:27:38.268887 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268458 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/07a95f45-4b58-43ee-b613-5effa3c9ce71-etc-sysconfig\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x"
Apr 16 20:27:38.268887 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268483 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-cnibin\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp"
Apr 16 20:27:38.269596 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268508 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1aac658f-e4b3-4b53-a125-6cae725d6fcd-os-release\") pod \"multus-additional-cni-plugins-hthg8\" (UID: \"1aac658f-e4b3-4b53-a125-6cae725d6fcd\") " pod="openshift-multus/multus-additional-cni-plugins-hthg8"
Apr 16 20:27:38.269596 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268525 2565 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 20:27:38.269596 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268569 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-host-run-netns\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg"
Apr 16 20:27:38.269596 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268604 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/07a95f45-4b58-43ee-b613-5effa3c9ce71-etc-sysconfig\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x"
Apr 16 20:27:38.269596 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268620 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-cnibin\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp"
Apr 16 20:27:38.269596 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268649 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/73afd102-e8ce-40ea-847a-3f73f708f41d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rzp6r\" (UID: \"73afd102-e8ce-40ea-847a-3f73f708f41d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzp6r"
Apr 16 20:27:38.269596 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268676 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1aac658f-e4b3-4b53-a125-6cae725d6fcd-os-release\") pod \"multus-additional-cni-plugins-hthg8\" (UID: \"1aac658f-e4b3-4b53-a125-6cae725d6fcd\") " pod="openshift-multus/multus-additional-cni-plugins-hthg8"
Apr 16 20:27:38.269596 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268692 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-host-run-ovn-kubernetes\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg"
Apr 16 20:27:38.269596 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268532 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-host-run-netns\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg"
Apr 16 20:27:38.269596 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268749 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07a95f45-4b58-43ee-b613-5effa3c9ce71-etc-kubernetes\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x"
Apr 16 20:27:38.269596 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268771 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07a95f45-4b58-43ee-b613-5effa3c9ce71-host\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x"
Apr 16 20:27:38.269596 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268787 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90b993f2-207d-4894-bbdf-e2219dbf690b-metrics-certs\") pod \"network-metrics-daemon-jzdqd\" (UID: \"90b993f2-207d-4894-bbdf-e2219dbf690b\") " pod="openshift-multus/network-metrics-daemon-jzdqd"
Apr 16 20:27:38.269596 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268805 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kkwlx\" (UniqueName: \"kubernetes.io/projected/90b993f2-207d-4894-bbdf-e2219dbf690b-kube-api-access-kkwlx\") pod \"network-metrics-daemon-jzdqd\" (UID: \"90b993f2-207d-4894-bbdf-e2219dbf690b\") " pod="openshift-multus/network-metrics-daemon-jzdqd"
Apr 16 20:27:38.269596 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268827 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/73afd102-e8ce-40ea-847a-3f73f708f41d-device-dir\") pod \"aws-ebs-csi-driver-node-rzp6r\" (UID: \"73afd102-e8ce-40ea-847a-3f73f708f41d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzp6r"
Apr 16 20:27:38.269596 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268833 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07a95f45-4b58-43ee-b613-5effa3c9ce71-etc-kubernetes\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x"
Apr 16 20:27:38.269596 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268847 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07a95f45-4b58-43ee-b613-5effa3c9ce71-host\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x"
Apr 16 20:27:38.269596 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268833 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/334e7790-0d97-4116-a3ac-9e7dab8e33e3-iptables-alerter-script\") pod \"iptables-alerter-9pkk2\" (UID: \"334e7790-0d97-4116-a3ac-9e7dab8e33e3\") " pod="openshift-network-operator/iptables-alerter-9pkk2"
Apr 16 20:27:38.269596 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268846 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d95a170f-35f2-4732-bbb5-f8e2f6768efc-konnectivity-ca\") pod \"konnectivity-agent-z6p8j\" (UID: \"d95a170f-35f2-4732-bbb5-f8e2f6768efc\") " pod="kube-system/konnectivity-agent-z6p8j"
Apr 16 20:27:38.270456 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268895 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/73afd102-e8ce-40ea-847a-3f73f708f41d-device-dir\") pod \"aws-ebs-csi-driver-node-rzp6r\" (UID: \"73afd102-e8ce-40ea-847a-3f73f708f41d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzp6r"
Apr 16 20:27:38.270456 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268903 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-systemd-units\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg"
Apr 16 20:27:38.270456 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:38.268917 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:27:38.270456 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.268852 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-systemd-units\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg"
Apr 16 20:27:38.270456 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:38.269009 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90b993f2-207d-4894-bbdf-e2219dbf690b-metrics-certs podName:90b993f2-207d-4894-bbdf-e2219dbf690b nodeName:}" failed. No retries permitted until 2026-04-16 20:27:38.768981169 +0000 UTC m=+2.036723624 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/90b993f2-207d-4894-bbdf-e2219dbf690b-metrics-certs") pod "network-metrics-daemon-jzdqd" (UID: "90b993f2-207d-4894-bbdf-e2219dbf690b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:27:38.270456 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269025 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-host-slash\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg"
Apr 16 20:27:38.270456 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269048 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-etc-openvswitch\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg"
Apr 16 20:27:38.270456 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269065 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/07a95f45-4b58-43ee-b613-5effa3c9ce71-etc-modprobe-d\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x"
Apr 16 20:27:38.270456 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269080 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-host-slash\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg"
Apr 16 20:27:38.270456 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269098 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-etc-openvswitch\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg"
Apr 16 20:27:38.270456 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269126 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/07a95f45-4b58-43ee-b613-5effa3c9ce71-etc-modprobe-d\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x"
Apr 16 20:27:38.270456 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269143 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07a95f45-4b58-43ee-b613-5effa3c9ce71-sys\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x"
Apr 16 20:27:38.270456 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269190 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1aac658f-e4b3-4b53-a125-6cae725d6fcd-system-cni-dir\") pod \"multus-additional-cni-plugins-hthg8\" (UID: \"1aac658f-e4b3-4b53-a125-6cae725d6fcd\") " pod="openshift-multus/multus-additional-cni-plugins-hthg8"
Apr 16 20:27:38.270456 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269231 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07a95f45-4b58-43ee-b613-5effa3c9ce71-sys\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x"
Apr 16 20:27:38.270456 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269222 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1aac658f-e4b3-4b53-a125-6cae725d6fcd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hthg8\" (UID: \"1aac658f-e4b3-4b53-a125-6cae725d6fcd\") " pod="openshift-multus/multus-additional-cni-plugins-hthg8"
Apr 16 20:27:38.270456 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269253 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1aac658f-e4b3-4b53-a125-6cae725d6fcd-system-cni-dir\") pod \"multus-additional-cni-plugins-hthg8\" (UID: \"1aac658f-e4b3-4b53-a125-6cae725d6fcd\") " pod="openshift-multus/multus-additional-cni-plugins-hthg8"
Apr 16 20:27:38.270456 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269268 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/73afd102-e8ce-40ea-847a-3f73f708f41d-socket-dir\") pod \"aws-ebs-csi-driver-node-rzp6r\" (UID: \"73afd102-e8ce-40ea-847a-3f73f708f41d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzp6r"
Apr 16 20:27:38.271291 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269329 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fv9k4\" (UniqueName: \"kubernetes.io/projected/73afd102-e8ce-40ea-847a-3f73f708f41d-kube-api-access-fv9k4\") pod \"aws-ebs-csi-driver-node-rzp6r\" (UID: \"73afd102-e8ce-40ea-847a-3f73f708f41d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzp6r"
Apr 16 20:27:38.271291 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269352 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-run-openvswitch\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg"
Apr 16 20:27:38.271291 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269384 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1aac658f-e4b3-4b53-a125-6cae725d6fcd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hthg8\" (UID: \"1aac658f-e4b3-4b53-a125-6cae725d6fcd\") " pod="openshift-multus/multus-additional-cni-plugins-hthg8"
Apr 16 20:27:38.271291 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269391 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-multus-conf-dir\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp"
Apr 16 20:27:38.271291 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269426 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/73afd102-e8ce-40ea-847a-3f73f708f41d-socket-dir\") pod \"aws-ebs-csi-driver-node-rzp6r\" (UID: \"73afd102-e8ce-40ea-847a-3f73f708f41d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzp6r"
Apr 16 20:27:38.271291 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269414 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-etc-kubernetes\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp"
Apr 16 20:27:38.271291 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269453 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-run-openvswitch\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg"
Apr 16 20:27:38.271291 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269463 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/73afd102-e8ce-40ea-847a-3f73f708f41d-etc-selinux\") pod \"aws-ebs-csi-driver-node-rzp6r\" (UID: \"73afd102-e8ce-40ea-847a-3f73f708f41d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzp6r"
Apr 16 20:27:38.271291 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269486 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-host-cni-netd\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg"
Apr 16 20:27:38.271291 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269504 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-multus-conf-dir\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp"
Apr 16 20:27:38.271291 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269509 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg"
Apr 16 20:27:38.271291 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269541 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/97e2c184-5abb-438f-8f9b-2df48f93e465-env-overrides\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg"
Apr 16 20:27:38.271291 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269561 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-os-release\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp"
Apr 16 20:27:38.271291 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269580 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/72204d28-677e-4d89-a353-b087ce28c38f-serviceca\") pod \"node-ca-n2tj2\" (UID: \"72204d28-677e-4d89-a353-b087ce28c38f\") " pod="openshift-image-registry/node-ca-n2tj2"
Apr 16 20:27:38.271291 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269583 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/73afd102-e8ce-40ea-847a-3f73f708f41d-etc-selinux\") pod \"aws-ebs-csi-driver-node-rzp6r\" (UID: \"73afd102-e8ce-40ea-847a-3f73f708f41d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzp6r"
Apr 16 20:27:38.271291 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269597 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-etc-kubernetes\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp"
Apr 16 20:27:38.271291 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269560 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-host-cni-netd\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg"
Apr 16 20:27:38.272053 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269614 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xn78s\" (UniqueName: \"kubernetes.io/projected/72204d28-677e-4d89-a353-b087ce28c38f-kube-api-access-xn78s\") pod \"node-ca-n2tj2\" (UID: \"72204d28-677e-4d89-a353-b087ce28c38f\") " pod="openshift-image-registry/node-ca-n2tj2"
Apr 16 20:27:38.272053 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269637 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg"
Apr 16 20:27:38.272053 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269639 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/73afd102-e8ce-40ea-847a-3f73f708f41d-sys-fs\") pod \"aws-ebs-csi-driver-node-rzp6r\" (UID: \"73afd102-e8ce-40ea-847a-3f73f708f41d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzp6r"
Apr 16 20:27:38.272053 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269658 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-os-release\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp"
Apr 16 20:27:38.272053 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269677 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/73afd102-e8ce-40ea-847a-3f73f708f41d-sys-fs\") pod \"aws-ebs-csi-driver-node-rzp6r\" (UID: \"73afd102-e8ce-40ea-847a-3f73f708f41d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzp6r"
Apr 16 20:27:38.272053 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269694 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-run-systemd\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg"
Apr 16 20:27:38.272053 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269721 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-var-lib-openvswitch\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg"
Apr 16 20:27:38.272053 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269746 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d95a170f-35f2-4732-bbb5-f8e2f6768efc-agent-certs\") pod \"konnectivity-agent-z6p8j\" (UID: \"d95a170f-35f2-4732-bbb5-f8e2f6768efc\") " pod="kube-system/konnectivity-agent-z6p8j"
Apr 16 20:27:38.272053 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269774 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/73afd102-e8ce-40ea-847a-3f73f708f41d-registration-dir\") pod \"aws-ebs-csi-driver-node-rzp6r\" (UID: \"73afd102-e8ce-40ea-847a-3f73f708f41d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzp6r"
Apr 16 20:27:38.272053 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269797 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dd8e84a9-042c-4346-8ef7-68dcca064683-cni-binary-copy\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp"
Apr 16 20:27:38.272053 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269823 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-host-run-multus-certs\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp"
Apr 16 20:27:38.272053 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269847 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1aac658f-e4b3-4b53-a125-6cae725d6fcd-cni-binary-copy\") pod \"multus-additional-cni-plugins-hthg8\" (UID: \"1aac658f-e4b3-4b53-a125-6cae725d6fcd\") " pod="openshift-multus/multus-additional-cni-plugins-hthg8"
Apr 16 20:27:38.272053 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269875 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1aac658f-e4b3-4b53-a125-6cae725d6fcd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hthg8\" (UID: \"1aac658f-e4b3-4b53-a125-6cae725d6fcd\") " pod="openshift-multus/multus-additional-cni-plugins-hthg8"
Apr 16 20:27:38.272053 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269900 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9tfsh\" (UniqueName: \"kubernetes.io/projected/334e7790-0d97-4116-a3ac-9e7dab8e33e3-kube-api-access-9tfsh\") pod \"iptables-alerter-9pkk2\" (UID: \"334e7790-0d97-4116-a3ac-9e7dab8e33e3\") " pod="openshift-network-operator/iptables-alerter-9pkk2"
Apr 16 20:27:38.272053 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269924 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-host-kubelet\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg"
Apr 16 20:27:38.272053 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269950 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/07a95f45-4b58-43ee-b613-5effa3c9ce71-lib-modules\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x"
Apr 16 20:27:38.272053 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.269976 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6q6qj\" (UniqueName: \"kubernetes.io/projected/07a95f45-4b58-43ee-b613-5effa3c9ce71-kube-api-access-6q6qj\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x"
Apr 16 20:27:38.272818 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.270000 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-host-run-k8s-cni-cncf-io\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp"
Apr 16 20:27:38.272818 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.270026 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1aac658f-e4b3-4b53-a125-6cae725d6fcd-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hthg8\" (UID: \"1aac658f-e4b3-4b53-a125-6cae725d6fcd\") " pod="openshift-multus/multus-additional-cni-plugins-hthg8"
Apr 16 20:27:38.272818 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.270038 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/97e2c184-5abb-438f-8f9b-2df48f93e465-env-overrides\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg"
Apr 16 20:27:38.272818 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.270053 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-log-socket\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg"
Apr 16 20:27:38.272818 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.270070 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/72204d28-677e-4d89-a353-b087ce28c38f-serviceca\") pod \"node-ca-n2tj2\" (UID: \"72204d28-677e-4d89-a353-b087ce28c38f\") " pod="openshift-image-registry/node-ca-n2tj2"
Apr 16 20:27:38.272818 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.270080 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/97e2c184-5abb-438f-8f9b-2df48f93e465-ovnkube-config\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg"
Apr 16 20:27:38.272818 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.270087 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-run-systemd\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg"
Apr 16 20:27:38.272818 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.270116 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-multus-cni-dir\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp"
Apr 16 20:27:38.272818 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.270142 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-hostroot\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp"
Apr 16 20:27:38.272818 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.270130 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/73afd102-e8ce-40ea-847a-3f73f708f41d-registration-dir\") pod \"aws-ebs-csi-driver-node-rzp6r\" (UID: \"73afd102-e8ce-40ea-847a-3f73f708f41d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzp6r"
Apr 16 20:27:38.272818 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.270660 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-host-run-multus-certs\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp"
Apr 16 20:27:38.272818 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.270924 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/07a95f45-4b58-43ee-b613-5effa3c9ce71-etc-tuned\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x"
Apr 16 20:27:38.272818 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.271001 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4rn7c\" (UniqueName: \"kubernetes.io/projected/ea0bcc7a-3a29-4d83-8da0-05ec471a2764-kube-api-access-4rn7c\") pod \"network-check-target-k9tth\" (UID: \"ea0bcc7a-3a29-4d83-8da0-05ec471a2764\") " pod="openshift-network-diagnostics/network-check-target-k9tth"
Apr 16 20:27:38.272818 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.271036 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-host-var-lib-kubelet\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp"
Apr 16 20:27:38.272818 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.271071 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/72204d28-677e-4d89-a353-b087ce28c38f-host\") pod \"node-ca-n2tj2\" (UID: \"72204d28-677e-4d89-a353-b087ce28c38f\") " pod="openshift-image-registry/node-ca-n2tj2"
Apr 16 20:27:38.272818 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.271105 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/dd8e84a9-042c-4346-8ef7-68dcca064683-multus-daemon-config\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp"
Apr 16 20:27:38.272818 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.271173 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1aac658f-e4b3-4b53-a125-6cae725d6fcd-cnibin\") pod \"multus-additional-cni-plugins-hthg8\" (UID: \"1aac658f-e4b3-4b53-a125-6cae725d6fcd\") " pod="openshift-multus/multus-additional-cni-plugins-hthg8"
Apr 16 20:27:38.273587 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.271198 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1aac658f-e4b3-4b53-a125-6cae725d6fcd-cni-binary-copy\") pod \"multus-additional-cni-plugins-hthg8\" (UID: \"1aac658f-e4b3-4b53-a125-6cae725d6fcd\") " pod="openshift-multus/multus-additional-cni-plugins-hthg8"
Apr 16 20:27:38.273587 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.271212 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/22753460-b103-4969-8aca-1ea39040795b-tmp-dir\") pod \"node-resolver-7qfs5\" (UID: \"22753460-b103-4969-8aca-1ea39040795b\") " pod="openshift-dns/node-resolver-7qfs5"
Apr 16 20:27:38.273587 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.271241 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/97e2c184-5abb-438f-8f9b-2df48f93e465-ovnkube-script-lib\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg"
Apr 16 20:27:38.273587 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.271291 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/97e2c184-5abb-438f-8f9b-2df48f93e465-ovnkube-config\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg"
Apr 16 20:27:38.273587 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.271303 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/07a95f45-4b58-43ee-b613-5effa3c9ce71-etc-systemd\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x"
Apr 16 20:27:38.273587 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.271351 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-host-var-lib-cni-multus\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp"
Apr 16 20:27:38.273587 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.271449 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/07a95f45-4b58-43ee-b613-5effa3c9ce71-etc-systemd\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x"
Apr 16 20:27:38.273587 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.271487 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-host-var-lib-kubelet\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp"
Apr 16 20:27:38.273587 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.271493 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-host-var-lib-cni-multus\") pod \"multus-cb7jp\" (UID:
\"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp" Apr 16 20:27:38.273587 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.271543 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/72204d28-677e-4d89-a353-b087ce28c38f-host\") pod \"node-ca-n2tj2\" (UID: \"72204d28-677e-4d89-a353-b087ce28c38f\") " pod="openshift-image-registry/node-ca-n2tj2" Apr 16 20:27:38.273587 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.271722 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/07a95f45-4b58-43ee-b613-5effa3c9ce71-tmp\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x" Apr 16 20:27:38.273587 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.271751 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1aac658f-e4b3-4b53-a125-6cae725d6fcd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hthg8\" (UID: \"1aac658f-e4b3-4b53-a125-6cae725d6fcd\") " pod="openshift-multus/multus-additional-cni-plugins-hthg8" Apr 16 20:27:38.273587 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.271815 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-host-kubelet\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:27:38.273587 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.271925 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/07a95f45-4b58-43ee-b613-5effa3c9ce71-lib-modules\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " 
pod="openshift-cluster-node-tuning-operator/tuned-vjq6x" Apr 16 20:27:38.273587 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.271978 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-hostroot\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp" Apr 16 20:27:38.273587 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.272038 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-multus-cni-dir\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp" Apr 16 20:27:38.273587 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.272070 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/dd8e84a9-042c-4346-8ef7-68dcca064683-host-run-k8s-cni-cncf-io\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp" Apr 16 20:27:38.273587 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.272085 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/dd8e84a9-042c-4346-8ef7-68dcca064683-multus-daemon-config\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp" Apr 16 20:27:38.274190 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.272169 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-log-socket\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:27:38.274190 
ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.272419 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/22753460-b103-4969-8aca-1ea39040795b-tmp-dir\") pod \"node-resolver-7qfs5\" (UID: \"22753460-b103-4969-8aca-1ea39040795b\") " pod="openshift-dns/node-resolver-7qfs5" Apr 16 20:27:38.274190 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.272686 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1aac658f-e4b3-4b53-a125-6cae725d6fcd-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hthg8\" (UID: \"1aac658f-e4b3-4b53-a125-6cae725d6fcd\") " pod="openshift-multus/multus-additional-cni-plugins-hthg8" Apr 16 20:27:38.274190 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.272713 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97e2c184-5abb-438f-8f9b-2df48f93e465-var-lib-openvswitch\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:27:38.274190 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.272770 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1aac658f-e4b3-4b53-a125-6cae725d6fcd-cnibin\") pod \"multus-additional-cni-plugins-hthg8\" (UID: \"1aac658f-e4b3-4b53-a125-6cae725d6fcd\") " pod="openshift-multus/multus-additional-cni-plugins-hthg8" Apr 16 20:27:38.274190 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.272980 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/97e2c184-5abb-438f-8f9b-2df48f93e465-ovnkube-script-lib\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:27:38.274190 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.273048 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/07a95f45-4b58-43ee-b613-5effa3c9ce71-etc-tuned\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x" Apr 16 20:27:38.274190 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.273139 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dd8e84a9-042c-4346-8ef7-68dcca064683-cni-binary-copy\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp" Apr 16 20:27:38.274548 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.274327 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/97e2c184-5abb-438f-8f9b-2df48f93e465-ovn-node-metrics-cert\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:27:38.276325 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.276304 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d95a170f-35f2-4732-bbb5-f8e2f6768efc-agent-certs\") pod \"konnectivity-agent-z6p8j\" (UID: \"d95a170f-35f2-4732-bbb5-f8e2f6768efc\") " pod="kube-system/konnectivity-agent-z6p8j" Apr 16 20:27:38.276418 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.276329 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n7xl\" (UniqueName: \"kubernetes.io/projected/22753460-b103-4969-8aca-1ea39040795b-kube-api-access-4n7xl\") pod \"node-resolver-7qfs5\" (UID: \"22753460-b103-4969-8aca-1ea39040795b\") " 
pod="openshift-dns/node-resolver-7qfs5" Apr 16 20:27:38.276767 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.276743 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcxt8\" (UniqueName: \"kubernetes.io/projected/97e2c184-5abb-438f-8f9b-2df48f93e465-kube-api-access-wcxt8\") pod \"ovnkube-node-59rwg\" (UID: \"97e2c184-5abb-438f-8f9b-2df48f93e465\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:27:38.277079 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.277063 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkwlx\" (UniqueName: \"kubernetes.io/projected/90b993f2-207d-4894-bbdf-e2219dbf690b-kube-api-access-kkwlx\") pod \"network-metrics-daemon-jzdqd\" (UID: \"90b993f2-207d-4894-bbdf-e2219dbf690b\") " pod="openshift-multus/network-metrics-daemon-jzdqd" Apr 16 20:27:38.277184 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.277168 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmc9w\" (UniqueName: \"kubernetes.io/projected/dd8e84a9-042c-4346-8ef7-68dcca064683-kube-api-access-xmc9w\") pod \"multus-cb7jp\" (UID: \"dd8e84a9-042c-4346-8ef7-68dcca064683\") " pod="openshift-multus/multus-cb7jp" Apr 16 20:27:38.277242 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.277210 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wv5m\" (UniqueName: \"kubernetes.io/projected/1aac658f-e4b3-4b53-a125-6cae725d6fcd-kube-api-access-7wv5m\") pod \"multus-additional-cni-plugins-hthg8\" (UID: \"1aac658f-e4b3-4b53-a125-6cae725d6fcd\") " pod="openshift-multus/multus-additional-cni-plugins-hthg8" Apr 16 20:27:38.279569 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.279550 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv9k4\" (UniqueName: \"kubernetes.io/projected/73afd102-e8ce-40ea-847a-3f73f708f41d-kube-api-access-fv9k4\") pod 
\"aws-ebs-csi-driver-node-rzp6r\" (UID: \"73afd102-e8ce-40ea-847a-3f73f708f41d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzp6r" Apr 16 20:27:38.279741 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.279727 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn78s\" (UniqueName: \"kubernetes.io/projected/72204d28-677e-4d89-a353-b087ce28c38f-kube-api-access-xn78s\") pod \"node-ca-n2tj2\" (UID: \"72204d28-677e-4d89-a353-b087ce28c38f\") " pod="openshift-image-registry/node-ca-n2tj2" Apr 16 20:27:38.281196 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:38.281184 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:27:38.281245 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:38.281200 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:27:38.281245 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:38.281209 2565 projected.go:194] Error preparing data for projected volume kube-api-access-4rn7c for pod openshift-network-diagnostics/network-check-target-k9tth: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:27:38.281384 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:38.281258 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ea0bcc7a-3a29-4d83-8da0-05ec471a2764-kube-api-access-4rn7c podName:ea0bcc7a-3a29-4d83-8da0-05ec471a2764 nodeName:}" failed. No retries permitted until 2026-04-16 20:27:38.78124611 +0000 UTC m=+2.048988542 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-4rn7c" (UniqueName: "kubernetes.io/projected/ea0bcc7a-3a29-4d83-8da0-05ec471a2764-kube-api-access-4rn7c") pod "network-check-target-k9tth" (UID: "ea0bcc7a-3a29-4d83-8da0-05ec471a2764") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:27:38.283017 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.282998 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q6qj\" (UniqueName: \"kubernetes.io/projected/07a95f45-4b58-43ee-b613-5effa3c9ce71-kube-api-access-6q6qj\") pod \"tuned-vjq6x\" (UID: \"07a95f45-4b58-43ee-b613-5effa3c9ce71\") " pod="openshift-cluster-node-tuning-operator/tuned-vjq6x" Apr 16 20:27:38.283074 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.283061 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tfsh\" (UniqueName: \"kubernetes.io/projected/334e7790-0d97-4116-a3ac-9e7dab8e33e3-kube-api-access-9tfsh\") pod \"iptables-alerter-9pkk2\" (UID: \"334e7790-0d97-4116-a3ac-9e7dab8e33e3\") " pod="openshift-network-operator/iptables-alerter-9pkk2" Apr 16 20:27:38.309649 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.309606 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-150.ec2.internal" event={"ID":"babf34b6062f196401e1c6b676b7c7db","Type":"ContainerStarted","Data":"6243ca774f2abfe9958a60644c87dedb4b11fed5309bc39630e91c924d75289c"} Apr 16 20:27:38.310454 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.310438 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-150.ec2.internal" event={"ID":"15450d4a1cffd30a1009fca9edec82d3","Type":"ContainerStarted","Data":"a9c1cbef932dd6b901aede88b39057a369a289c7de59002a2a84fee5df66f61a"} Apr 16 20:27:38.480376 ip-10-0-139-150 
kubenswrapper[2565]: I0416 20:27:38.480306 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hthg8" Apr 16 20:27:38.486498 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:38.486478 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1aac658f_e4b3_4b53_a125_6cae725d6fcd.slice/crio-b804b7340ffeaf1d5a3dadec4dfd9feae2e4f2bf2ddc940c6fcad1d47405b00c WatchSource:0}: Error finding container b804b7340ffeaf1d5a3dadec4dfd9feae2e4f2bf2ddc940c6fcad1d47405b00c: Status 404 returned error can't find the container with id b804b7340ffeaf1d5a3dadec4dfd9feae2e4f2bf2ddc940c6fcad1d47405b00c Apr 16 20:27:38.496814 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.496795 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:27:38.502390 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.502368 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-vjq6x" Apr 16 20:27:38.503081 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:38.502976 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97e2c184_5abb_438f_8f9b_2df48f93e465.slice/crio-2fa8f5dac0d5c311b3312471159dc41e85844bab106f467c7bcec9790f68da0d WatchSource:0}: Error finding container 2fa8f5dac0d5c311b3312471159dc41e85844bab106f467c7bcec9790f68da0d: Status 404 returned error can't find the container with id 2fa8f5dac0d5c311b3312471159dc41e85844bab106f467c7bcec9790f68da0d Apr 16 20:27:38.508268 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:38.508251 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07a95f45_4b58_43ee_b613_5effa3c9ce71.slice/crio-4fb94a1877b1c97ab344f206b8f2746af77e74a47a799e254f4db9ab7e241268 WatchSource:0}: Error finding container 4fb94a1877b1c97ab344f206b8f2746af77e74a47a799e254f4db9ab7e241268: Status 404 returned error can't find the container with id 4fb94a1877b1c97ab344f206b8f2746af77e74a47a799e254f4db9ab7e241268 Apr 16 20:27:38.520616 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.520596 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-7qfs5" Apr 16 20:27:38.527772 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:38.527754 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22753460_b103_4969_8aca_1ea39040795b.slice/crio-c6abbbdeeffb079a50d4935173481706cdebfe20f99eb3a4d08da39b75edb4d3 WatchSource:0}: Error finding container c6abbbdeeffb079a50d4935173481706cdebfe20f99eb3a4d08da39b75edb4d3: Status 404 returned error can't find the container with id c6abbbdeeffb079a50d4935173481706cdebfe20f99eb3a4d08da39b75edb4d3 Apr 16 20:27:38.542425 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.542410 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-n2tj2" Apr 16 20:27:38.548540 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:38.548521 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72204d28_677e_4d89_a353_b087ce28c38f.slice/crio-b9dd11823f3760df13fb1003068170fea533de3333919e72c94e46ea26954a8a WatchSource:0}: Error finding container b9dd11823f3760df13fb1003068170fea533de3333919e72c94e46ea26954a8a: Status 404 returned error can't find the container with id b9dd11823f3760df13fb1003068170fea533de3333919e72c94e46ea26954a8a Apr 16 20:27:38.551194 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.551182 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-cb7jp" Apr 16 20:27:38.556798 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:38.556779 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd8e84a9_042c_4346_8ef7_68dcca064683.slice/crio-424b0de85e4990740bffcf074803c28a44d59f9ba21b3aa7a940f5afa5d3e544 WatchSource:0}: Error finding container 424b0de85e4990740bffcf074803c28a44d59f9ba21b3aa7a940f5afa5d3e544: Status 404 returned error can't find the container with id 424b0de85e4990740bffcf074803c28a44d59f9ba21b3aa7a940f5afa5d3e544 Apr 16 20:27:38.557347 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.557327 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-9pkk2" Apr 16 20:27:38.562956 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.562941 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-z6p8j" Apr 16 20:27:38.563156 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:38.563132 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod334e7790_0d97_4116_a3ac_9e7dab8e33e3.slice/crio-b05004aa46cbfd66e7d40d38d5aba7aafe0d6ae9b705aef498df8ea085b0706d WatchSource:0}: Error finding container b05004aa46cbfd66e7d40d38d5aba7aafe0d6ae9b705aef498df8ea085b0706d: Status 404 returned error can't find the container with id b05004aa46cbfd66e7d40d38d5aba7aafe0d6ae9b705aef498df8ea085b0706d Apr 16 20:27:38.567937 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.567920 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzp6r" Apr 16 20:27:38.568229 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:38.568206 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd95a170f_35f2_4732_bbb5_f8e2f6768efc.slice/crio-5b186bdf0c5f59f5c3532e23b182fd05be0ff53f5b5988a948ef9a49586d87b4 WatchSource:0}: Error finding container 5b186bdf0c5f59f5c3532e23b182fd05be0ff53f5b5988a948ef9a49586d87b4: Status 404 returned error can't find the container with id 5b186bdf0c5f59f5c3532e23b182fd05be0ff53f5b5988a948ef9a49586d87b4 Apr 16 20:27:38.573871 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:27:38.573848 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73afd102_e8ce_40ea_847a_3f73f708f41d.slice/crio-9e18239d11a56fbf1efb772dc5accde33bf2a414f990cf3d7a686bd9f7eba10a WatchSource:0}: Error finding container 9e18239d11a56fbf1efb772dc5accde33bf2a414f990cf3d7a686bd9f7eba10a: Status 404 returned error can't find the container with id 9e18239d11a56fbf1efb772dc5accde33bf2a414f990cf3d7a686bd9f7eba10a Apr 16 20:27:38.614369 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.614345 2565 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:27:38.775420 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.775339 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90b993f2-207d-4894-bbdf-e2219dbf690b-metrics-certs\") pod \"network-metrics-daemon-jzdqd\" (UID: \"90b993f2-207d-4894-bbdf-e2219dbf690b\") " pod="openshift-multus/network-metrics-daemon-jzdqd" Apr 16 20:27:38.775580 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:38.775495 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:27:38.775580 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:38.775571 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90b993f2-207d-4894-bbdf-e2219dbf690b-metrics-certs podName:90b993f2-207d-4894-bbdf-e2219dbf690b nodeName:}" failed. No retries permitted until 2026-04-16 20:27:39.775550119 +0000 UTC m=+3.043292566 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/90b993f2-207d-4894-bbdf-e2219dbf690b-metrics-certs") pod "network-metrics-daemon-jzdqd" (UID: "90b993f2-207d-4894-bbdf-e2219dbf690b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:27:38.876533 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:38.876322 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4rn7c\" (UniqueName: \"kubernetes.io/projected/ea0bcc7a-3a29-4d83-8da0-05ec471a2764-kube-api-access-4rn7c\") pod \"network-check-target-k9tth\" (UID: \"ea0bcc7a-3a29-4d83-8da0-05ec471a2764\") " pod="openshift-network-diagnostics/network-check-target-k9tth" Apr 16 20:27:38.876533 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:38.876453 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:27:38.876533 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:38.876472 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:27:38.876533 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:38.876486 2565 projected.go:194] Error preparing data for projected volume kube-api-access-4rn7c for pod openshift-network-diagnostics/network-check-target-k9tth: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:27:38.876533 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:38.876545 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ea0bcc7a-3a29-4d83-8da0-05ec471a2764-kube-api-access-4rn7c podName:ea0bcc7a-3a29-4d83-8da0-05ec471a2764 nodeName:}" failed. No retries permitted until 2026-04-16 20:27:39.876526064 +0000 UTC m=+3.144268498 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-4rn7c" (UniqueName: "kubernetes.io/projected/ea0bcc7a-3a29-4d83-8da0-05ec471a2764-kube-api-access-4rn7c") pod "network-check-target-k9tth" (UID: "ea0bcc7a-3a29-4d83-8da0-05ec471a2764") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:27:39.199572 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:39.199478 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 20:22:38 +0000 UTC" deadline="2027-10-02 23:42:28.572289873 +0000 UTC" Apr 16 20:27:39.199572 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:39.199515 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12819h14m49.372778012s" Apr 16 20:27:39.310744 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:39.310651 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9tth" Apr 16 20:27:39.310920 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:39.310763 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k9tth" podUID="ea0bcc7a-3a29-4d83-8da0-05ec471a2764" Apr 16 20:27:39.329629 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:39.329592 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-z6p8j" event={"ID":"d95a170f-35f2-4732-bbb5-f8e2f6768efc","Type":"ContainerStarted","Data":"5b186bdf0c5f59f5c3532e23b182fd05be0ff53f5b5988a948ef9a49586d87b4"} Apr 16 20:27:39.335126 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:39.335096 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-9pkk2" event={"ID":"334e7790-0d97-4116-a3ac-9e7dab8e33e3","Type":"ContainerStarted","Data":"b05004aa46cbfd66e7d40d38d5aba7aafe0d6ae9b705aef498df8ea085b0706d"} Apr 16 20:27:39.340298 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:39.340255 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cb7jp" event={"ID":"dd8e84a9-042c-4346-8ef7-68dcca064683","Type":"ContainerStarted","Data":"424b0de85e4990740bffcf074803c28a44d59f9ba21b3aa7a940f5afa5d3e544"} Apr 16 20:27:39.348346 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:39.348320 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-n2tj2" event={"ID":"72204d28-677e-4d89-a353-b087ce28c38f","Type":"ContainerStarted","Data":"b9dd11823f3760df13fb1003068170fea533de3333919e72c94e46ea26954a8a"} Apr 16 20:27:39.354240 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:39.354029 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzp6r" event={"ID":"73afd102-e8ce-40ea-847a-3f73f708f41d","Type":"ContainerStarted","Data":"9e18239d11a56fbf1efb772dc5accde33bf2a414f990cf3d7a686bd9f7eba10a"} Apr 16 20:27:39.361080 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:39.361053 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/node-resolver-7qfs5" event={"ID":"22753460-b103-4969-8aca-1ea39040795b","Type":"ContainerStarted","Data":"c6abbbdeeffb079a50d4935173481706cdebfe20f99eb3a4d08da39b75edb4d3"} Apr 16 20:27:39.381118 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:39.381093 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-vjq6x" event={"ID":"07a95f45-4b58-43ee-b613-5effa3c9ce71","Type":"ContainerStarted","Data":"4fb94a1877b1c97ab344f206b8f2746af77e74a47a799e254f4db9ab7e241268"} Apr 16 20:27:39.391262 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:39.391238 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" event={"ID":"97e2c184-5abb-438f-8f9b-2df48f93e465","Type":"ContainerStarted","Data":"2fa8f5dac0d5c311b3312471159dc41e85844bab106f467c7bcec9790f68da0d"} Apr 16 20:27:39.395952 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:39.395929 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hthg8" event={"ID":"1aac658f-e4b3-4b53-a125-6cae725d6fcd","Type":"ContainerStarted","Data":"b804b7340ffeaf1d5a3dadec4dfd9feae2e4f2bf2ddc940c6fcad1d47405b00c"} Apr 16 20:27:39.485535 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:39.485459 2565 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:27:39.526226 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:39.526197 2565 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:27:39.783581 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:39.783494 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90b993f2-207d-4894-bbdf-e2219dbf690b-metrics-certs\") pod \"network-metrics-daemon-jzdqd\" (UID: \"90b993f2-207d-4894-bbdf-e2219dbf690b\") " 
pod="openshift-multus/network-metrics-daemon-jzdqd" Apr 16 20:27:39.791477 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:39.791433 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:27:39.791651 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:39.791540 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90b993f2-207d-4894-bbdf-e2219dbf690b-metrics-certs podName:90b993f2-207d-4894-bbdf-e2219dbf690b nodeName:}" failed. No retries permitted until 2026-04-16 20:27:41.791518739 +0000 UTC m=+5.059261177 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/90b993f2-207d-4894-bbdf-e2219dbf690b-metrics-certs") pod "network-metrics-daemon-jzdqd" (UID: "90b993f2-207d-4894-bbdf-e2219dbf690b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:27:39.884096 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:39.884055 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4rn7c\" (UniqueName: \"kubernetes.io/projected/ea0bcc7a-3a29-4d83-8da0-05ec471a2764-kube-api-access-4rn7c\") pod \"network-check-target-k9tth\" (UID: \"ea0bcc7a-3a29-4d83-8da0-05ec471a2764\") " pod="openshift-network-diagnostics/network-check-target-k9tth" Apr 16 20:27:39.884296 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:39.884221 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:27:39.884296 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:39.884242 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:27:39.884296 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:39.884255 
2565 projected.go:194] Error preparing data for projected volume kube-api-access-4rn7c for pod openshift-network-diagnostics/network-check-target-k9tth: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:27:39.884469 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:39.884325 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ea0bcc7a-3a29-4d83-8da0-05ec471a2764-kube-api-access-4rn7c podName:ea0bcc7a-3a29-4d83-8da0-05ec471a2764 nodeName:}" failed. No retries permitted until 2026-04-16 20:27:41.884307059 +0000 UTC m=+5.152049496 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-4rn7c" (UniqueName: "kubernetes.io/projected/ea0bcc7a-3a29-4d83-8da0-05ec471a2764-kube-api-access-4rn7c") pod "network-check-target-k9tth" (UID: "ea0bcc7a-3a29-4d83-8da0-05ec471a2764") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:27:40.200299 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:40.200189 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 20:22:38 +0000 UTC" deadline="2027-09-09 22:27:49.218324366 +0000 UTC" Apr 16 20:27:40.200299 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:40.200230 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12266h0m9.018097974s" Apr 16 20:27:40.308136 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:40.308105 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jzdqd" Apr 16 20:27:40.308341 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:40.308245 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jzdqd" podUID="90b993f2-207d-4894-bbdf-e2219dbf690b" Apr 16 20:27:40.870450 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:40.870228 2565 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:27:41.310828 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:41.310303 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9tth" Apr 16 20:27:41.310828 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:41.310424 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-k9tth" podUID="ea0bcc7a-3a29-4d83-8da0-05ec471a2764" Apr 16 20:27:41.799105 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:41.799013 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90b993f2-207d-4894-bbdf-e2219dbf690b-metrics-certs\") pod \"network-metrics-daemon-jzdqd\" (UID: \"90b993f2-207d-4894-bbdf-e2219dbf690b\") " pod="openshift-multus/network-metrics-daemon-jzdqd" Apr 16 20:27:41.799264 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:41.799159 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:27:41.799264 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:41.799229 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90b993f2-207d-4894-bbdf-e2219dbf690b-metrics-certs podName:90b993f2-207d-4894-bbdf-e2219dbf690b nodeName:}" failed. No retries permitted until 2026-04-16 20:27:45.799206436 +0000 UTC m=+9.066948880 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/90b993f2-207d-4894-bbdf-e2219dbf690b-metrics-certs") pod "network-metrics-daemon-jzdqd" (UID: "90b993f2-207d-4894-bbdf-e2219dbf690b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:27:41.900864 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:41.900172 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4rn7c\" (UniqueName: \"kubernetes.io/projected/ea0bcc7a-3a29-4d83-8da0-05ec471a2764-kube-api-access-4rn7c\") pod \"network-check-target-k9tth\" (UID: \"ea0bcc7a-3a29-4d83-8da0-05ec471a2764\") " pod="openshift-network-diagnostics/network-check-target-k9tth" Apr 16 20:27:41.900864 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:41.900404 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:27:41.900864 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:41.900427 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:27:41.900864 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:41.900440 2565 projected.go:194] Error preparing data for projected volume kube-api-access-4rn7c for pod openshift-network-diagnostics/network-check-target-k9tth: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:27:41.900864 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:41.900503 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ea0bcc7a-3a29-4d83-8da0-05ec471a2764-kube-api-access-4rn7c podName:ea0bcc7a-3a29-4d83-8da0-05ec471a2764 nodeName:}" failed. 
No retries permitted until 2026-04-16 20:27:45.900482961 +0000 UTC m=+9.168225409 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-4rn7c" (UniqueName: "kubernetes.io/projected/ea0bcc7a-3a29-4d83-8da0-05ec471a2764-kube-api-access-4rn7c") pod "network-check-target-k9tth" (UID: "ea0bcc7a-3a29-4d83-8da0-05ec471a2764") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:27:42.308840 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:42.308326 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jzdqd" Apr 16 20:27:42.308840 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:42.308461 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jzdqd" podUID="90b993f2-207d-4894-bbdf-e2219dbf690b" Apr 16 20:27:43.308260 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:43.308220 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9tth" Apr 16 20:27:43.308828 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:43.308388 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-k9tth" podUID="ea0bcc7a-3a29-4d83-8da0-05ec471a2764" Apr 16 20:27:44.308247 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:44.307893 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jzdqd" Apr 16 20:27:44.308247 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:44.308031 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jzdqd" podUID="90b993f2-207d-4894-bbdf-e2219dbf690b" Apr 16 20:27:45.307853 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:45.307475 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9tth" Apr 16 20:27:45.307853 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:45.307617 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-k9tth" podUID="ea0bcc7a-3a29-4d83-8da0-05ec471a2764" Apr 16 20:27:45.831344 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:45.830793 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90b993f2-207d-4894-bbdf-e2219dbf690b-metrics-certs\") pod \"network-metrics-daemon-jzdqd\" (UID: \"90b993f2-207d-4894-bbdf-e2219dbf690b\") " pod="openshift-multus/network-metrics-daemon-jzdqd" Apr 16 20:27:45.831344 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:45.830931 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:27:45.831344 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:45.830993 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90b993f2-207d-4894-bbdf-e2219dbf690b-metrics-certs podName:90b993f2-207d-4894-bbdf-e2219dbf690b nodeName:}" failed. No retries permitted until 2026-04-16 20:27:53.830973271 +0000 UTC m=+17.098715753 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/90b993f2-207d-4894-bbdf-e2219dbf690b-metrics-certs") pod "network-metrics-daemon-jzdqd" (UID: "90b993f2-207d-4894-bbdf-e2219dbf690b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:27:45.931609 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:45.931574 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4rn7c\" (UniqueName: \"kubernetes.io/projected/ea0bcc7a-3a29-4d83-8da0-05ec471a2764-kube-api-access-4rn7c\") pod \"network-check-target-k9tth\" (UID: \"ea0bcc7a-3a29-4d83-8da0-05ec471a2764\") " pod="openshift-network-diagnostics/network-check-target-k9tth" Apr 16 20:27:45.931797 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:45.931767 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:27:45.931797 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:45.931788 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:27:45.931918 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:45.931800 2565 projected.go:194] Error preparing data for projected volume kube-api-access-4rn7c for pod openshift-network-diagnostics/network-check-target-k9tth: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:27:45.931918 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:45.931855 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ea0bcc7a-3a29-4d83-8da0-05ec471a2764-kube-api-access-4rn7c podName:ea0bcc7a-3a29-4d83-8da0-05ec471a2764 nodeName:}" failed. 
No retries permitted until 2026-04-16 20:27:53.931838209 +0000 UTC m=+17.199580644 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-4rn7c" (UniqueName: "kubernetes.io/projected/ea0bcc7a-3a29-4d83-8da0-05ec471a2764-kube-api-access-4rn7c") pod "network-check-target-k9tth" (UID: "ea0bcc7a-3a29-4d83-8da0-05ec471a2764") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:27:46.308286 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:46.308240 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jzdqd" Apr 16 20:27:46.308431 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:46.308401 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jzdqd" podUID="90b993f2-207d-4894-bbdf-e2219dbf690b" Apr 16 20:27:47.308725 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:47.308690 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9tth" Apr 16 20:27:47.309152 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:47.308793 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-k9tth" podUID="ea0bcc7a-3a29-4d83-8da0-05ec471a2764" Apr 16 20:27:48.307737 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:48.307702 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jzdqd" Apr 16 20:27:48.307907 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:48.307814 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jzdqd" podUID="90b993f2-207d-4894-bbdf-e2219dbf690b" Apr 16 20:27:49.308419 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:49.308387 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9tth" Apr 16 20:27:49.308843 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:49.308496 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k9tth" podUID="ea0bcc7a-3a29-4d83-8da0-05ec471a2764" Apr 16 20:27:50.308085 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:50.308058 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jzdqd" Apr 16 20:27:50.308240 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:50.308169 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jzdqd" podUID="90b993f2-207d-4894-bbdf-e2219dbf690b" Apr 16 20:27:51.308163 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:51.308126 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9tth" Apr 16 20:27:51.308592 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:51.308243 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k9tth" podUID="ea0bcc7a-3a29-4d83-8da0-05ec471a2764" Apr 16 20:27:52.307681 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:52.307643 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jzdqd" Apr 16 20:27:52.307861 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:52.307795 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jzdqd" podUID="90b993f2-207d-4894-bbdf-e2219dbf690b" Apr 16 20:27:53.307896 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:53.307855 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9tth" Apr 16 20:27:53.308388 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:53.307967 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k9tth" podUID="ea0bcc7a-3a29-4d83-8da0-05ec471a2764" Apr 16 20:27:53.896485 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:53.896456 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90b993f2-207d-4894-bbdf-e2219dbf690b-metrics-certs\") pod \"network-metrics-daemon-jzdqd\" (UID: \"90b993f2-207d-4894-bbdf-e2219dbf690b\") " pod="openshift-multus/network-metrics-daemon-jzdqd" Apr 16 20:27:53.896690 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:53.896577 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:27:53.896690 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:53.896630 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90b993f2-207d-4894-bbdf-e2219dbf690b-metrics-certs podName:90b993f2-207d-4894-bbdf-e2219dbf690b nodeName:}" failed. No retries permitted until 2026-04-16 20:28:09.89661381 +0000 UTC m=+33.164356242 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/90b993f2-207d-4894-bbdf-e2219dbf690b-metrics-certs") pod "network-metrics-daemon-jzdqd" (UID: "90b993f2-207d-4894-bbdf-e2219dbf690b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:27:53.997556 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:53.997522 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4rn7c\" (UniqueName: \"kubernetes.io/projected/ea0bcc7a-3a29-4d83-8da0-05ec471a2764-kube-api-access-4rn7c\") pod \"network-check-target-k9tth\" (UID: \"ea0bcc7a-3a29-4d83-8da0-05ec471a2764\") " pod="openshift-network-diagnostics/network-check-target-k9tth" Apr 16 20:27:53.997701 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:53.997681 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:27:53.997701 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:53.997700 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:27:53.997815 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:53.997710 2565 projected.go:194] Error preparing data for projected volume kube-api-access-4rn7c for pod openshift-network-diagnostics/network-check-target-k9tth: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:27:53.997815 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:53.997760 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ea0bcc7a-3a29-4d83-8da0-05ec471a2764-kube-api-access-4rn7c podName:ea0bcc7a-3a29-4d83-8da0-05ec471a2764 nodeName:}" failed. 
No retries permitted until 2026-04-16 20:28:09.997742766 +0000 UTC m=+33.265485197 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-4rn7c" (UniqueName: "kubernetes.io/projected/ea0bcc7a-3a29-4d83-8da0-05ec471a2764-kube-api-access-4rn7c") pod "network-check-target-k9tth" (UID: "ea0bcc7a-3a29-4d83-8da0-05ec471a2764") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:27:54.308164 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:54.308133 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jzdqd" Apr 16 20:27:54.308636 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:54.308252 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jzdqd" podUID="90b993f2-207d-4894-bbdf-e2219dbf690b" Apr 16 20:27:55.308144 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:55.308101 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9tth" Apr 16 20:27:55.308356 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:55.308237 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-k9tth" podUID="ea0bcc7a-3a29-4d83-8da0-05ec471a2764" Apr 16 20:27:56.307660 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:56.307627 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jzdqd" Apr 16 20:27:56.307842 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:56.307756 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jzdqd" podUID="90b993f2-207d-4894-bbdf-e2219dbf690b" Apr 16 20:27:57.308920 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:57.308781 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9tth" Apr 16 20:27:57.309556 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:57.308995 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-k9tth" podUID="ea0bcc7a-3a29-4d83-8da0-05ec471a2764" Apr 16 20:27:57.433336 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:57.432981 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-150.ec2.internal" event={"ID":"babf34b6062f196401e1c6b676b7c7db","Type":"ContainerStarted","Data":"38e130e5f877a27d41e3ca869fe540f7e982b459aee6872f0b3ccaf7969df66c"} Apr 16 20:27:57.435332 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:57.435263 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cb7jp" event={"ID":"dd8e84a9-042c-4346-8ef7-68dcca064683","Type":"ContainerStarted","Data":"e2962bfafc24cd412b7c8934020c24a49d6102eb2de697c3f7279c5fc6030117"} Apr 16 20:27:57.437441 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:57.437412 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-vjq6x" event={"ID":"07a95f45-4b58-43ee-b613-5effa3c9ce71","Type":"ContainerStarted","Data":"fa5941301f3466e86f5736f66f6a79cf154e4582b877564404f1a1e939359c9a"} Apr 16 20:27:57.442308 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:57.442265 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" event={"ID":"97e2c184-5abb-438f-8f9b-2df48f93e465","Type":"ContainerStarted","Data":"c598cc5aafeb2ae44c23bedf5247004539ded0c75d837a7693045124ef0a5e29"} Apr 16 20:27:57.442414 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:57.442318 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" event={"ID":"97e2c184-5abb-438f-8f9b-2df48f93e465","Type":"ContainerStarted","Data":"d2b088edd88bf44f2dcc80c5c6594b32134630f9324beda943ebf34f60de2c7a"} Apr 16 20:27:57.442414 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:57.442333 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" 
event={"ID":"97e2c184-5abb-438f-8f9b-2df48f93e465","Type":"ContainerStarted","Data":"2c4e98c2b2ee0a735ac479b8a254a9000ef428d111a9ff14b9185e8c0cfd3e6b"} Apr 16 20:27:57.442414 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:57.442346 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" event={"ID":"97e2c184-5abb-438f-8f9b-2df48f93e465","Type":"ContainerStarted","Data":"fd9e5f02b0ba8261a78175423ec0053c1ac1ca459a278b5b299659b2c0dbeb93"} Apr 16 20:27:57.442414 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:57.442359 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" event={"ID":"97e2c184-5abb-438f-8f9b-2df48f93e465","Type":"ContainerStarted","Data":"d177536ed57190e188b835baf2dd6713bed741d243d468ede34fae02d7f1eafe"} Apr 16 20:27:57.446028 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:57.445957 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-150.ec2.internal" podStartSLOduration=19.445942314 podStartE2EDuration="19.445942314s" podCreationTimestamp="2026-04-16 20:27:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:27:57.445631004 +0000 UTC m=+20.713373457" watchObservedRunningTime="2026-04-16 20:27:57.445942314 +0000 UTC m=+20.713684767" Apr 16 20:27:57.461742 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:57.461705 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-vjq6x" podStartSLOduration=2.53438518 podStartE2EDuration="20.461696418s" podCreationTimestamp="2026-04-16 20:27:37 +0000 UTC" firstStartedPulling="2026-04-16 20:27:38.509716861 +0000 UTC m=+1.777459292" lastFinishedPulling="2026-04-16 20:27:56.437028092 +0000 UTC m=+19.704770530" observedRunningTime="2026-04-16 20:27:57.461064146 +0000 UTC 
m=+20.728806604" watchObservedRunningTime="2026-04-16 20:27:57.461696418 +0000 UTC m=+20.729438870" Apr 16 20:27:57.479994 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:57.479949 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-cb7jp" podStartSLOduration=2.319625085 podStartE2EDuration="20.479936899s" podCreationTimestamp="2026-04-16 20:27:37 +0000 UTC" firstStartedPulling="2026-04-16 20:27:38.558141548 +0000 UTC m=+1.825883982" lastFinishedPulling="2026-04-16 20:27:56.718453361 +0000 UTC m=+19.986195796" observedRunningTime="2026-04-16 20:27:57.47963026 +0000 UTC m=+20.747372727" watchObservedRunningTime="2026-04-16 20:27:57.479936899 +0000 UTC m=+20.747679353" Apr 16 20:27:58.307780 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:58.307499 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jzdqd" Apr 16 20:27:58.307914 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:58.307858 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jzdqd" podUID="90b993f2-207d-4894-bbdf-e2219dbf690b" Apr 16 20:27:58.445315 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:58.445260 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-n2tj2" event={"ID":"72204d28-677e-4d89-a353-b087ce28c38f","Type":"ContainerStarted","Data":"a84eed0057a15da94e47346c7cb858aab65f63dd9c8e3aa8a0376fbfda34d731"} Apr 16 20:27:58.446512 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:58.446485 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzp6r" event={"ID":"73afd102-e8ce-40ea-847a-3f73f708f41d","Type":"ContainerStarted","Data":"02cb3244b32da30af2f4433b45613e0cff6fefe599d473442aa355e696df20ef"} Apr 16 20:27:58.448846 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:58.448817 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7qfs5" event={"ID":"22753460-b103-4969-8aca-1ea39040795b","Type":"ContainerStarted","Data":"d0b383922aadc67a27998647c4a50562a53feedc8b38a9b2f551aceadfa94013"} Apr 16 20:27:58.451772 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:58.451745 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" event={"ID":"97e2c184-5abb-438f-8f9b-2df48f93e465","Type":"ContainerStarted","Data":"09dae9ed6b8c9ee4a0e045990047f1fc36c964e05fc2409f1f749798563b237a"} Apr 16 20:27:58.452943 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:58.452921 2565 generic.go:358] "Generic (PLEG): container finished" podID="1aac658f-e4b3-4b53-a125-6cae725d6fcd" containerID="699873d70b356f5c717c1337fbdddf858101689da0d813782b4b8d823a33246b" exitCode=0 Apr 16 20:27:58.453032 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:58.452979 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hthg8" 
event={"ID":"1aac658f-e4b3-4b53-a125-6cae725d6fcd","Type":"ContainerDied","Data":"699873d70b356f5c717c1337fbdddf858101689da0d813782b4b8d823a33246b"} Apr 16 20:27:58.454301 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:58.454245 2565 generic.go:358] "Generic (PLEG): container finished" podID="15450d4a1cffd30a1009fca9edec82d3" containerID="51d7c60615b625ae5439fae01763de4ed3c4727464fefdc4f5a6cef5e1104900" exitCode=0 Apr 16 20:27:58.454359 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:58.454333 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-150.ec2.internal" event={"ID":"15450d4a1cffd30a1009fca9edec82d3","Type":"ContainerDied","Data":"51d7c60615b625ae5439fae01763de4ed3c4727464fefdc4f5a6cef5e1104900"} Apr 16 20:27:58.455749 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:58.455709 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-z6p8j" event={"ID":"d95a170f-35f2-4732-bbb5-f8e2f6768efc","Type":"ContainerStarted","Data":"915ca9e41623e7afa9fa182f859f66403acd48ae1afd4b4b3fabf8929f8704ba"} Apr 16 20:27:58.457000 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:58.456976 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-9pkk2" event={"ID":"334e7790-0d97-4116-a3ac-9e7dab8e33e3","Type":"ContainerStarted","Data":"7f2555b835f44964e2e82b25ed0f95ba26f08862e307e33247c3e447b859a43c"} Apr 16 20:27:58.472017 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:58.471981 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-n2tj2" podStartSLOduration=3.58480428 podStartE2EDuration="21.471967301s" podCreationTimestamp="2026-04-16 20:27:37 +0000 UTC" firstStartedPulling="2026-04-16 20:27:38.549862063 +0000 UTC m=+1.817604494" lastFinishedPulling="2026-04-16 20:27:56.437025081 +0000 UTC m=+19.704767515" observedRunningTime="2026-04-16 20:27:58.471406988 
+0000 UTC m=+21.739149440" watchObservedRunningTime="2026-04-16 20:27:58.471967301 +0000 UTC m=+21.739709799" Apr 16 20:27:58.493337 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:58.493200 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-z6p8j" podStartSLOduration=3.627094615 podStartE2EDuration="21.493184562s" podCreationTimestamp="2026-04-16 20:27:37 +0000 UTC" firstStartedPulling="2026-04-16 20:27:38.570930706 +0000 UTC m=+1.838673137" lastFinishedPulling="2026-04-16 20:27:56.437020649 +0000 UTC m=+19.704763084" observedRunningTime="2026-04-16 20:27:58.492812267 +0000 UTC m=+21.760554721" watchObservedRunningTime="2026-04-16 20:27:58.493184562 +0000 UTC m=+21.760927016" Apr 16 20:27:58.520572 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:58.517930 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-7qfs5" podStartSLOduration=3.609974613 podStartE2EDuration="21.51791556s" podCreationTimestamp="2026-04-16 20:27:37 +0000 UTC" firstStartedPulling="2026-04-16 20:27:38.529037837 +0000 UTC m=+1.796780268" lastFinishedPulling="2026-04-16 20:27:56.43697877 +0000 UTC m=+19.704721215" observedRunningTime="2026-04-16 20:27:58.517860593 +0000 UTC m=+21.785603047" watchObservedRunningTime="2026-04-16 20:27:58.51791556 +0000 UTC m=+21.785658013" Apr 16 20:27:58.568524 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:58.568441 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-9pkk2" podStartSLOduration=3.659625346 podStartE2EDuration="21.5684267s" podCreationTimestamp="2026-04-16 20:27:37 +0000 UTC" firstStartedPulling="2026-04-16 20:27:38.565529086 +0000 UTC m=+1.833271520" lastFinishedPulling="2026-04-16 20:27:56.474330436 +0000 UTC m=+19.742072874" observedRunningTime="2026-04-16 20:27:58.567966332 +0000 UTC m=+21.835708785" watchObservedRunningTime="2026-04-16 20:27:58.5684267 +0000 
UTC m=+21.836169153" Apr 16 20:27:58.626911 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:58.626885 2565 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 20:27:59.232205 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:59.232068 2565 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T20:27:58.626904986Z","UUID":"9f0e8f4c-1b82-480d-8fff-22e9c51fcebe","Handler":null,"Name":"","Endpoint":""} Apr 16 20:27:59.235701 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:59.235678 2565 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 20:27:59.235827 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:59.235708 2565 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 20:27:59.308105 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:59.308062 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9tth" Apr 16 20:27:59.308259 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:27:59.308191 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-k9tth" podUID="ea0bcc7a-3a29-4d83-8da0-05ec471a2764" Apr 16 20:27:59.460786 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:59.460744 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzp6r" event={"ID":"73afd102-e8ce-40ea-847a-3f73f708f41d","Type":"ContainerStarted","Data":"c4abca3fb72bd471015dc0fb31e61e939bd3e60fc5feac3203b1b1033e17af9d"} Apr 16 20:27:59.462470 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:59.462431 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-150.ec2.internal" event={"ID":"15450d4a1cffd30a1009fca9edec82d3","Type":"ContainerStarted","Data":"2a9422d6694cfeca8b87599a6426623bf96875b7d3f49a8fdc8ffd0892cfd3ab"} Apr 16 20:27:59.476691 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:27:59.476651 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-150.ec2.internal" podStartSLOduration=21.476639129 podStartE2EDuration="21.476639129s" podCreationTimestamp="2026-04-16 20:27:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:27:59.476391329 +0000 UTC m=+22.744133786" watchObservedRunningTime="2026-04-16 20:27:59.476639129 +0000 UTC m=+22.744381582" Apr 16 20:28:00.307863 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:00.307831 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jzdqd" Apr 16 20:28:00.308045 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:00.307979 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jzdqd" podUID="90b993f2-207d-4894-bbdf-e2219dbf690b" Apr 16 20:28:00.466539 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:00.466491 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzp6r" event={"ID":"73afd102-e8ce-40ea-847a-3f73f708f41d","Type":"ContainerStarted","Data":"a8837cf8557b8e3d739fcc35745cd2beeaa6a0eab70a42bbbb40a321d2b23c81"} Apr 16 20:28:00.469907 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:00.469876 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" event={"ID":"97e2c184-5abb-438f-8f9b-2df48f93e465","Type":"ContainerStarted","Data":"83de92760d66284c375d775e4c019fd6392b6a98da8740eebe259380675c788f"} Apr 16 20:28:00.483861 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:00.483819 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rzp6r" podStartSLOduration=2.574739541 podStartE2EDuration="23.483805578s" podCreationTimestamp="2026-04-16 20:27:37 +0000 UTC" firstStartedPulling="2026-04-16 20:27:38.575140115 +0000 UTC m=+1.842882551" lastFinishedPulling="2026-04-16 20:27:59.484206142 +0000 UTC m=+22.751948588" observedRunningTime="2026-04-16 20:28:00.483373601 +0000 UTC m=+23.751116056" watchObservedRunningTime="2026-04-16 20:28:00.483805578 +0000 UTC m=+23.751548035" Apr 16 20:28:01.308578 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:01.308352 2565 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9tth" Apr 16 20:28:01.308759 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:01.308684 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k9tth" podUID="ea0bcc7a-3a29-4d83-8da0-05ec471a2764" Apr 16 20:28:02.308303 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:02.308262 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jzdqd" Apr 16 20:28:02.308710 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:02.308387 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jzdqd" podUID="90b993f2-207d-4894-bbdf-e2219dbf690b" Apr 16 20:28:02.841095 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:02.840905 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-z6p8j" Apr 16 20:28:02.841501 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:02.841484 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-z6p8j" Apr 16 20:28:03.307592 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:03.307572 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9tth" Apr 16 20:28:03.307715 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:03.307655 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k9tth" podUID="ea0bcc7a-3a29-4d83-8da0-05ec471a2764" Apr 16 20:28:03.477052 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:03.477023 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" event={"ID":"97e2c184-5abb-438f-8f9b-2df48f93e465","Type":"ContainerStarted","Data":"de6b58d10cc4dc712023a8406d50c9e4eb083e96b114b83e42ef6d931c54ee19"} Apr 16 20:28:03.477684 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:03.477324 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:28:03.478618 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:03.478592 2565 generic.go:358] "Generic (PLEG): container finished" podID="1aac658f-e4b3-4b53-a125-6cae725d6fcd" containerID="5a4add420d83d9c0386cb9c00993c8da7fb3b401218e8eb66a8dbfba9605afee" exitCode=0 Apr 16 20:28:03.478729 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:03.478659 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hthg8" event={"ID":"1aac658f-e4b3-4b53-a125-6cae725d6fcd","Type":"ContainerDied","Data":"5a4add420d83d9c0386cb9c00993c8da7fb3b401218e8eb66a8dbfba9605afee"} Apr 16 20:28:03.478867 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:03.478848 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-z6p8j" Apr 16 20:28:03.479492 ip-10-0-139-150 
kubenswrapper[2565]: I0416 20:28:03.479349 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-z6p8j" Apr 16 20:28:03.491859 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:03.491840 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:28:03.501917 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:03.501883 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" podStartSLOduration=7.912447437 podStartE2EDuration="26.501872745s" podCreationTimestamp="2026-04-16 20:27:37 +0000 UTC" firstStartedPulling="2026-04-16 20:27:38.505379916 +0000 UTC m=+1.773122347" lastFinishedPulling="2026-04-16 20:27:57.094805209 +0000 UTC m=+20.362547655" observedRunningTime="2026-04-16 20:28:03.501364324 +0000 UTC m=+26.769106782" watchObservedRunningTime="2026-04-16 20:28:03.501872745 +0000 UTC m=+26.769615198" Apr 16 20:28:04.307694 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:04.307672 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jzdqd" Apr 16 20:28:04.307815 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:04.307772 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jzdqd" podUID="90b993f2-207d-4894-bbdf-e2219dbf690b" Apr 16 20:28:04.442119 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:04.442092 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-k9tth"] Apr 16 20:28:04.442270 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:04.442209 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9tth" Apr 16 20:28:04.442376 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:04.442304 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k9tth" podUID="ea0bcc7a-3a29-4d83-8da0-05ec471a2764" Apr 16 20:28:04.447038 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:04.447013 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jzdqd"] Apr 16 20:28:04.482813 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:04.482744 2565 generic.go:358] "Generic (PLEG): container finished" podID="1aac658f-e4b3-4b53-a125-6cae725d6fcd" containerID="2f5c530a9a8c6c3a660517ce30718cdda7a9c2fc71e981a1e0457b144f29e057" exitCode=0 Apr 16 20:28:04.483201 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:04.482837 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hthg8" event={"ID":"1aac658f-e4b3-4b53-a125-6cae725d6fcd","Type":"ContainerDied","Data":"2f5c530a9a8c6c3a660517ce30718cdda7a9c2fc71e981a1e0457b144f29e057"} Apr 16 20:28:04.483201 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:04.483171 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:28:04.483734 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:04.483622 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jzdqd" Apr 16 20:28:04.483797 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:04.483735 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jzdqd" podUID="90b993f2-207d-4894-bbdf-e2219dbf690b" Apr 16 20:28:04.484004 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:04.483989 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:28:04.498619 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:04.498598 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-59rwg" Apr 16 20:28:05.486544 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:05.486507 2565 generic.go:358] "Generic (PLEG): container finished" podID="1aac658f-e4b3-4b53-a125-6cae725d6fcd" containerID="4b77a3bb1103b65b0ff22bcd9feadb43d3157f095402b7e0021f418e1f5afa07" exitCode=0 Apr 16 20:28:05.486982 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:05.486553 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hthg8" event={"ID":"1aac658f-e4b3-4b53-a125-6cae725d6fcd","Type":"ContainerDied","Data":"4b77a3bb1103b65b0ff22bcd9feadb43d3157f095402b7e0021f418e1f5afa07"} Apr 16 20:28:06.307671 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:06.307598 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9tth" Apr 16 20:28:06.307823 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:06.307608 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jzdqd" Apr 16 20:28:06.307823 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:06.307739 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k9tth" podUID="ea0bcc7a-3a29-4d83-8da0-05ec471a2764" Apr 16 20:28:06.307823 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:06.307795 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jzdqd" podUID="90b993f2-207d-4894-bbdf-e2219dbf690b" Apr 16 20:28:08.308406 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:08.308218 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jzdqd" Apr 16 20:28:08.308806 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:08.308232 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9tth" Apr 16 20:28:08.308806 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:08.308496 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jzdqd" podUID="90b993f2-207d-4894-bbdf-e2219dbf690b" Apr 16 20:28:08.308806 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:08.308544 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k9tth" podUID="ea0bcc7a-3a29-4d83-8da0-05ec471a2764" Apr 16 20:28:09.913504 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:09.913474 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90b993f2-207d-4894-bbdf-e2219dbf690b-metrics-certs\") pod \"network-metrics-daemon-jzdqd\" (UID: \"90b993f2-207d-4894-bbdf-e2219dbf690b\") " pod="openshift-multus/network-metrics-daemon-jzdqd" Apr 16 20:28:09.914050 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:09.913608 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:28:09.914050 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:09.913658 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90b993f2-207d-4894-bbdf-e2219dbf690b-metrics-certs podName:90b993f2-207d-4894-bbdf-e2219dbf690b nodeName:}" failed. 
No retries permitted until 2026-04-16 20:28:41.913645524 +0000 UTC m=+65.181387955 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/90b993f2-207d-4894-bbdf-e2219dbf690b-metrics-certs") pod "network-metrics-daemon-jzdqd" (UID: "90b993f2-207d-4894-bbdf-e2219dbf690b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:28:10.014586 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.014550 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4rn7c\" (UniqueName: \"kubernetes.io/projected/ea0bcc7a-3a29-4d83-8da0-05ec471a2764-kube-api-access-4rn7c\") pod \"network-check-target-k9tth\" (UID: \"ea0bcc7a-3a29-4d83-8da0-05ec471a2764\") " pod="openshift-network-diagnostics/network-check-target-k9tth" Apr 16 20:28:10.014733 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:10.014712 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:28:10.014733 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:10.014731 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:28:10.014853 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:10.014744 2565 projected.go:194] Error preparing data for projected volume kube-api-access-4rn7c for pod openshift-network-diagnostics/network-check-target-k9tth: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:28:10.014853 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:10.014804 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ea0bcc7a-3a29-4d83-8da0-05ec471a2764-kube-api-access-4rn7c 
podName:ea0bcc7a-3a29-4d83-8da0-05ec471a2764 nodeName:}" failed. No retries permitted until 2026-04-16 20:28:42.014783469 +0000 UTC m=+65.282525902 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-4rn7c" (UniqueName: "kubernetes.io/projected/ea0bcc7a-3a29-4d83-8da0-05ec471a2764-kube-api-access-4rn7c") pod "network-check-target-k9tth" (UID: "ea0bcc7a-3a29-4d83-8da0-05ec471a2764") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:28:10.098483 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.098454 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-150.ec2.internal" event="NodeReady" Apr 16 20:28:10.098639 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.098590 2565 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 20:28:10.140045 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.140015 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-9gqr8"] Apr 16 20:28:10.181149 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.181077 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-bcshd"] Apr 16 20:28:10.181311 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.181248 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-9gqr8" Apr 16 20:28:10.183874 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.183853 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 20:28:10.184001 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.183854 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 20:28:10.184001 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.183936 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-w999j\"" Apr 16 20:28:10.208048 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.208022 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bcshd"] Apr 16 20:28:10.208048 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.208048 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9gqr8"] Apr 16 20:28:10.208190 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.208146 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bcshd"
Apr 16 20:28:10.210841 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.210761 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 20:28:10.210841 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.210770 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 20:28:10.211026 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.210894 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-2cgzs\""
Apr 16 20:28:10.211084 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.211040 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 20:28:10.307995 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.307966 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9tth"
Apr 16 20:28:10.308156 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.307968 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jzdqd"
Apr 16 20:28:10.310989 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.310807 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 20:28:10.310989 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.310848 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 20:28:10.310989 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.310871 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-v8bd2\""
Apr 16 20:28:10.310989 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.310860 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jwldw\""
Apr 16 20:28:10.310989 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.310911 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 20:28:10.315854 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.315826 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/394dcb43-4d46-4c81-bcad-73d0aadfc01c-cert\") pod \"ingress-canary-bcshd\" (UID: \"394dcb43-4d46-4c81-bcad-73d0aadfc01c\") " pod="openshift-ingress-canary/ingress-canary-bcshd"
Apr 16 20:28:10.315978 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.315927 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ec99b398-3371-4b5d-b2f7-ce06fed2c67c-config-volume\") pod \"dns-default-9gqr8\" (UID: \"ec99b398-3371-4b5d-b2f7-ce06fed2c67c\") " pod="openshift-dns/dns-default-9gqr8"
Apr 16 20:28:10.316038 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.315990 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffwg9\" (UniqueName: \"kubernetes.io/projected/394dcb43-4d46-4c81-bcad-73d0aadfc01c-kube-api-access-ffwg9\") pod \"ingress-canary-bcshd\" (UID: \"394dcb43-4d46-4c81-bcad-73d0aadfc01c\") " pod="openshift-ingress-canary/ingress-canary-bcshd"
Apr 16 20:28:10.316038 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.316033 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52vbf\" (UniqueName: \"kubernetes.io/projected/ec99b398-3371-4b5d-b2f7-ce06fed2c67c-kube-api-access-52vbf\") pod \"dns-default-9gqr8\" (UID: \"ec99b398-3371-4b5d-b2f7-ce06fed2c67c\") " pod="openshift-dns/dns-default-9gqr8"
Apr 16 20:28:10.316128 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.316067 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ec99b398-3371-4b5d-b2f7-ce06fed2c67c-metrics-tls\") pod \"dns-default-9gqr8\" (UID: \"ec99b398-3371-4b5d-b2f7-ce06fed2c67c\") " pod="openshift-dns/dns-default-9gqr8"
Apr 16 20:28:10.316128 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.316090 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ec99b398-3371-4b5d-b2f7-ce06fed2c67c-tmp-dir\") pod \"dns-default-9gqr8\" (UID: \"ec99b398-3371-4b5d-b2f7-ce06fed2c67c\") " pod="openshift-dns/dns-default-9gqr8"
Apr 16 20:28:10.416752 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.416722 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ec99b398-3371-4b5d-b2f7-ce06fed2c67c-metrics-tls\") pod \"dns-default-9gqr8\" (UID: \"ec99b398-3371-4b5d-b2f7-ce06fed2c67c\") " pod="openshift-dns/dns-default-9gqr8"
Apr 16 20:28:10.416752 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.416758 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ec99b398-3371-4b5d-b2f7-ce06fed2c67c-tmp-dir\") pod \"dns-default-9gqr8\" (UID: \"ec99b398-3371-4b5d-b2f7-ce06fed2c67c\") " pod="openshift-dns/dns-default-9gqr8"
Apr 16 20:28:10.416960 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.416779 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/394dcb43-4d46-4c81-bcad-73d0aadfc01c-cert\") pod \"ingress-canary-bcshd\" (UID: \"394dcb43-4d46-4c81-bcad-73d0aadfc01c\") " pod="openshift-ingress-canary/ingress-canary-bcshd"
Apr 16 20:28:10.416960 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:10.416860 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:28:10.416960 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.416872 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ec99b398-3371-4b5d-b2f7-ce06fed2c67c-config-volume\") pod \"dns-default-9gqr8\" (UID: \"ec99b398-3371-4b5d-b2f7-ce06fed2c67c\") " pod="openshift-dns/dns-default-9gqr8"
Apr 16 20:28:10.416960 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:10.416926 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/394dcb43-4d46-4c81-bcad-73d0aadfc01c-cert podName:394dcb43-4d46-4c81-bcad-73d0aadfc01c nodeName:}" failed. No retries permitted until 2026-04-16 20:28:10.916907617 +0000 UTC m=+34.184650071 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/394dcb43-4d46-4c81-bcad-73d0aadfc01c-cert") pod "ingress-canary-bcshd" (UID: "394dcb43-4d46-4c81-bcad-73d0aadfc01c") : secret "canary-serving-cert" not found
Apr 16 20:28:10.416960 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.416953 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ffwg9\" (UniqueName: \"kubernetes.io/projected/394dcb43-4d46-4c81-bcad-73d0aadfc01c-kube-api-access-ffwg9\") pod \"ingress-canary-bcshd\" (UID: \"394dcb43-4d46-4c81-bcad-73d0aadfc01c\") " pod="openshift-ingress-canary/ingress-canary-bcshd"
Apr 16 20:28:10.417265 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:10.416967 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:28:10.417265 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.416996 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-52vbf\" (UniqueName: \"kubernetes.io/projected/ec99b398-3371-4b5d-b2f7-ce06fed2c67c-kube-api-access-52vbf\") pod \"dns-default-9gqr8\" (UID: \"ec99b398-3371-4b5d-b2f7-ce06fed2c67c\") " pod="openshift-dns/dns-default-9gqr8"
Apr 16 20:28:10.417265 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:10.417025 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec99b398-3371-4b5d-b2f7-ce06fed2c67c-metrics-tls podName:ec99b398-3371-4b5d-b2f7-ce06fed2c67c nodeName:}" failed. No retries permitted until 2026-04-16 20:28:10.917006138 +0000 UTC m=+34.184748571 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ec99b398-3371-4b5d-b2f7-ce06fed2c67c-metrics-tls") pod "dns-default-9gqr8" (UID: "ec99b398-3371-4b5d-b2f7-ce06fed2c67c") : secret "dns-default-metrics-tls" not found
Apr 16 20:28:10.417265 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.417090 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ec99b398-3371-4b5d-b2f7-ce06fed2c67c-tmp-dir\") pod \"dns-default-9gqr8\" (UID: \"ec99b398-3371-4b5d-b2f7-ce06fed2c67c\") " pod="openshift-dns/dns-default-9gqr8"
Apr 16 20:28:10.417516 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.417437 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ec99b398-3371-4b5d-b2f7-ce06fed2c67c-config-volume\") pod \"dns-default-9gqr8\" (UID: \"ec99b398-3371-4b5d-b2f7-ce06fed2c67c\") " pod="openshift-dns/dns-default-9gqr8"
Apr 16 20:28:10.429311 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.429263 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-52vbf\" (UniqueName: \"kubernetes.io/projected/ec99b398-3371-4b5d-b2f7-ce06fed2c67c-kube-api-access-52vbf\") pod \"dns-default-9gqr8\" (UID: \"ec99b398-3371-4b5d-b2f7-ce06fed2c67c\") " pod="openshift-dns/dns-default-9gqr8"
Apr 16 20:28:10.429425 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.429355 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffwg9\" (UniqueName: \"kubernetes.io/projected/394dcb43-4d46-4c81-bcad-73d0aadfc01c-kube-api-access-ffwg9\") pod \"ingress-canary-bcshd\" (UID: \"394dcb43-4d46-4c81-bcad-73d0aadfc01c\") " pod="openshift-ingress-canary/ingress-canary-bcshd"
Apr 16 20:28:10.919532 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.919489 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ec99b398-3371-4b5d-b2f7-ce06fed2c67c-metrics-tls\") pod \"dns-default-9gqr8\" (UID: \"ec99b398-3371-4b5d-b2f7-ce06fed2c67c\") " pod="openshift-dns/dns-default-9gqr8"
Apr 16 20:28:10.920196 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:10.919549 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/394dcb43-4d46-4c81-bcad-73d0aadfc01c-cert\") pod \"ingress-canary-bcshd\" (UID: \"394dcb43-4d46-4c81-bcad-73d0aadfc01c\") " pod="openshift-ingress-canary/ingress-canary-bcshd"
Apr 16 20:28:10.920196 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:10.919596 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:28:10.920196 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:10.919677 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec99b398-3371-4b5d-b2f7-ce06fed2c67c-metrics-tls podName:ec99b398-3371-4b5d-b2f7-ce06fed2c67c nodeName:}" failed. No retries permitted until 2026-04-16 20:28:11.919658395 +0000 UTC m=+35.187400849 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ec99b398-3371-4b5d-b2f7-ce06fed2c67c-metrics-tls") pod "dns-default-9gqr8" (UID: "ec99b398-3371-4b5d-b2f7-ce06fed2c67c") : secret "dns-default-metrics-tls" not found
Apr 16 20:28:10.920196 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:10.919707 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:28:10.920196 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:10.919784 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/394dcb43-4d46-4c81-bcad-73d0aadfc01c-cert podName:394dcb43-4d46-4c81-bcad-73d0aadfc01c nodeName:}" failed. No retries permitted until 2026-04-16 20:28:11.919766044 +0000 UTC m=+35.187508488 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/394dcb43-4d46-4c81-bcad-73d0aadfc01c-cert") pod "ingress-canary-bcshd" (UID: "394dcb43-4d46-4c81-bcad-73d0aadfc01c") : secret "canary-serving-cert" not found
Apr 16 20:28:11.925732 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:11.925696 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ec99b398-3371-4b5d-b2f7-ce06fed2c67c-metrics-tls\") pod \"dns-default-9gqr8\" (UID: \"ec99b398-3371-4b5d-b2f7-ce06fed2c67c\") " pod="openshift-dns/dns-default-9gqr8"
Apr 16 20:28:11.925732 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:11.925735 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/394dcb43-4d46-4c81-bcad-73d0aadfc01c-cert\") pod \"ingress-canary-bcshd\" (UID: \"394dcb43-4d46-4c81-bcad-73d0aadfc01c\") " pod="openshift-ingress-canary/ingress-canary-bcshd"
Apr 16 20:28:11.926245 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:11.925837 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:28:11.926245 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:11.925857 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:28:11.926245 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:11.925886 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/394dcb43-4d46-4c81-bcad-73d0aadfc01c-cert podName:394dcb43-4d46-4c81-bcad-73d0aadfc01c nodeName:}" failed. No retries permitted until 2026-04-16 20:28:13.925872807 +0000 UTC m=+37.193615238 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/394dcb43-4d46-4c81-bcad-73d0aadfc01c-cert") pod "ingress-canary-bcshd" (UID: "394dcb43-4d46-4c81-bcad-73d0aadfc01c") : secret "canary-serving-cert" not found
Apr 16 20:28:11.926245 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:11.925928 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec99b398-3371-4b5d-b2f7-ce06fed2c67c-metrics-tls podName:ec99b398-3371-4b5d-b2f7-ce06fed2c67c nodeName:}" failed. No retries permitted until 2026-04-16 20:28:13.925910629 +0000 UTC m=+37.193653060 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ec99b398-3371-4b5d-b2f7-ce06fed2c67c-metrics-tls") pod "dns-default-9gqr8" (UID: "ec99b398-3371-4b5d-b2f7-ce06fed2c67c") : secret "dns-default-metrics-tls" not found
Apr 16 20:28:12.501546 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:12.501516 2565 generic.go:358] "Generic (PLEG): container finished" podID="1aac658f-e4b3-4b53-a125-6cae725d6fcd" containerID="3f51c9d74c21a827e4e920205d8ff39a64c5b60c2ec64bac40c248cec9034082" exitCode=0
Apr 16 20:28:12.501712 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:12.501564 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hthg8" event={"ID":"1aac658f-e4b3-4b53-a125-6cae725d6fcd","Type":"ContainerDied","Data":"3f51c9d74c21a827e4e920205d8ff39a64c5b60c2ec64bac40c248cec9034082"}
Apr 16 20:28:13.505453 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:13.505423 2565 generic.go:358] "Generic (PLEG): container finished" podID="1aac658f-e4b3-4b53-a125-6cae725d6fcd" containerID="a5421c2e453a3598971cb9be57d6478b1a5b10bf346fba8203326d6c9eb04a51" exitCode=0
Apr 16 20:28:13.505865 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:13.505469 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hthg8" event={"ID":"1aac658f-e4b3-4b53-a125-6cae725d6fcd","Type":"ContainerDied","Data":"a5421c2e453a3598971cb9be57d6478b1a5b10bf346fba8203326d6c9eb04a51"}
Apr 16 20:28:13.939896 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:13.939738 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ec99b398-3371-4b5d-b2f7-ce06fed2c67c-metrics-tls\") pod \"dns-default-9gqr8\" (UID: \"ec99b398-3371-4b5d-b2f7-ce06fed2c67c\") " pod="openshift-dns/dns-default-9gqr8"
Apr 16 20:28:13.940021 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:13.939906 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/394dcb43-4d46-4c81-bcad-73d0aadfc01c-cert\") pod \"ingress-canary-bcshd\" (UID: \"394dcb43-4d46-4c81-bcad-73d0aadfc01c\") " pod="openshift-ingress-canary/ingress-canary-bcshd"
Apr 16 20:28:13.940021 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:13.939869 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:28:13.940021 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:13.940015 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:28:13.940146 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:13.940037 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec99b398-3371-4b5d-b2f7-ce06fed2c67c-metrics-tls podName:ec99b398-3371-4b5d-b2f7-ce06fed2c67c nodeName:}" failed. No retries permitted until 2026-04-16 20:28:17.940015794 +0000 UTC m=+41.207758238 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ec99b398-3371-4b5d-b2f7-ce06fed2c67c-metrics-tls") pod "dns-default-9gqr8" (UID: "ec99b398-3371-4b5d-b2f7-ce06fed2c67c") : secret "dns-default-metrics-tls" not found
Apr 16 20:28:13.940146 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:13.940058 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/394dcb43-4d46-4c81-bcad-73d0aadfc01c-cert podName:394dcb43-4d46-4c81-bcad-73d0aadfc01c nodeName:}" failed. No retries permitted until 2026-04-16 20:28:17.940048358 +0000 UTC m=+41.207790789 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/394dcb43-4d46-4c81-bcad-73d0aadfc01c-cert") pod "ingress-canary-bcshd" (UID: "394dcb43-4d46-4c81-bcad-73d0aadfc01c") : secret "canary-serving-cert" not found
Apr 16 20:28:14.509758 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:14.509729 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hthg8" event={"ID":"1aac658f-e4b3-4b53-a125-6cae725d6fcd","Type":"ContainerStarted","Data":"30310e1466611f243de0c343726305a13e75f5d8a3f9f5d02edfedb3d546c520"}
Apr 16 20:28:14.531667 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:14.531620 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-hthg8" podStartSLOduration=4.600166878 podStartE2EDuration="37.531585408s" podCreationTimestamp="2026-04-16 20:27:37 +0000 UTC" firstStartedPulling="2026-04-16 20:27:38.487995232 +0000 UTC m=+1.755737663" lastFinishedPulling="2026-04-16 20:28:11.419413748 +0000 UTC m=+34.687156193" observedRunningTime="2026-04-16 20:28:14.530008957 +0000 UTC m=+37.797751410" watchObservedRunningTime="2026-04-16 20:28:14.531585408 +0000 UTC m=+37.799327861"
Apr 16 20:28:17.965314 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:17.965269 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/394dcb43-4d46-4c81-bcad-73d0aadfc01c-cert\") pod \"ingress-canary-bcshd\" (UID: \"394dcb43-4d46-4c81-bcad-73d0aadfc01c\") " pod="openshift-ingress-canary/ingress-canary-bcshd"
Apr 16 20:28:17.965713 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:17.965357 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ec99b398-3371-4b5d-b2f7-ce06fed2c67c-metrics-tls\") pod \"dns-default-9gqr8\" (UID: \"ec99b398-3371-4b5d-b2f7-ce06fed2c67c\") " pod="openshift-dns/dns-default-9gqr8"
Apr 16 20:28:17.965713 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:17.965365 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:28:17.965713 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:17.965416 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/394dcb43-4d46-4c81-bcad-73d0aadfc01c-cert podName:394dcb43-4d46-4c81-bcad-73d0aadfc01c nodeName:}" failed. No retries permitted until 2026-04-16 20:28:25.965402352 +0000 UTC m=+49.233144788 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/394dcb43-4d46-4c81-bcad-73d0aadfc01c-cert") pod "ingress-canary-bcshd" (UID: "394dcb43-4d46-4c81-bcad-73d0aadfc01c") : secret "canary-serving-cert" not found
Apr 16 20:28:17.965713 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:17.965438 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:28:17.965713 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:17.965472 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec99b398-3371-4b5d-b2f7-ce06fed2c67c-metrics-tls podName:ec99b398-3371-4b5d-b2f7-ce06fed2c67c nodeName:}" failed. No retries permitted until 2026-04-16 20:28:25.965461749 +0000 UTC m=+49.233204180 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ec99b398-3371-4b5d-b2f7-ce06fed2c67c-metrics-tls") pod "dns-default-9gqr8" (UID: "ec99b398-3371-4b5d-b2f7-ce06fed2c67c") : secret "dns-default-metrics-tls" not found
Apr 16 20:28:26.021926 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:26.021885 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ec99b398-3371-4b5d-b2f7-ce06fed2c67c-metrics-tls\") pod \"dns-default-9gqr8\" (UID: \"ec99b398-3371-4b5d-b2f7-ce06fed2c67c\") " pod="openshift-dns/dns-default-9gqr8"
Apr 16 20:28:26.021926 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:26.021927 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/394dcb43-4d46-4c81-bcad-73d0aadfc01c-cert\") pod \"ingress-canary-bcshd\" (UID: \"394dcb43-4d46-4c81-bcad-73d0aadfc01c\") " pod="openshift-ingress-canary/ingress-canary-bcshd"
Apr 16 20:28:26.022542 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:26.022031 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:28:26.022542 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:26.022051 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:28:26.022542 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:26.022088 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/394dcb43-4d46-4c81-bcad-73d0aadfc01c-cert podName:394dcb43-4d46-4c81-bcad-73d0aadfc01c nodeName:}" failed. No retries permitted until 2026-04-16 20:28:42.022071534 +0000 UTC m=+65.289813967 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/394dcb43-4d46-4c81-bcad-73d0aadfc01c-cert") pod "ingress-canary-bcshd" (UID: "394dcb43-4d46-4c81-bcad-73d0aadfc01c") : secret "canary-serving-cert" not found
Apr 16 20:28:26.022542 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:26.022117 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec99b398-3371-4b5d-b2f7-ce06fed2c67c-metrics-tls podName:ec99b398-3371-4b5d-b2f7-ce06fed2c67c nodeName:}" failed. No retries permitted until 2026-04-16 20:28:42.022098462 +0000 UTC m=+65.289840899 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ec99b398-3371-4b5d-b2f7-ce06fed2c67c-metrics-tls") pod "dns-default-9gqr8" (UID: "ec99b398-3371-4b5d-b2f7-ce06fed2c67c") : secret "dns-default-metrics-tls" not found
Apr 16 20:28:36.498868 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:36.498840 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-59rwg"
Apr 16 20:28:41.931091 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:41.931057 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90b993f2-207d-4894-bbdf-e2219dbf690b-metrics-certs\") pod \"network-metrics-daemon-jzdqd\" (UID: \"90b993f2-207d-4894-bbdf-e2219dbf690b\") " pod="openshift-multus/network-metrics-daemon-jzdqd"
Apr 16 20:28:41.933956 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:41.933940 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 20:28:41.941912 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:41.941893 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 20:28:41.941998 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:41.941966 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90b993f2-207d-4894-bbdf-e2219dbf690b-metrics-certs podName:90b993f2-207d-4894-bbdf-e2219dbf690b nodeName:}" failed. No retries permitted until 2026-04-16 20:29:45.941946823 +0000 UTC m=+129.209689267 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/90b993f2-207d-4894-bbdf-e2219dbf690b-metrics-certs") pod "network-metrics-daemon-jzdqd" (UID: "90b993f2-207d-4894-bbdf-e2219dbf690b") : secret "metrics-daemon-secret" not found
Apr 16 20:28:42.032333 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:42.032300 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4rn7c\" (UniqueName: \"kubernetes.io/projected/ea0bcc7a-3a29-4d83-8da0-05ec471a2764-kube-api-access-4rn7c\") pod \"network-check-target-k9tth\" (UID: \"ea0bcc7a-3a29-4d83-8da0-05ec471a2764\") " pod="openshift-network-diagnostics/network-check-target-k9tth"
Apr 16 20:28:42.032450 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:42.032345 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ec99b398-3371-4b5d-b2f7-ce06fed2c67c-metrics-tls\") pod \"dns-default-9gqr8\" (UID: \"ec99b398-3371-4b5d-b2f7-ce06fed2c67c\") " pod="openshift-dns/dns-default-9gqr8"
Apr 16 20:28:42.032450 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:42.032365 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/394dcb43-4d46-4c81-bcad-73d0aadfc01c-cert\") pod \"ingress-canary-bcshd\" (UID: \"394dcb43-4d46-4c81-bcad-73d0aadfc01c\") " pod="openshift-ingress-canary/ingress-canary-bcshd"
Apr 16 20:28:42.032516 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:42.032459 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:28:42.032516 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:42.032478 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:28:42.032516 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:42.032505 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/394dcb43-4d46-4c81-bcad-73d0aadfc01c-cert podName:394dcb43-4d46-4c81-bcad-73d0aadfc01c nodeName:}" failed. No retries permitted until 2026-04-16 20:29:14.032488259 +0000 UTC m=+97.300230690 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/394dcb43-4d46-4c81-bcad-73d0aadfc01c-cert") pod "ingress-canary-bcshd" (UID: "394dcb43-4d46-4c81-bcad-73d0aadfc01c") : secret "canary-serving-cert" not found
Apr 16 20:28:42.032615 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:28:42.032519 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec99b398-3371-4b5d-b2f7-ce06fed2c67c-metrics-tls podName:ec99b398-3371-4b5d-b2f7-ce06fed2c67c nodeName:}" failed. No retries permitted until 2026-04-16 20:29:14.032513149 +0000 UTC m=+97.300255581 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ec99b398-3371-4b5d-b2f7-ce06fed2c67c-metrics-tls") pod "dns-default-9gqr8" (UID: "ec99b398-3371-4b5d-b2f7-ce06fed2c67c") : secret "dns-default-metrics-tls" not found
Apr 16 20:28:42.035469 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:42.035455 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 20:28:42.045528 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:42.045510 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 20:28:42.057636 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:42.057613 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rn7c\" (UniqueName: \"kubernetes.io/projected/ea0bcc7a-3a29-4d83-8da0-05ec471a2764-kube-api-access-4rn7c\") pod \"network-check-target-k9tth\" (UID: \"ea0bcc7a-3a29-4d83-8da0-05ec471a2764\") " pod="openshift-network-diagnostics/network-check-target-k9tth"
Apr 16 20:28:42.122231 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:42.122196 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jwldw\""
Apr 16 20:28:42.129733 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:42.129706 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k9tth"
Apr 16 20:28:42.296025 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:42.295998 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-k9tth"]
Apr 16 20:28:42.299649 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:28:42.299624 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea0bcc7a_3a29_4d83_8da0_05ec471a2764.slice/crio-068df7ec921568f3d0654f19b0ee5048a266a70efaeee53e078a620539b57da7 WatchSource:0}: Error finding container 068df7ec921568f3d0654f19b0ee5048a266a70efaeee53e078a620539b57da7: Status 404 returned error can't find the container with id 068df7ec921568f3d0654f19b0ee5048a266a70efaeee53e078a620539b57da7
Apr 16 20:28:42.557074 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:42.556988 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-k9tth" event={"ID":"ea0bcc7a-3a29-4d83-8da0-05ec471a2764","Type":"ContainerStarted","Data":"068df7ec921568f3d0654f19b0ee5048a266a70efaeee53e078a620539b57da7"}
Apr 16 20:28:45.562823 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:45.562751 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-k9tth" event={"ID":"ea0bcc7a-3a29-4d83-8da0-05ec471a2764","Type":"ContainerStarted","Data":"965fd6f400d4ba4176315f549f4ec5397b4cb56a69ac25f92d5050e216863dfe"}
Apr 16 20:28:45.563151 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:45.562877 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-k9tth"
Apr 16 20:28:45.577800 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:28:45.577730 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-k9tth" podStartSLOduration=65.69508346 podStartE2EDuration="1m8.577718355s" podCreationTimestamp="2026-04-16 20:27:37 +0000 UTC" firstStartedPulling="2026-04-16 20:28:42.301430123 +0000 UTC m=+65.569172555" lastFinishedPulling="2026-04-16 20:28:45.184065019 +0000 UTC m=+68.451807450" observedRunningTime="2026-04-16 20:28:45.576653429 +0000 UTC m=+68.844395874" watchObservedRunningTime="2026-04-16 20:28:45.577718355 +0000 UTC m=+68.845460808"
Apr 16 20:29:14.041521 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:14.041490 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ec99b398-3371-4b5d-b2f7-ce06fed2c67c-metrics-tls\") pod \"dns-default-9gqr8\" (UID: \"ec99b398-3371-4b5d-b2f7-ce06fed2c67c\") " pod="openshift-dns/dns-default-9gqr8"
Apr 16 20:29:14.041521 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:14.041527 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/394dcb43-4d46-4c81-bcad-73d0aadfc01c-cert\") pod \"ingress-canary-bcshd\" (UID: \"394dcb43-4d46-4c81-bcad-73d0aadfc01c\") " pod="openshift-ingress-canary/ingress-canary-bcshd"
Apr 16 20:29:14.041944 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:29:14.041632 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:29:14.041944 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:29:14.041635 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:29:14.041944 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:29:14.041686 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/394dcb43-4d46-4c81-bcad-73d0aadfc01c-cert podName:394dcb43-4d46-4c81-bcad-73d0aadfc01c nodeName:}" failed. No retries permitted until 2026-04-16 20:30:18.041671337 +0000 UTC m=+161.309413768 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/394dcb43-4d46-4c81-bcad-73d0aadfc01c-cert") pod "ingress-canary-bcshd" (UID: "394dcb43-4d46-4c81-bcad-73d0aadfc01c") : secret "canary-serving-cert" not found
Apr 16 20:29:14.041944 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:29:14.041699 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec99b398-3371-4b5d-b2f7-ce06fed2c67c-metrics-tls podName:ec99b398-3371-4b5d-b2f7-ce06fed2c67c nodeName:}" failed. No retries permitted until 2026-04-16 20:30:18.041692959 +0000 UTC m=+161.309435389 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ec99b398-3371-4b5d-b2f7-ce06fed2c67c-metrics-tls") pod "dns-default-9gqr8" (UID: "ec99b398-3371-4b5d-b2f7-ce06fed2c67c") : secret "dns-default-metrics-tls" not found
Apr 16 20:29:16.566605 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:16.566576 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-k9tth"
Apr 16 20:29:45.951118 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:45.951077 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90b993f2-207d-4894-bbdf-e2219dbf690b-metrics-certs\") pod \"network-metrics-daemon-jzdqd\" (UID: \"90b993f2-207d-4894-bbdf-e2219dbf690b\") " pod="openshift-multus/network-metrics-daemon-jzdqd"
Apr 16 20:29:45.951612 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:29:45.951191 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 20:29:45.951612 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:29:45.951242 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90b993f2-207d-4894-bbdf-e2219dbf690b-metrics-certs podName:90b993f2-207d-4894-bbdf-e2219dbf690b nodeName:}" failed. No retries permitted until 2026-04-16 20:31:47.95122808 +0000 UTC m=+251.218970511 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/90b993f2-207d-4894-bbdf-e2219dbf690b-metrics-certs") pod "network-metrics-daemon-jzdqd" (UID: "90b993f2-207d-4894-bbdf-e2219dbf690b") : secret "metrics-daemon-secret" not found
Apr 16 20:29:50.837865 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.837829 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-l9zgr"]
Apr 16 20:29:50.840473 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.840456 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-l9zgr"
Apr 16 20:29:50.842975 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.842939 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 20:29:50.843101 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.842990 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-wkb2z\""
Apr 16 20:29:50.844141 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.844122 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 16 20:29:50.844247 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.844155 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 16 20:29:50.844247 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.844158 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 20:29:50.848295 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.848250 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 16 20:29:50.849294 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.849256 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-l9zgr"]
Apr 16 20:29:50.878715 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.878694 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60ac5c46-c967-4000-bd8b-4f0c90324ecb-serving-cert\") pod \"insights-operator-585dfdc468-l9zgr\" (UID: \"60ac5c46-c967-4000-bd8b-4f0c90324ecb\") " pod="openshift-insights/insights-operator-585dfdc468-l9zgr"
Apr 16 20:29:50.878828 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.878723 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60ac5c46-c967-4000-bd8b-4f0c90324ecb-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-l9zgr\" (UID: \"60ac5c46-c967-4000-bd8b-4f0c90324ecb\") " pod="openshift-insights/insights-operator-585dfdc468-l9zgr"
Apr 16 20:29:50.878828 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.878744 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60ac5c46-c967-4000-bd8b-4f0c90324ecb-service-ca-bundle\") pod \"insights-operator-585dfdc468-l9zgr\" (UID: \"60ac5c46-c967-4000-bd8b-4f0c90324ecb\") " pod="openshift-insights/insights-operator-585dfdc468-l9zgr"
Apr 16 20:29:50.878828 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.878785 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started
for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/60ac5c46-c967-4000-bd8b-4f0c90324ecb-tmp\") pod \"insights-operator-585dfdc468-l9zgr\" (UID: \"60ac5c46-c967-4000-bd8b-4f0c90324ecb\") " pod="openshift-insights/insights-operator-585dfdc468-l9zgr" Apr 16 20:29:50.878828 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.878807 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgb6l\" (UniqueName: \"kubernetes.io/projected/60ac5c46-c967-4000-bd8b-4f0c90324ecb-kube-api-access-sgb6l\") pod \"insights-operator-585dfdc468-l9zgr\" (UID: \"60ac5c46-c967-4000-bd8b-4f0c90324ecb\") " pod="openshift-insights/insights-operator-585dfdc468-l9zgr" Apr 16 20:29:50.878971 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.878835 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/60ac5c46-c967-4000-bd8b-4f0c90324ecb-snapshots\") pod \"insights-operator-585dfdc468-l9zgr\" (UID: \"60ac5c46-c967-4000-bd8b-4f0c90324ecb\") " pod="openshift-insights/insights-operator-585dfdc468-l9zgr" Apr 16 20:29:50.938990 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.938964 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5dss8"] Apr 16 20:29:50.941466 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.941444 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5dss8" Apr 16 20:29:50.942979 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.942960 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-shwkq"] Apr 16 20:29:50.944142 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.944125 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 20:29:50.944377 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.944361 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-qjgvz\"" Apr 16 20:29:50.944440 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.944365 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:29:50.945417 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.945401 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-shwkq" Apr 16 20:29:50.947958 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.947784 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 16 20:29:50.947958 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.947811 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 16 20:29:50.947958 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.947906 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:29:50.948157 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.948020 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-7wrz4\"" Apr 16 20:29:50.948157 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.948041 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 16 20:29:50.952202 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.952187 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 20:29:50.956796 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.956776 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5dss8"] Apr 16 20:29:50.971742 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.971720 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-shwkq"] Apr 16 20:29:50.979263 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.979243 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a53ef938-6712-4133-9657-41ecb93318cf-config\") pod \"console-operator-9d4b6777b-shwkq\" (UID: \"a53ef938-6712-4133-9657-41ecb93318cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-shwkq" Apr 16 20:29:50.979374 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.979306 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60ac5c46-c967-4000-bd8b-4f0c90324ecb-serving-cert\") pod \"insights-operator-585dfdc468-l9zgr\" (UID: \"60ac5c46-c967-4000-bd8b-4f0c90324ecb\") " pod="openshift-insights/insights-operator-585dfdc468-l9zgr" Apr 16 20:29:50.979374 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.979330 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f89jf\" (UniqueName: \"kubernetes.io/projected/505458c7-696b-4e52-94fd-8c6620a8cf96-kube-api-access-f89jf\") pod \"volume-data-source-validator-7c6cbb6c87-5dss8\" (UID: \"505458c7-696b-4e52-94fd-8c6620a8cf96\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5dss8" Apr 16 20:29:50.979374 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.979352 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60ac5c46-c967-4000-bd8b-4f0c90324ecb-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-l9zgr\" (UID: \"60ac5c46-c967-4000-bd8b-4f0c90324ecb\") " pod="openshift-insights/insights-operator-585dfdc468-l9zgr" Apr 16 20:29:50.979374 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.979367 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60ac5c46-c967-4000-bd8b-4f0c90324ecb-service-ca-bundle\") pod \"insights-operator-585dfdc468-l9zgr\" (UID: 
\"60ac5c46-c967-4000-bd8b-4f0c90324ecb\") " pod="openshift-insights/insights-operator-585dfdc468-l9zgr" Apr 16 20:29:50.979532 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.979383 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/60ac5c46-c967-4000-bd8b-4f0c90324ecb-tmp\") pod \"insights-operator-585dfdc468-l9zgr\" (UID: \"60ac5c46-c967-4000-bd8b-4f0c90324ecb\") " pod="openshift-insights/insights-operator-585dfdc468-l9zgr" Apr 16 20:29:50.979532 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.979400 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgb6l\" (UniqueName: \"kubernetes.io/projected/60ac5c46-c967-4000-bd8b-4f0c90324ecb-kube-api-access-sgb6l\") pod \"insights-operator-585dfdc468-l9zgr\" (UID: \"60ac5c46-c967-4000-bd8b-4f0c90324ecb\") " pod="openshift-insights/insights-operator-585dfdc468-l9zgr" Apr 16 20:29:50.979532 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.979443 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a53ef938-6712-4133-9657-41ecb93318cf-serving-cert\") pod \"console-operator-9d4b6777b-shwkq\" (UID: \"a53ef938-6712-4133-9657-41ecb93318cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-shwkq" Apr 16 20:29:50.979532 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.979472 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/60ac5c46-c967-4000-bd8b-4f0c90324ecb-snapshots\") pod \"insights-operator-585dfdc468-l9zgr\" (UID: \"60ac5c46-c967-4000-bd8b-4f0c90324ecb\") " pod="openshift-insights/insights-operator-585dfdc468-l9zgr" Apr 16 20:29:50.979532 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.979495 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-t586d\" (UniqueName: \"kubernetes.io/projected/a53ef938-6712-4133-9657-41ecb93318cf-kube-api-access-t586d\") pod \"console-operator-9d4b6777b-shwkq\" (UID: \"a53ef938-6712-4133-9657-41ecb93318cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-shwkq" Apr 16 20:29:50.979532 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.979529 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a53ef938-6712-4133-9657-41ecb93318cf-trusted-ca\") pod \"console-operator-9d4b6777b-shwkq\" (UID: \"a53ef938-6712-4133-9657-41ecb93318cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-shwkq" Apr 16 20:29:50.979815 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.979782 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/60ac5c46-c967-4000-bd8b-4f0c90324ecb-tmp\") pod \"insights-operator-585dfdc468-l9zgr\" (UID: \"60ac5c46-c967-4000-bd8b-4f0c90324ecb\") " pod="openshift-insights/insights-operator-585dfdc468-l9zgr" Apr 16 20:29:50.979914 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.979897 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60ac5c46-c967-4000-bd8b-4f0c90324ecb-service-ca-bundle\") pod \"insights-operator-585dfdc468-l9zgr\" (UID: \"60ac5c46-c967-4000-bd8b-4f0c90324ecb\") " pod="openshift-insights/insights-operator-585dfdc468-l9zgr" Apr 16 20:29:50.980044 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.980028 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/60ac5c46-c967-4000-bd8b-4f0c90324ecb-snapshots\") pod \"insights-operator-585dfdc468-l9zgr\" (UID: \"60ac5c46-c967-4000-bd8b-4f0c90324ecb\") " pod="openshift-insights/insights-operator-585dfdc468-l9zgr" Apr 16 20:29:50.980250 
ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.980232 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60ac5c46-c967-4000-bd8b-4f0c90324ecb-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-l9zgr\" (UID: \"60ac5c46-c967-4000-bd8b-4f0c90324ecb\") " pod="openshift-insights/insights-operator-585dfdc468-l9zgr" Apr 16 20:29:50.981989 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.981967 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60ac5c46-c967-4000-bd8b-4f0c90324ecb-serving-cert\") pod \"insights-operator-585dfdc468-l9zgr\" (UID: \"60ac5c46-c967-4000-bd8b-4f0c90324ecb\") " pod="openshift-insights/insights-operator-585dfdc468-l9zgr" Apr 16 20:29:50.988792 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:50.988775 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgb6l\" (UniqueName: \"kubernetes.io/projected/60ac5c46-c967-4000-bd8b-4f0c90324ecb-kube-api-access-sgb6l\") pod \"insights-operator-585dfdc468-l9zgr\" (UID: \"60ac5c46-c967-4000-bd8b-4f0c90324ecb\") " pod="openshift-insights/insights-operator-585dfdc468-l9zgr" Apr 16 20:29:51.041958 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.041931 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-96894b6c-psxd6"] Apr 16 20:29:51.046451 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.046433 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-96894b6c-psxd6" Apr 16 20:29:51.048835 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.048814 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 20:29:51.049045 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.049027 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 20:29:51.049121 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.049042 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-tj6tv\"" Apr 16 20:29:51.049121 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.049024 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 20:29:51.055175 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.055143 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-96894b6c-psxd6"] Apr 16 20:29:51.057619 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.057597 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 20:29:51.080682 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.080658 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a53ef938-6712-4133-9657-41ecb93318cf-trusted-ca\") pod \"console-operator-9d4b6777b-shwkq\" (UID: \"a53ef938-6712-4133-9657-41ecb93318cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-shwkq" Apr 16 20:29:51.080767 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.080689 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fb910f93-043f-467a-88bc-ff78901b3eb4-image-registry-private-configuration\") pod \"image-registry-96894b6c-psxd6\" (UID: \"fb910f93-043f-467a-88bc-ff78901b3eb4\") " pod="openshift-image-registry/image-registry-96894b6c-psxd6" Apr 16 20:29:51.080767 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.080725 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a53ef938-6712-4133-9657-41ecb93318cf-config\") pod \"console-operator-9d4b6777b-shwkq\" (UID: \"a53ef938-6712-4133-9657-41ecb93318cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-shwkq" Apr 16 20:29:51.080847 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.080770 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhnqk\" (UniqueName: \"kubernetes.io/projected/fb910f93-043f-467a-88bc-ff78901b3eb4-kube-api-access-fhnqk\") pod \"image-registry-96894b6c-psxd6\" (UID: \"fb910f93-043f-467a-88bc-ff78901b3eb4\") " pod="openshift-image-registry/image-registry-96894b6c-psxd6" Apr 16 20:29:51.080847 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.080800 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fb910f93-043f-467a-88bc-ff78901b3eb4-ca-trust-extracted\") pod \"image-registry-96894b6c-psxd6\" (UID: \"fb910f93-043f-467a-88bc-ff78901b3eb4\") " pod="openshift-image-registry/image-registry-96894b6c-psxd6" Apr 16 20:29:51.080927 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.080854 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f89jf\" (UniqueName: \"kubernetes.io/projected/505458c7-696b-4e52-94fd-8c6620a8cf96-kube-api-access-f89jf\") pod \"volume-data-source-validator-7c6cbb6c87-5dss8\" (UID: 
\"505458c7-696b-4e52-94fd-8c6620a8cf96\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5dss8" Apr 16 20:29:51.080927 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.080889 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fb910f93-043f-467a-88bc-ff78901b3eb4-registry-certificates\") pod \"image-registry-96894b6c-psxd6\" (UID: \"fb910f93-043f-467a-88bc-ff78901b3eb4\") " pod="openshift-image-registry/image-registry-96894b6c-psxd6" Apr 16 20:29:51.081078 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.080928 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fb910f93-043f-467a-88bc-ff78901b3eb4-registry-tls\") pod \"image-registry-96894b6c-psxd6\" (UID: \"fb910f93-043f-467a-88bc-ff78901b3eb4\") " pod="openshift-image-registry/image-registry-96894b6c-psxd6" Apr 16 20:29:51.081078 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.080964 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a53ef938-6712-4133-9657-41ecb93318cf-serving-cert\") pod \"console-operator-9d4b6777b-shwkq\" (UID: \"a53ef938-6712-4133-9657-41ecb93318cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-shwkq" Apr 16 20:29:51.081078 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.081008 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t586d\" (UniqueName: \"kubernetes.io/projected/a53ef938-6712-4133-9657-41ecb93318cf-kube-api-access-t586d\") pod \"console-operator-9d4b6777b-shwkq\" (UID: \"a53ef938-6712-4133-9657-41ecb93318cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-shwkq" Apr 16 20:29:51.081078 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.081034 2565 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fb910f93-043f-467a-88bc-ff78901b3eb4-trusted-ca\") pod \"image-registry-96894b6c-psxd6\" (UID: \"fb910f93-043f-467a-88bc-ff78901b3eb4\") " pod="openshift-image-registry/image-registry-96894b6c-psxd6" Apr 16 20:29:51.081078 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.081060 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fb910f93-043f-467a-88bc-ff78901b3eb4-installation-pull-secrets\") pod \"image-registry-96894b6c-psxd6\" (UID: \"fb910f93-043f-467a-88bc-ff78901b3eb4\") " pod="openshift-image-registry/image-registry-96894b6c-psxd6" Apr 16 20:29:51.081316 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.081092 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fb910f93-043f-467a-88bc-ff78901b3eb4-bound-sa-token\") pod \"image-registry-96894b6c-psxd6\" (UID: \"fb910f93-043f-467a-88bc-ff78901b3eb4\") " pod="openshift-image-registry/image-registry-96894b6c-psxd6" Apr 16 20:29:51.081316 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.081305 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a53ef938-6712-4133-9657-41ecb93318cf-config\") pod \"console-operator-9d4b6777b-shwkq\" (UID: \"a53ef938-6712-4133-9657-41ecb93318cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-shwkq" Apr 16 20:29:51.081567 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.081550 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a53ef938-6712-4133-9657-41ecb93318cf-trusted-ca\") pod \"console-operator-9d4b6777b-shwkq\" (UID: 
\"a53ef938-6712-4133-9657-41ecb93318cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-shwkq" Apr 16 20:29:51.083102 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.083086 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a53ef938-6712-4133-9657-41ecb93318cf-serving-cert\") pod \"console-operator-9d4b6777b-shwkq\" (UID: \"a53ef938-6712-4133-9657-41ecb93318cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-shwkq" Apr 16 20:29:51.088339 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.088268 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f89jf\" (UniqueName: \"kubernetes.io/projected/505458c7-696b-4e52-94fd-8c6620a8cf96-kube-api-access-f89jf\") pod \"volume-data-source-validator-7c6cbb6c87-5dss8\" (UID: \"505458c7-696b-4e52-94fd-8c6620a8cf96\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5dss8" Apr 16 20:29:51.088923 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.088906 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t586d\" (UniqueName: \"kubernetes.io/projected/a53ef938-6712-4133-9657-41ecb93318cf-kube-api-access-t586d\") pod \"console-operator-9d4b6777b-shwkq\" (UID: \"a53ef938-6712-4133-9657-41ecb93318cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-shwkq" Apr 16 20:29:51.149753 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.149714 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-l9zgr" Apr 16 20:29:51.181680 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.181644 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fb910f93-043f-467a-88bc-ff78901b3eb4-ca-trust-extracted\") pod \"image-registry-96894b6c-psxd6\" (UID: \"fb910f93-043f-467a-88bc-ff78901b3eb4\") " pod="openshift-image-registry/image-registry-96894b6c-psxd6" Apr 16 20:29:51.181813 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.181703 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fb910f93-043f-467a-88bc-ff78901b3eb4-registry-certificates\") pod \"image-registry-96894b6c-psxd6\" (UID: \"fb910f93-043f-467a-88bc-ff78901b3eb4\") " pod="openshift-image-registry/image-registry-96894b6c-psxd6" Apr 16 20:29:51.181813 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.181739 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fb910f93-043f-467a-88bc-ff78901b3eb4-registry-tls\") pod \"image-registry-96894b6c-psxd6\" (UID: \"fb910f93-043f-467a-88bc-ff78901b3eb4\") " pod="openshift-image-registry/image-registry-96894b6c-psxd6" Apr 16 20:29:51.181813 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.181780 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fb910f93-043f-467a-88bc-ff78901b3eb4-trusted-ca\") pod \"image-registry-96894b6c-psxd6\" (UID: \"fb910f93-043f-467a-88bc-ff78901b3eb4\") " pod="openshift-image-registry/image-registry-96894b6c-psxd6" Apr 16 20:29:51.181972 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:29:51.181866 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not 
found Apr 16 20:29:51.181972 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:29:51.181886 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-96894b6c-psxd6: secret "image-registry-tls" not found Apr 16 20:29:51.181972 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.181914 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fb910f93-043f-467a-88bc-ff78901b3eb4-installation-pull-secrets\") pod \"image-registry-96894b6c-psxd6\" (UID: \"fb910f93-043f-467a-88bc-ff78901b3eb4\") " pod="openshift-image-registry/image-registry-96894b6c-psxd6" Apr 16 20:29:51.181972 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:29:51.181937 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fb910f93-043f-467a-88bc-ff78901b3eb4-registry-tls podName:fb910f93-043f-467a-88bc-ff78901b3eb4 nodeName:}" failed. No retries permitted until 2026-04-16 20:29:51.681922065 +0000 UTC m=+134.949664495 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fb910f93-043f-467a-88bc-ff78901b3eb4-registry-tls") pod "image-registry-96894b6c-psxd6" (UID: "fb910f93-043f-467a-88bc-ff78901b3eb4") : secret "image-registry-tls" not found
Apr 16 20:29:51.182175 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.181975 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fb910f93-043f-467a-88bc-ff78901b3eb4-bound-sa-token\") pod \"image-registry-96894b6c-psxd6\" (UID: \"fb910f93-043f-467a-88bc-ff78901b3eb4\") " pod="openshift-image-registry/image-registry-96894b6c-psxd6"
Apr 16 20:29:51.182175 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.182026 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fb910f93-043f-467a-88bc-ff78901b3eb4-image-registry-private-configuration\") pod \"image-registry-96894b6c-psxd6\" (UID: \"fb910f93-043f-467a-88bc-ff78901b3eb4\") " pod="openshift-image-registry/image-registry-96894b6c-psxd6"
Apr 16 20:29:51.182175 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.182083 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fhnqk\" (UniqueName: \"kubernetes.io/projected/fb910f93-043f-467a-88bc-ff78901b3eb4-kube-api-access-fhnqk\") pod \"image-registry-96894b6c-psxd6\" (UID: \"fb910f93-043f-467a-88bc-ff78901b3eb4\") " pod="openshift-image-registry/image-registry-96894b6c-psxd6"
Apr 16 20:29:51.182365 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.182029 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fb910f93-043f-467a-88bc-ff78901b3eb4-ca-trust-extracted\") pod \"image-registry-96894b6c-psxd6\" (UID: \"fb910f93-043f-467a-88bc-ff78901b3eb4\") " pod="openshift-image-registry/image-registry-96894b6c-psxd6"
Apr 16 20:29:51.182419 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.182390 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fb910f93-043f-467a-88bc-ff78901b3eb4-registry-certificates\") pod \"image-registry-96894b6c-psxd6\" (UID: \"fb910f93-043f-467a-88bc-ff78901b3eb4\") " pod="openshift-image-registry/image-registry-96894b6c-psxd6"
Apr 16 20:29:51.184300 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.183476 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fb910f93-043f-467a-88bc-ff78901b3eb4-trusted-ca\") pod \"image-registry-96894b6c-psxd6\" (UID: \"fb910f93-043f-467a-88bc-ff78901b3eb4\") " pod="openshift-image-registry/image-registry-96894b6c-psxd6"
Apr 16 20:29:51.184678 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.184661 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fb910f93-043f-467a-88bc-ff78901b3eb4-installation-pull-secrets\") pod \"image-registry-96894b6c-psxd6\" (UID: \"fb910f93-043f-467a-88bc-ff78901b3eb4\") " pod="openshift-image-registry/image-registry-96894b6c-psxd6"
Apr 16 20:29:51.184750 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.184700 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fb910f93-043f-467a-88bc-ff78901b3eb4-image-registry-private-configuration\") pod \"image-registry-96894b6c-psxd6\" (UID: \"fb910f93-043f-467a-88bc-ff78901b3eb4\") " pod="openshift-image-registry/image-registry-96894b6c-psxd6"
Apr 16 20:29:51.189943 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.189923 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fb910f93-043f-467a-88bc-ff78901b3eb4-bound-sa-token\") pod \"image-registry-96894b6c-psxd6\" (UID: \"fb910f93-043f-467a-88bc-ff78901b3eb4\") " pod="openshift-image-registry/image-registry-96894b6c-psxd6"
Apr 16 20:29:51.190248 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.190232 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhnqk\" (UniqueName: \"kubernetes.io/projected/fb910f93-043f-467a-88bc-ff78901b3eb4-kube-api-access-fhnqk\") pod \"image-registry-96894b6c-psxd6\" (UID: \"fb910f93-043f-467a-88bc-ff78901b3eb4\") " pod="openshift-image-registry/image-registry-96894b6c-psxd6"
Apr 16 20:29:51.251642 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.251614 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5dss8"
Apr 16 20:29:51.256344 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.256325 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-shwkq"
Apr 16 20:29:51.258991 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.258970 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-l9zgr"]
Apr 16 20:29:51.262577 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:29:51.262552 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60ac5c46_c967_4000_bd8b_4f0c90324ecb.slice/crio-7175bbb4edb976ae7b78dcf6f2ecb4ea2d9c1e66724300e8d68d61dc6e8a72da WatchSource:0}: Error finding container 7175bbb4edb976ae7b78dcf6f2ecb4ea2d9c1e66724300e8d68d61dc6e8a72da: Status 404 returned error can't find the container with id 7175bbb4edb976ae7b78dcf6f2ecb4ea2d9c1e66724300e8d68d61dc6e8a72da
Apr 16 20:29:51.369835 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.369808 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5dss8"]
Apr 16 20:29:51.372725 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:29:51.372697 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod505458c7_696b_4e52_94fd_8c6620a8cf96.slice/crio-c2a08ed52050a075c7d730b17b1bd143a909ad83b8bfae07fb992d18e2505d64 WatchSource:0}: Error finding container c2a08ed52050a075c7d730b17b1bd143a909ad83b8bfae07fb992d18e2505d64: Status 404 returned error can't find the container with id c2a08ed52050a075c7d730b17b1bd143a909ad83b8bfae07fb992d18e2505d64
Apr 16 20:29:51.382537 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.382481 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-shwkq"]
Apr 16 20:29:51.386152 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:29:51.386129 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda53ef938_6712_4133_9657_41ecb93318cf.slice/crio-03cc6caf5385e6f6af843923c28d6948fb6ef1f1dc3e593e0117b86ab139b428 WatchSource:0}: Error finding container 03cc6caf5385e6f6af843923c28d6948fb6ef1f1dc3e593e0117b86ab139b428: Status 404 returned error can't find the container with id 03cc6caf5385e6f6af843923c28d6948fb6ef1f1dc3e593e0117b86ab139b428
Apr 16 20:29:51.685834 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.685805 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fb910f93-043f-467a-88bc-ff78901b3eb4-registry-tls\") pod \"image-registry-96894b6c-psxd6\" (UID: \"fb910f93-043f-467a-88bc-ff78901b3eb4\") " pod="openshift-image-registry/image-registry-96894b6c-psxd6"
Apr 16 20:29:51.685985 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:29:51.685913 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 20:29:51.685985 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:29:51.685924 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-96894b6c-psxd6: secret "image-registry-tls" not found
Apr 16 20:29:51.685985 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:29:51.685964 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fb910f93-043f-467a-88bc-ff78901b3eb4-registry-tls podName:fb910f93-043f-467a-88bc-ff78901b3eb4 nodeName:}" failed. No retries permitted until 2026-04-16 20:29:52.685952039 +0000 UTC m=+135.953694469 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fb910f93-043f-467a-88bc-ff78901b3eb4-registry-tls") pod "image-registry-96894b6c-psxd6" (UID: "fb910f93-043f-467a-88bc-ff78901b3eb4") : secret "image-registry-tls" not found
Apr 16 20:29:51.686158 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.685980 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-l9zgr" event={"ID":"60ac5c46-c967-4000-bd8b-4f0c90324ecb","Type":"ContainerStarted","Data":"7175bbb4edb976ae7b78dcf6f2ecb4ea2d9c1e66724300e8d68d61dc6e8a72da"}
Apr 16 20:29:51.686913 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.686893 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-shwkq" event={"ID":"a53ef938-6712-4133-9657-41ecb93318cf","Type":"ContainerStarted","Data":"03cc6caf5385e6f6af843923c28d6948fb6ef1f1dc3e593e0117b86ab139b428"}
Apr 16 20:29:51.687756 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:51.687733 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5dss8" event={"ID":"505458c7-696b-4e52-94fd-8c6620a8cf96","Type":"ContainerStarted","Data":"c2a08ed52050a075c7d730b17b1bd143a909ad83b8bfae07fb992d18e2505d64"}
Apr 16 20:29:52.692571 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:52.692533 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fb910f93-043f-467a-88bc-ff78901b3eb4-registry-tls\") pod \"image-registry-96894b6c-psxd6\" (UID: \"fb910f93-043f-467a-88bc-ff78901b3eb4\") " pod="openshift-image-registry/image-registry-96894b6c-psxd6"
Apr 16 20:29:52.693059 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:29:52.692697 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 20:29:52.693059 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:29:52.692719 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-96894b6c-psxd6: secret "image-registry-tls" not found
Apr 16 20:29:52.693059 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:29:52.692785 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fb910f93-043f-467a-88bc-ff78901b3eb4-registry-tls podName:fb910f93-043f-467a-88bc-ff78901b3eb4 nodeName:}" failed. No retries permitted until 2026-04-16 20:29:54.692764469 +0000 UTC m=+137.960506906 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fb910f93-043f-467a-88bc-ff78901b3eb4-registry-tls") pod "image-registry-96894b6c-psxd6" (UID: "fb910f93-043f-467a-88bc-ff78901b3eb4") : secret "image-registry-tls" not found
Apr 16 20:29:53.694305 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:53.694256 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5dss8" event={"ID":"505458c7-696b-4e52-94fd-8c6620a8cf96","Type":"ContainerStarted","Data":"4f43cbba2e0a10c6a23251dca5e9385fce18e8994f8f2f60620965750c251938"}
Apr 16 20:29:53.709437 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:53.709374 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5dss8" podStartSLOduration=2.210130298 podStartE2EDuration="3.709360273s" podCreationTimestamp="2026-04-16 20:29:50 +0000 UTC" firstStartedPulling="2026-04-16 20:29:51.374381779 +0000 UTC m=+134.642124213" lastFinishedPulling="2026-04-16 20:29:52.873611751 +0000 UTC m=+136.141354188" observedRunningTime="2026-04-16 20:29:53.708948203 +0000 UTC m=+136.976690657" watchObservedRunningTime="2026-04-16 20:29:53.709360273 +0000 UTC m=+136.977102747"
Apr 16 20:29:54.697742 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:54.697699 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-l9zgr" event={"ID":"60ac5c46-c967-4000-bd8b-4f0c90324ecb","Type":"ContainerStarted","Data":"e20b52ae41c0d3cf835a6b7d875fd8f6bc190244ff13e879d44d708696eef505"}
Apr 16 20:29:54.699084 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:54.699060 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-shwkq_a53ef938-6712-4133-9657-41ecb93318cf/console-operator/0.log"
Apr 16 20:29:54.699214 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:54.699096 2565 generic.go:358] "Generic (PLEG): container finished" podID="a53ef938-6712-4133-9657-41ecb93318cf" containerID="fd375ab7db748294de805986a3f33f36666210e176b675a0cdb347242bb3a703" exitCode=255
Apr 16 20:29:54.699214 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:54.699194 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-shwkq" event={"ID":"a53ef938-6712-4133-9657-41ecb93318cf","Type":"ContainerDied","Data":"fd375ab7db748294de805986a3f33f36666210e176b675a0cdb347242bb3a703"}
Apr 16 20:29:54.699418 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:54.699401 2565 scope.go:117] "RemoveContainer" containerID="fd375ab7db748294de805986a3f33f36666210e176b675a0cdb347242bb3a703"
Apr 16 20:29:54.706690 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:54.706671 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fb910f93-043f-467a-88bc-ff78901b3eb4-registry-tls\") pod \"image-registry-96894b6c-psxd6\" (UID: \"fb910f93-043f-467a-88bc-ff78901b3eb4\") " pod="openshift-image-registry/image-registry-96894b6c-psxd6"
Apr 16 20:29:54.706793 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:29:54.706772 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 20:29:54.706793 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:29:54.706783 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-96894b6c-psxd6: secret "image-registry-tls" not found
Apr 16 20:29:54.706897 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:29:54.706822 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fb910f93-043f-467a-88bc-ff78901b3eb4-registry-tls podName:fb910f93-043f-467a-88bc-ff78901b3eb4 nodeName:}" failed. No retries permitted until 2026-04-16 20:29:58.706810542 +0000 UTC m=+141.974552972 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fb910f93-043f-467a-88bc-ff78901b3eb4-registry-tls") pod "image-registry-96894b6c-psxd6" (UID: "fb910f93-043f-467a-88bc-ff78901b3eb4") : secret "image-registry-tls" not found
Apr 16 20:29:54.713115 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:54.713071 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-l9zgr" podStartSLOduration=1.899522042 podStartE2EDuration="4.7130573s" podCreationTimestamp="2026-04-16 20:29:50 +0000 UTC" firstStartedPulling="2026-04-16 20:29:51.264377009 +0000 UTC m=+134.532119444" lastFinishedPulling="2026-04-16 20:29:54.077912267 +0000 UTC m=+137.345654702" observedRunningTime="2026-04-16 20:29:54.711972855 +0000 UTC m=+137.979715321" watchObservedRunningTime="2026-04-16 20:29:54.7130573 +0000 UTC m=+137.980799754"
Apr 16 20:29:55.655655 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:55.655624 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-d6hrz"]
Apr 16 20:29:55.658570 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:55.658550 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-d6hrz"
Apr 16 20:29:55.661434 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:55.661415 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 16 20:29:55.661518 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:55.661415 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 16 20:29:55.661518 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:55.661452 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-nhlhv\""
Apr 16 20:29:55.667145 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:55.667122 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-d6hrz"]
Apr 16 20:29:55.702011 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:55.701991 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-shwkq_a53ef938-6712-4133-9657-41ecb93318cf/console-operator/1.log"
Apr 16 20:29:55.702366 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:55.702352 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-shwkq_a53ef938-6712-4133-9657-41ecb93318cf/console-operator/0.log"
Apr 16 20:29:55.702417 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:55.702384 2565 generic.go:358] "Generic (PLEG): container finished" podID="a53ef938-6712-4133-9657-41ecb93318cf" containerID="1b8a364847d6d4151b3d99262ebb7bfc4d74edf037d55ea6f17aff130425ef84" exitCode=255
Apr 16 20:29:55.702497 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:55.702478 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-shwkq" event={"ID":"a53ef938-6712-4133-9657-41ecb93318cf","Type":"ContainerDied","Data":"1b8a364847d6d4151b3d99262ebb7bfc4d74edf037d55ea6f17aff130425ef84"}
Apr 16 20:29:55.702541 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:55.702519 2565 scope.go:117] "RemoveContainer" containerID="fd375ab7db748294de805986a3f33f36666210e176b675a0cdb347242bb3a703"
Apr 16 20:29:55.702755 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:55.702737 2565 scope.go:117] "RemoveContainer" containerID="1b8a364847d6d4151b3d99262ebb7bfc4d74edf037d55ea6f17aff130425ef84"
Apr 16 20:29:55.702938 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:29:55.702920 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-shwkq_openshift-console-operator(a53ef938-6712-4133-9657-41ecb93318cf)\"" pod="openshift-console-operator/console-operator-9d4b6777b-shwkq" podUID="a53ef938-6712-4133-9657-41ecb93318cf"
Apr 16 20:29:55.715293 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:55.715264 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7lwh\" (UniqueName: \"kubernetes.io/projected/99557dd7-ec96-4c42-9baf-cbe25a9d29da-kube-api-access-d7lwh\") pod \"migrator-74bb7799d9-d6hrz\" (UID: \"99557dd7-ec96-4c42-9baf-cbe25a9d29da\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-d6hrz"
Apr 16 20:29:55.816686 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:55.816636 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d7lwh\" (UniqueName: \"kubernetes.io/projected/99557dd7-ec96-4c42-9baf-cbe25a9d29da-kube-api-access-d7lwh\") pod \"migrator-74bb7799d9-d6hrz\" (UID: \"99557dd7-ec96-4c42-9baf-cbe25a9d29da\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-d6hrz"
Apr 16 20:29:55.824483 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:55.824453 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7lwh\" (UniqueName: \"kubernetes.io/projected/99557dd7-ec96-4c42-9baf-cbe25a9d29da-kube-api-access-d7lwh\") pod \"migrator-74bb7799d9-d6hrz\" (UID: \"99557dd7-ec96-4c42-9baf-cbe25a9d29da\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-d6hrz"
Apr 16 20:29:55.967439 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:55.967409 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-d6hrz"
Apr 16 20:29:56.077749 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:56.077721 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-d6hrz"]
Apr 16 20:29:56.082361 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:29:56.082333 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99557dd7_ec96_4c42_9baf_cbe25a9d29da.slice/crio-9504b648f8b9eed16ef27467dfc66e0b3a22b3068b3bcdef63f21247e2cdffec WatchSource:0}: Error finding container 9504b648f8b9eed16ef27467dfc66e0b3a22b3068b3bcdef63f21247e2cdffec: Status 404 returned error can't find the container with id 9504b648f8b9eed16ef27467dfc66e0b3a22b3068b3bcdef63f21247e2cdffec
Apr 16 20:29:56.661667 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:56.661635 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7qfs5_22753460-b103-4969-8aca-1ea39040795b/dns-node-resolver/0.log"
Apr 16 20:29:56.705842 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:56.705799 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-d6hrz" event={"ID":"99557dd7-ec96-4c42-9baf-cbe25a9d29da","Type":"ContainerStarted","Data":"9504b648f8b9eed16ef27467dfc66e0b3a22b3068b3bcdef63f21247e2cdffec"}
Apr 16 20:29:56.707311 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:56.707267 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-shwkq_a53ef938-6712-4133-9657-41ecb93318cf/console-operator/1.log"
Apr 16 20:29:56.707667 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:56.707646 2565 scope.go:117] "RemoveContainer" containerID="1b8a364847d6d4151b3d99262ebb7bfc4d74edf037d55ea6f17aff130425ef84"
Apr 16 20:29:56.707840 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:29:56.707822 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-shwkq_openshift-console-operator(a53ef938-6712-4133-9657-41ecb93318cf)\"" pod="openshift-console-operator/console-operator-9d4b6777b-shwkq" podUID="a53ef938-6712-4133-9657-41ecb93318cf"
Apr 16 20:29:57.710514 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:57.710442 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-d6hrz" event={"ID":"99557dd7-ec96-4c42-9baf-cbe25a9d29da","Type":"ContainerStarted","Data":"3e01ddb750a55066fbad870ad4e2569a044539fa3ddc7856630b80e7a9502255"}
Apr 16 20:29:57.710514 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:57.710480 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-d6hrz" event={"ID":"99557dd7-ec96-4c42-9baf-cbe25a9d29da","Type":"ContainerStarted","Data":"0797bf85ca75c1189e2a7bdffabff81e4988f689658e4213eae5464221f79a97"}
Apr 16 20:29:57.724608 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:57.724547 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-d6hrz" podStartSLOduration=1.3963851649999999 podStartE2EDuration="2.724531865s" podCreationTimestamp="2026-04-16 20:29:55 +0000 UTC" firstStartedPulling="2026-04-16 20:29:56.084526678 +0000 UTC m=+139.352269110" lastFinishedPulling="2026-04-16 20:29:57.412673372 +0000 UTC m=+140.680415810" observedRunningTime="2026-04-16 20:29:57.724398182 +0000 UTC m=+140.992140637" watchObservedRunningTime="2026-04-16 20:29:57.724531865 +0000 UTC m=+140.992274365"
Apr 16 20:29:58.061949 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:58.061882 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-n2tj2_72204d28-677e-4d89-a353-b087ce28c38f/node-ca/0.log"
Apr 16 20:29:58.104496 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:58.104468 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-9vngr"]
Apr 16 20:29:58.107290 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:58.107258 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-9vngr"
Apr 16 20:29:58.109628 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:58.109608 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 16 20:29:58.109729 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:58.109655 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 16 20:29:58.109844 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:58.109830 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 16 20:29:58.109901 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:58.109859 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-lxqfl\""
Apr 16 20:29:58.110553 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:58.110540 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 16 20:29:58.114400 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:58.114382 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-9vngr"]
Apr 16 20:29:58.236483 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:58.236450 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8b42e11f-8bfe-41db-b13d-87555a1d9063-signing-key\") pod \"service-ca-865cb79987-9vngr\" (UID: \"8b42e11f-8bfe-41db-b13d-87555a1d9063\") " pod="openshift-service-ca/service-ca-865cb79987-9vngr"
Apr 16 20:29:58.236630 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:58.236491 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8b42e11f-8bfe-41db-b13d-87555a1d9063-signing-cabundle\") pod \"service-ca-865cb79987-9vngr\" (UID: \"8b42e11f-8bfe-41db-b13d-87555a1d9063\") " pod="openshift-service-ca/service-ca-865cb79987-9vngr"
Apr 16 20:29:58.236630 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:58.236546 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgq2j\" (UniqueName: \"kubernetes.io/projected/8b42e11f-8bfe-41db-b13d-87555a1d9063-kube-api-access-qgq2j\") pod \"service-ca-865cb79987-9vngr\" (UID: \"8b42e11f-8bfe-41db-b13d-87555a1d9063\") " pod="openshift-service-ca/service-ca-865cb79987-9vngr"
Apr 16 20:29:58.337415 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:58.337337 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qgq2j\" (UniqueName: \"kubernetes.io/projected/8b42e11f-8bfe-41db-b13d-87555a1d9063-kube-api-access-qgq2j\") pod \"service-ca-865cb79987-9vngr\" (UID: \"8b42e11f-8bfe-41db-b13d-87555a1d9063\") " pod="openshift-service-ca/service-ca-865cb79987-9vngr"
Apr 16 20:29:58.337558 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:58.337428 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8b42e11f-8bfe-41db-b13d-87555a1d9063-signing-key\") pod \"service-ca-865cb79987-9vngr\" (UID: \"8b42e11f-8bfe-41db-b13d-87555a1d9063\") " pod="openshift-service-ca/service-ca-865cb79987-9vngr"
Apr 16 20:29:58.337558 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:58.337465 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8b42e11f-8bfe-41db-b13d-87555a1d9063-signing-cabundle\") pod \"service-ca-865cb79987-9vngr\" (UID: \"8b42e11f-8bfe-41db-b13d-87555a1d9063\") " pod="openshift-service-ca/service-ca-865cb79987-9vngr"
Apr 16 20:29:58.338083 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:58.338059 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8b42e11f-8bfe-41db-b13d-87555a1d9063-signing-cabundle\") pod \"service-ca-865cb79987-9vngr\" (UID: \"8b42e11f-8bfe-41db-b13d-87555a1d9063\") " pod="openshift-service-ca/service-ca-865cb79987-9vngr"
Apr 16 20:29:58.339831 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:58.339809 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8b42e11f-8bfe-41db-b13d-87555a1d9063-signing-key\") pod \"service-ca-865cb79987-9vngr\" (UID: \"8b42e11f-8bfe-41db-b13d-87555a1d9063\") " pod="openshift-service-ca/service-ca-865cb79987-9vngr"
Apr 16 20:29:58.345021 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:58.345000 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgq2j\" (UniqueName: \"kubernetes.io/projected/8b42e11f-8bfe-41db-b13d-87555a1d9063-kube-api-access-qgq2j\") pod \"service-ca-865cb79987-9vngr\" (UID: \"8b42e11f-8bfe-41db-b13d-87555a1d9063\") " pod="openshift-service-ca/service-ca-865cb79987-9vngr"
Apr 16 20:29:58.416675 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:58.416650 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-9vngr"
Apr 16 20:29:58.526086 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:58.526057 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-9vngr"]
Apr 16 20:29:58.529267 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:29:58.529231 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b42e11f_8bfe_41db_b13d_87555a1d9063.slice/crio-a246560568a169fb35993c551538bd4ceec1093a5655ebaaa535e8f60fee0a60 WatchSource:0}: Error finding container a246560568a169fb35993c551538bd4ceec1093a5655ebaaa535e8f60fee0a60: Status 404 returned error can't find the container with id a246560568a169fb35993c551538bd4ceec1093a5655ebaaa535e8f60fee0a60
Apr 16 20:29:58.716349 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:58.716315 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-9vngr" event={"ID":"8b42e11f-8bfe-41db-b13d-87555a1d9063","Type":"ContainerStarted","Data":"a246560568a169fb35993c551538bd4ceec1093a5655ebaaa535e8f60fee0a60"}
Apr 16 20:29:58.739843 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:29:58.739821 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fb910f93-043f-467a-88bc-ff78901b3eb4-registry-tls\") pod \"image-registry-96894b6c-psxd6\" (UID: \"fb910f93-043f-467a-88bc-ff78901b3eb4\") " pod="openshift-image-registry/image-registry-96894b6c-psxd6"
Apr 16 20:29:58.739927 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:29:58.739914 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 20:29:58.739927 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:29:58.739925 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-96894b6c-psxd6: secret "image-registry-tls" not found
Apr 16 20:29:58.740000 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:29:58.739976 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fb910f93-043f-467a-88bc-ff78901b3eb4-registry-tls podName:fb910f93-043f-467a-88bc-ff78901b3eb4 nodeName:}" failed. No retries permitted until 2026-04-16 20:30:06.739963672 +0000 UTC m=+150.007706103 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fb910f93-043f-467a-88bc-ff78901b3eb4-registry-tls") pod "image-registry-96894b6c-psxd6" (UID: "fb910f93-043f-467a-88bc-ff78901b3eb4") : secret "image-registry-tls" not found
Apr 16 20:30:00.722672 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:00.722633 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-9vngr" event={"ID":"8b42e11f-8bfe-41db-b13d-87555a1d9063","Type":"ContainerStarted","Data":"197aec5d698b008edd005daf44339c862b69399cbdc5bc14c11146e0a8066d7e"}
Apr 16 20:30:00.736927 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:00.736879 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-9vngr" podStartSLOduration=1.04900359 podStartE2EDuration="2.736866158s" podCreationTimestamp="2026-04-16 20:29:58 +0000 UTC" firstStartedPulling="2026-04-16 20:29:58.531456468 +0000 UTC m=+141.799198899" lastFinishedPulling="2026-04-16 20:30:00.219319033 +0000 UTC m=+143.487061467" observedRunningTime="2026-04-16 20:30:00.736442449 +0000 UTC m=+144.004184901" watchObservedRunningTime="2026-04-16 20:30:00.736866158 +0000 UTC m=+144.004608608"
Apr 16 20:30:01.257306 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:01.257262 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-shwkq"
Apr 16 20:30:01.257452 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:01.257314 2565 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-shwkq"
Apr 16 20:30:01.257663 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:01.257649 2565 scope.go:117] "RemoveContainer" containerID="1b8a364847d6d4151b3d99262ebb7bfc4d74edf037d55ea6f17aff130425ef84"
Apr 16 20:30:01.257814 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:30:01.257798 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-shwkq_openshift-console-operator(a53ef938-6712-4133-9657-41ecb93318cf)\"" pod="openshift-console-operator/console-operator-9d4b6777b-shwkq" podUID="a53ef938-6712-4133-9657-41ecb93318cf"
Apr 16 20:30:06.809547 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:06.809506 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fb910f93-043f-467a-88bc-ff78901b3eb4-registry-tls\") pod \"image-registry-96894b6c-psxd6\" (UID: \"fb910f93-043f-467a-88bc-ff78901b3eb4\") " pod="openshift-image-registry/image-registry-96894b6c-psxd6"
Apr 16 20:30:06.811922 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:06.811902 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fb910f93-043f-467a-88bc-ff78901b3eb4-registry-tls\") pod \"image-registry-96894b6c-psxd6\" (UID: \"fb910f93-043f-467a-88bc-ff78901b3eb4\") " pod="openshift-image-registry/image-registry-96894b6c-psxd6"
Apr 16 20:30:06.959867 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:06.959824 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-96894b6c-psxd6"
Apr 16 20:30:07.075991 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:07.075908 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-96894b6c-psxd6"]
Apr 16 20:30:07.078755 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:30:07.078720 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb910f93_043f_467a_88bc_ff78901b3eb4.slice/crio-f0668f8f1be1c40aeb3e94b0ac6568e5c9ae281b200259220d53bba78e3615e7 WatchSource:0}: Error finding container f0668f8f1be1c40aeb3e94b0ac6568e5c9ae281b200259220d53bba78e3615e7: Status 404 returned error can't find the container with id f0668f8f1be1c40aeb3e94b0ac6568e5c9ae281b200259220d53bba78e3615e7
Apr 16 20:30:07.740943 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:07.740909 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-96894b6c-psxd6" event={"ID":"fb910f93-043f-467a-88bc-ff78901b3eb4","Type":"ContainerStarted","Data":"2224d9a0dbd96d97ca6f2f82631638186367512d5e987b90fe2d5f7b90b69410"}
Apr 16 20:30:07.740943 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:07.740943 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-96894b6c-psxd6" event={"ID":"fb910f93-043f-467a-88bc-ff78901b3eb4","Type":"ContainerStarted","Data":"f0668f8f1be1c40aeb3e94b0ac6568e5c9ae281b200259220d53bba78e3615e7"}
Apr 16 20:30:07.741139 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:07.741040 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-96894b6c-psxd6"
Apr 16 20:30:07.759688 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:07.759645 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-96894b6c-psxd6" podStartSLOduration=16.759633562 podStartE2EDuration="16.759633562s" podCreationTimestamp="2026-04-16 20:29:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:30:07.758950893 +0000 UTC m=+151.026693346" watchObservedRunningTime="2026-04-16 20:30:07.759633562 +0000 UTC m=+151.027376015"
Apr 16 20:30:12.308096 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:12.308066 2565 scope.go:117] "RemoveContainer" containerID="1b8a364847d6d4151b3d99262ebb7bfc4d74edf037d55ea6f17aff130425ef84"
Apr 16 20:30:12.755980 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:12.755947 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-shwkq_a53ef938-6712-4133-9657-41ecb93318cf/console-operator/2.log"
Apr 16 20:30:12.756380 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:12.756362 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-shwkq_a53ef938-6712-4133-9657-41ecb93318cf/console-operator/1.log"
Apr 16 20:30:12.756494 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:12.756398 2565 generic.go:358] "Generic (PLEG): container finished" podID="a53ef938-6712-4133-9657-41ecb93318cf" containerID="3632baddf4da6fa4e1511220a73e8f61dea41225dad077d666b1db3488d6dd66" exitCode=255
Apr 16 20:30:12.756494 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:12.756440 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-shwkq" event={"ID":"a53ef938-6712-4133-9657-41ecb93318cf","Type":"ContainerDied","Data":"3632baddf4da6fa4e1511220a73e8f61dea41225dad077d666b1db3488d6dd66"}
Apr 16 20:30:12.756494 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:12.756474 2565 scope.go:117] "RemoveContainer" containerID="1b8a364847d6d4151b3d99262ebb7bfc4d74edf037d55ea6f17aff130425ef84"
Apr 16 20:30:12.756842 ip-10-0-139-150
kubenswrapper[2565]: I0416 20:30:12.756824 2565 scope.go:117] "RemoveContainer" containerID="3632baddf4da6fa4e1511220a73e8f61dea41225dad077d666b1db3488d6dd66" Apr 16 20:30:12.757062 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:30:12.757035 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-shwkq_openshift-console-operator(a53ef938-6712-4133-9657-41ecb93318cf)\"" pod="openshift-console-operator/console-operator-9d4b6777b-shwkq" podUID="a53ef938-6712-4133-9657-41ecb93318cf" Apr 16 20:30:13.192026 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:30:13.191991 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-9gqr8" podUID="ec99b398-3371-4b5d-b2f7-ce06fed2c67c" Apr 16 20:30:13.218162 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:30:13.218134 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-bcshd" podUID="394dcb43-4d46-4c81-bcad-73d0aadfc01c" Apr 16 20:30:13.324893 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:30:13.324847 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-jzdqd" podUID="90b993f2-207d-4894-bbdf-e2219dbf690b" Apr 16 20:30:13.759714 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:13.759691 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-shwkq_a53ef938-6712-4133-9657-41ecb93318cf/console-operator/2.log" Apr 16 20:30:13.759878 
ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:13.759793 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9gqr8" Apr 16 20:30:18.089523 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:18.089491 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ec99b398-3371-4b5d-b2f7-ce06fed2c67c-metrics-tls\") pod \"dns-default-9gqr8\" (UID: \"ec99b398-3371-4b5d-b2f7-ce06fed2c67c\") " pod="openshift-dns/dns-default-9gqr8" Apr 16 20:30:18.089523 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:18.089528 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/394dcb43-4d46-4c81-bcad-73d0aadfc01c-cert\") pod \"ingress-canary-bcshd\" (UID: \"394dcb43-4d46-4c81-bcad-73d0aadfc01c\") " pod="openshift-ingress-canary/ingress-canary-bcshd" Apr 16 20:30:18.091881 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:18.091853 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ec99b398-3371-4b5d-b2f7-ce06fed2c67c-metrics-tls\") pod \"dns-default-9gqr8\" (UID: \"ec99b398-3371-4b5d-b2f7-ce06fed2c67c\") " pod="openshift-dns/dns-default-9gqr8" Apr 16 20:30:18.091949 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:18.091893 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/394dcb43-4d46-4c81-bcad-73d0aadfc01c-cert\") pod \"ingress-canary-bcshd\" (UID: \"394dcb43-4d46-4c81-bcad-73d0aadfc01c\") " pod="openshift-ingress-canary/ingress-canary-bcshd" Apr 16 20:30:18.262826 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:18.262801 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-w999j\"" Apr 16 20:30:18.267053 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:18.267030 2565 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-2fvld"] Apr 16 20:30:18.271126 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:18.271110 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9gqr8" Apr 16 20:30:18.302320 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:18.302291 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2fvld"] Apr 16 20:30:18.302450 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:18.302334 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-96894b6c-psxd6"] Apr 16 20:30:18.302450 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:18.302374 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2fvld" Apr 16 20:30:18.305076 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:18.305010 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 20:30:18.305209 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:18.305137 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-78bvs\"" Apr 16 20:30:18.305209 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:18.305148 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 20:30:18.391081 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:18.391052 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/94d85daa-e508-46da-a759-2a5e804e6a61-data-volume\") pod \"insights-runtime-extractor-2fvld\" (UID: \"94d85daa-e508-46da-a759-2a5e804e6a61\") " pod="openshift-insights/insights-runtime-extractor-2fvld" Apr 
16 20:30:18.391225 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:18.391105 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/94d85daa-e508-46da-a759-2a5e804e6a61-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2fvld\" (UID: \"94d85daa-e508-46da-a759-2a5e804e6a61\") " pod="openshift-insights/insights-runtime-extractor-2fvld" Apr 16 20:30:18.391300 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:18.391254 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/94d85daa-e508-46da-a759-2a5e804e6a61-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2fvld\" (UID: \"94d85daa-e508-46da-a759-2a5e804e6a61\") " pod="openshift-insights/insights-runtime-extractor-2fvld" Apr 16 20:30:18.391352 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:18.391320 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/94d85daa-e508-46da-a759-2a5e804e6a61-crio-socket\") pod \"insights-runtime-extractor-2fvld\" (UID: \"94d85daa-e508-46da-a759-2a5e804e6a61\") " pod="openshift-insights/insights-runtime-extractor-2fvld" Apr 16 20:30:18.391399 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:18.391363 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7tp9\" (UniqueName: \"kubernetes.io/projected/94d85daa-e508-46da-a759-2a5e804e6a61-kube-api-access-m7tp9\") pod \"insights-runtime-extractor-2fvld\" (UID: \"94d85daa-e508-46da-a759-2a5e804e6a61\") " pod="openshift-insights/insights-runtime-extractor-2fvld" Apr 16 20:30:18.392661 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:18.392641 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9gqr8"] Apr 16 
20:30:18.395724 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:30:18.395691 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec99b398_3371_4b5d_b2f7_ce06fed2c67c.slice/crio-4530acf7e577e36103aacd5adebb3edd988728fdd65ded5e811c59c15c47b981 WatchSource:0}: Error finding container 4530acf7e577e36103aacd5adebb3edd988728fdd65ded5e811c59c15c47b981: Status 404 returned error can't find the container with id 4530acf7e577e36103aacd5adebb3edd988728fdd65ded5e811c59c15c47b981 Apr 16 20:30:18.492376 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:18.492355 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/94d85daa-e508-46da-a759-2a5e804e6a61-data-volume\") pod \"insights-runtime-extractor-2fvld\" (UID: \"94d85daa-e508-46da-a759-2a5e804e6a61\") " pod="openshift-insights/insights-runtime-extractor-2fvld" Apr 16 20:30:18.492484 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:18.492387 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/94d85daa-e508-46da-a759-2a5e804e6a61-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2fvld\" (UID: \"94d85daa-e508-46da-a759-2a5e804e6a61\") " pod="openshift-insights/insights-runtime-extractor-2fvld" Apr 16 20:30:18.492484 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:18.492424 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/94d85daa-e508-46da-a759-2a5e804e6a61-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2fvld\" (UID: \"94d85daa-e508-46da-a759-2a5e804e6a61\") " pod="openshift-insights/insights-runtime-extractor-2fvld" Apr 16 20:30:18.492484 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:18.492445 2565 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/94d85daa-e508-46da-a759-2a5e804e6a61-crio-socket\") pod \"insights-runtime-extractor-2fvld\" (UID: \"94d85daa-e508-46da-a759-2a5e804e6a61\") " pod="openshift-insights/insights-runtime-extractor-2fvld" Apr 16 20:30:18.492598 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:18.492489 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m7tp9\" (UniqueName: \"kubernetes.io/projected/94d85daa-e508-46da-a759-2a5e804e6a61-kube-api-access-m7tp9\") pod \"insights-runtime-extractor-2fvld\" (UID: \"94d85daa-e508-46da-a759-2a5e804e6a61\") " pod="openshift-insights/insights-runtime-extractor-2fvld" Apr 16 20:30:18.492598 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:18.492553 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/94d85daa-e508-46da-a759-2a5e804e6a61-crio-socket\") pod \"insights-runtime-extractor-2fvld\" (UID: \"94d85daa-e508-46da-a759-2a5e804e6a61\") " pod="openshift-insights/insights-runtime-extractor-2fvld" Apr 16 20:30:18.492747 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:18.492727 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/94d85daa-e508-46da-a759-2a5e804e6a61-data-volume\") pod \"insights-runtime-extractor-2fvld\" (UID: \"94d85daa-e508-46da-a759-2a5e804e6a61\") " pod="openshift-insights/insights-runtime-extractor-2fvld" Apr 16 20:30:18.492923 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:18.492899 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/94d85daa-e508-46da-a759-2a5e804e6a61-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2fvld\" (UID: \"94d85daa-e508-46da-a759-2a5e804e6a61\") " pod="openshift-insights/insights-runtime-extractor-2fvld" Apr 16 
20:30:18.494690 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:18.494674 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/94d85daa-e508-46da-a759-2a5e804e6a61-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2fvld\" (UID: \"94d85daa-e508-46da-a759-2a5e804e6a61\") " pod="openshift-insights/insights-runtime-extractor-2fvld" Apr 16 20:30:18.502417 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:18.502401 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7tp9\" (UniqueName: \"kubernetes.io/projected/94d85daa-e508-46da-a759-2a5e804e6a61-kube-api-access-m7tp9\") pod \"insights-runtime-extractor-2fvld\" (UID: \"94d85daa-e508-46da-a759-2a5e804e6a61\") " pod="openshift-insights/insights-runtime-extractor-2fvld" Apr 16 20:30:18.612751 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:18.612704 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2fvld" Apr 16 20:30:18.722782 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:18.722753 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2fvld"] Apr 16 20:30:18.726167 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:30:18.726143 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94d85daa_e508_46da_a759_2a5e804e6a61.slice/crio-b893a6e34da516f708635920dd5bf4585465ebb3289a03269693d5c05813384d WatchSource:0}: Error finding container b893a6e34da516f708635920dd5bf4585465ebb3289a03269693d5c05813384d: Status 404 returned error can't find the container with id b893a6e34da516f708635920dd5bf4585465ebb3289a03269693d5c05813384d Apr 16 20:30:18.770943 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:18.770920 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2fvld" event={"ID":"94d85daa-e508-46da-a759-2a5e804e6a61","Type":"ContainerStarted","Data":"b893a6e34da516f708635920dd5bf4585465ebb3289a03269693d5c05813384d"} Apr 16 20:30:18.771878 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:18.771854 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9gqr8" event={"ID":"ec99b398-3371-4b5d-b2f7-ce06fed2c67c","Type":"ContainerStarted","Data":"4530acf7e577e36103aacd5adebb3edd988728fdd65ded5e811c59c15c47b981"} Apr 16 20:30:19.775458 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:19.775417 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2fvld" event={"ID":"94d85daa-e508-46da-a759-2a5e804e6a61","Type":"ContainerStarted","Data":"ffaa1507914a1882c9a20766fd016afeb6c014673cbe8be481cea7ec44923904"} Apr 16 20:30:20.779797 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:20.779757 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-2fvld" event={"ID":"94d85daa-e508-46da-a759-2a5e804e6a61","Type":"ContainerStarted","Data":"3fdca1b62bd4fea7b88c7e9a3cadec798a3735bf24ed3bd859452ad0465c0d03"} Apr 16 20:30:20.781322 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:20.781291 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9gqr8" event={"ID":"ec99b398-3371-4b5d-b2f7-ce06fed2c67c","Type":"ContainerStarted","Data":"9a86d2fde5d33574bef278ea49c1797ffbfa258652544f02bf2946a6d5da75fa"} Apr 16 20:30:20.781455 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:20.781328 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9gqr8" event={"ID":"ec99b398-3371-4b5d-b2f7-ce06fed2c67c","Type":"ContainerStarted","Data":"d79c5c67e18fc238b23093e494d0b125a8ad53a1f1dd281c28e5b752fff2bc20"} Apr 16 20:30:20.781512 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:20.781462 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-9gqr8" Apr 16 20:30:20.798051 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:20.797989 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-9gqr8" podStartSLOduration=129.311844256 podStartE2EDuration="2m10.797972392s" podCreationTimestamp="2026-04-16 20:28:10 +0000 UTC" firstStartedPulling="2026-04-16 20:30:18.397436803 +0000 UTC m=+161.665179234" lastFinishedPulling="2026-04-16 20:30:19.883564939 +0000 UTC m=+163.151307370" observedRunningTime="2026-04-16 20:30:20.797467553 +0000 UTC m=+164.065210004" watchObservedRunningTime="2026-04-16 20:30:20.797972392 +0000 UTC m=+164.065714847" Apr 16 20:30:21.257989 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:21.256628 2565 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-shwkq" Apr 16 20:30:21.257989 ip-10-0-139-150 kubenswrapper[2565]: I0416 
20:30:21.257056 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-shwkq" Apr 16 20:30:21.257989 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:21.257433 2565 scope.go:117] "RemoveContainer" containerID="3632baddf4da6fa4e1511220a73e8f61dea41225dad077d666b1db3488d6dd66" Apr 16 20:30:21.257989 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:30:21.257644 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-shwkq_openshift-console-operator(a53ef938-6712-4133-9657-41ecb93318cf)\"" pod="openshift-console-operator/console-operator-9d4b6777b-shwkq" podUID="a53ef938-6712-4133-9657-41ecb93318cf" Apr 16 20:30:21.785799 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:21.785772 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2fvld" event={"ID":"94d85daa-e508-46da-a759-2a5e804e6a61","Type":"ContainerStarted","Data":"16ecaa6087440995b4b8d0d8ee1a8a45346eb54e8bd51c6c0022f3742b2c0f5b"} Apr 16 20:30:21.786145 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:21.786024 2565 scope.go:117] "RemoveContainer" containerID="3632baddf4da6fa4e1511220a73e8f61dea41225dad077d666b1db3488d6dd66" Apr 16 20:30:21.786195 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:30:21.786167 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-shwkq_openshift-console-operator(a53ef938-6712-4133-9657-41ecb93318cf)\"" pod="openshift-console-operator/console-operator-9d4b6777b-shwkq" podUID="a53ef938-6712-4133-9657-41ecb93318cf" Apr 16 20:30:21.804603 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:21.804563 2565 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-2fvld" podStartSLOduration=1.333433711 podStartE2EDuration="3.804551356s" podCreationTimestamp="2026-04-16 20:30:18 +0000 UTC" firstStartedPulling="2026-04-16 20:30:18.860747141 +0000 UTC m=+162.128489585" lastFinishedPulling="2026-04-16 20:30:21.331864799 +0000 UTC m=+164.599607230" observedRunningTime="2026-04-16 20:30:21.80370059 +0000 UTC m=+165.071443044" watchObservedRunningTime="2026-04-16 20:30:21.804551356 +0000 UTC m=+165.072293803" Apr 16 20:30:24.308323 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:24.308291 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bcshd" Apr 16 20:30:24.311094 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:24.311077 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-2cgzs\"" Apr 16 20:30:24.319130 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:24.319113 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bcshd" Apr 16 20:30:24.428191 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:24.428159 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bcshd"] Apr 16 20:30:24.430539 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:30:24.430512 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod394dcb43_4d46_4c81_bcad_73d0aadfc01c.slice/crio-ae6b88e865919c9ac244037b2684f219d6625b1bab3450704567830532a33f38 WatchSource:0}: Error finding container ae6b88e865919c9ac244037b2684f219d6625b1bab3450704567830532a33f38: Status 404 returned error can't find the container with id ae6b88e865919c9ac244037b2684f219d6625b1bab3450704567830532a33f38 Apr 16 20:30:24.794168 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:24.794131 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bcshd" event={"ID":"394dcb43-4d46-4c81-bcad-73d0aadfc01c","Type":"ContainerStarted","Data":"ae6b88e865919c9ac244037b2684f219d6625b1bab3450704567830532a33f38"} Apr 16 20:30:26.307678 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:26.307641 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jzdqd" Apr 16 20:30:26.800955 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:26.800919 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bcshd" event={"ID":"394dcb43-4d46-4c81-bcad-73d0aadfc01c","Type":"ContainerStarted","Data":"17dbf3f98d690a1476ab803f1011256ca392358983a974bb6430f5a2ee08775e"} Apr 16 20:30:26.816039 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:26.815982 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-bcshd" podStartSLOduration=135.089593142 podStartE2EDuration="2m16.815956309s" podCreationTimestamp="2026-04-16 20:28:10 +0000 UTC" firstStartedPulling="2026-04-16 20:30:24.436363754 +0000 UTC m=+167.704106187" lastFinishedPulling="2026-04-16 20:30:26.162726923 +0000 UTC m=+169.430469354" observedRunningTime="2026-04-16 20:30:26.815366321 +0000 UTC m=+170.083108776" watchObservedRunningTime="2026-04-16 20:30:26.815956309 +0000 UTC m=+170.083698763" Apr 16 20:30:28.307156 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:28.307129 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-96894b6c-psxd6" Apr 16 20:30:30.787783 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:30.787749 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-9gqr8" Apr 16 20:30:33.079635 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.079602 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-vwwqh"] Apr 16 20:30:33.083180 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.083162 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-vwwqh" Apr 16 20:30:33.085661 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.085631 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 20:30:33.085807 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.085674 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 20:30:33.085807 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.085676 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 20:30:33.086042 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.085850 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 20:30:33.086861 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.086838 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-8h2bq\"" Apr 16 20:30:33.086984 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.086850 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 20:30:33.087247 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.086860 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 20:30:33.093427 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.093407 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-rffls"] Apr 16 20:30:33.096675 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.096658 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-rffls"
Apr 16 20:30:33.099829 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.099811 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 16 20:30:33.099939 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.099919 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-49d8q\""
Apr 16 20:30:33.100037 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.099987 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 16 20:30:33.100089 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.100067 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 16 20:30:33.112545 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.112525 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-rffls"]
Apr 16 20:30:33.201572 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.201538 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/390e92b4-466b-4c16-8ae4-95b92628be89-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-rffls\" (UID: \"390e92b4-466b-4c16-8ae4-95b92628be89\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rffls"
Apr 16 20:30:33.201709 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.201588 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c0695fb2-0f19-4495-91ba-ec07f29dbbc0-metrics-client-ca\") pod \"node-exporter-vwwqh\" (UID: \"c0695fb2-0f19-4495-91ba-ec07f29dbbc0\") " pod="openshift-monitoring/node-exporter-vwwqh"
Apr 16 20:30:33.201709 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.201621 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c0695fb2-0f19-4495-91ba-ec07f29dbbc0-node-exporter-textfile\") pod \"node-exporter-vwwqh\" (UID: \"c0695fb2-0f19-4495-91ba-ec07f29dbbc0\") " pod="openshift-monitoring/node-exporter-vwwqh"
Apr 16 20:30:33.201709 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.201654 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/390e92b4-466b-4c16-8ae4-95b92628be89-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-rffls\" (UID: \"390e92b4-466b-4c16-8ae4-95b92628be89\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rffls"
Apr 16 20:30:33.201709 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.201694 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c0695fb2-0f19-4495-91ba-ec07f29dbbc0-node-exporter-accelerators-collector-config\") pod \"node-exporter-vwwqh\" (UID: \"c0695fb2-0f19-4495-91ba-ec07f29dbbc0\") " pod="openshift-monitoring/node-exporter-vwwqh"
Apr 16 20:30:33.201922 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.201746 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c0695fb2-0f19-4495-91ba-ec07f29dbbc0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vwwqh\" (UID: \"c0695fb2-0f19-4495-91ba-ec07f29dbbc0\") " pod="openshift-monitoring/node-exporter-vwwqh"
Apr 16 20:30:33.201922 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.201781 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c0695fb2-0f19-4495-91ba-ec07f29dbbc0-node-exporter-wtmp\") pod \"node-exporter-vwwqh\" (UID: \"c0695fb2-0f19-4495-91ba-ec07f29dbbc0\") " pod="openshift-monitoring/node-exporter-vwwqh"
Apr 16 20:30:33.201922 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.201809 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/390e92b4-466b-4c16-8ae4-95b92628be89-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-rffls\" (UID: \"390e92b4-466b-4c16-8ae4-95b92628be89\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rffls"
Apr 16 20:30:33.201922 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.201875 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c0695fb2-0f19-4495-91ba-ec07f29dbbc0-node-exporter-tls\") pod \"node-exporter-vwwqh\" (UID: \"c0695fb2-0f19-4495-91ba-ec07f29dbbc0\") " pod="openshift-monitoring/node-exporter-vwwqh"
Apr 16 20:30:33.202060 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.201928 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/390e92b4-466b-4c16-8ae4-95b92628be89-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-rffls\" (UID: \"390e92b4-466b-4c16-8ae4-95b92628be89\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rffls"
Apr 16 20:30:33.202060 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.201978 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c0695fb2-0f19-4495-91ba-ec07f29dbbc0-sys\") pod \"node-exporter-vwwqh\" (UID: \"c0695fb2-0f19-4495-91ba-ec07f29dbbc0\") " pod="openshift-monitoring/node-exporter-vwwqh"
Apr 16 20:30:33.202060 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.202002 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c0695fb2-0f19-4495-91ba-ec07f29dbbc0-root\") pod \"node-exporter-vwwqh\" (UID: \"c0695fb2-0f19-4495-91ba-ec07f29dbbc0\") " pod="openshift-monitoring/node-exporter-vwwqh"
Apr 16 20:30:33.202060 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.202027 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lgbv\" (UniqueName: \"kubernetes.io/projected/390e92b4-466b-4c16-8ae4-95b92628be89-kube-api-access-5lgbv\") pod \"kube-state-metrics-69db897b98-rffls\" (UID: \"390e92b4-466b-4c16-8ae4-95b92628be89\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rffls"
Apr 16 20:30:33.202211 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.202058 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhlzr\" (UniqueName: \"kubernetes.io/projected/c0695fb2-0f19-4495-91ba-ec07f29dbbc0-kube-api-access-xhlzr\") pod \"node-exporter-vwwqh\" (UID: \"c0695fb2-0f19-4495-91ba-ec07f29dbbc0\") " pod="openshift-monitoring/node-exporter-vwwqh"
Apr 16 20:30:33.202211 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.202129 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/390e92b4-466b-4c16-8ae4-95b92628be89-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-rffls\" (UID: \"390e92b4-466b-4c16-8ae4-95b92628be89\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rffls"
Apr 16 20:30:33.302958 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.302907 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xhlzr\" (UniqueName: \"kubernetes.io/projected/c0695fb2-0f19-4495-91ba-ec07f29dbbc0-kube-api-access-xhlzr\") pod \"node-exporter-vwwqh\" (UID: \"c0695fb2-0f19-4495-91ba-ec07f29dbbc0\") " pod="openshift-monitoring/node-exporter-vwwqh"
Apr 16 20:30:33.303153 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.302973 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/390e92b4-466b-4c16-8ae4-95b92628be89-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-rffls\" (UID: \"390e92b4-466b-4c16-8ae4-95b92628be89\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rffls"
Apr 16 20:30:33.303153 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.303010 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/390e92b4-466b-4c16-8ae4-95b92628be89-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-rffls\" (UID: \"390e92b4-466b-4c16-8ae4-95b92628be89\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rffls"
Apr 16 20:30:33.303153 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.303045 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c0695fb2-0f19-4495-91ba-ec07f29dbbc0-metrics-client-ca\") pod \"node-exporter-vwwqh\" (UID: \"c0695fb2-0f19-4495-91ba-ec07f29dbbc0\") " pod="openshift-monitoring/node-exporter-vwwqh"
Apr 16 20:30:33.303427 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:30:33.303404 2565 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Apr 16 20:30:33.303477 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.303458 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/390e92b4-466b-4c16-8ae4-95b92628be89-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-rffls\" (UID: \"390e92b4-466b-4c16-8ae4-95b92628be89\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rffls"
Apr 16 20:30:33.303529 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:30:33.303491 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/390e92b4-466b-4c16-8ae4-95b92628be89-kube-state-metrics-tls podName:390e92b4-466b-4c16-8ae4-95b92628be89 nodeName:}" failed. No retries permitted until 2026-04-16 20:30:33.803471761 +0000 UTC m=+177.071214200 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/390e92b4-466b-4c16-8ae4-95b92628be89-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-rffls" (UID: "390e92b4-466b-4c16-8ae4-95b92628be89") : secret "kube-state-metrics-tls" not found
Apr 16 20:30:33.303529 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.303515 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c0695fb2-0f19-4495-91ba-ec07f29dbbc0-node-exporter-textfile\") pod \"node-exporter-vwwqh\" (UID: \"c0695fb2-0f19-4495-91ba-ec07f29dbbc0\") " pod="openshift-monitoring/node-exporter-vwwqh"
Apr 16 20:30:33.303635 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.303551 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/390e92b4-466b-4c16-8ae4-95b92628be89-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-rffls\" (UID: \"390e92b4-466b-4c16-8ae4-95b92628be89\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rffls"
Apr 16 20:30:33.303635 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.303582 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c0695fb2-0f19-4495-91ba-ec07f29dbbc0-node-exporter-accelerators-collector-config\") pod \"node-exporter-vwwqh\" (UID: \"c0695fb2-0f19-4495-91ba-ec07f29dbbc0\") " pod="openshift-monitoring/node-exporter-vwwqh"
Apr 16 20:30:33.303635 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.303624 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c0695fb2-0f19-4495-91ba-ec07f29dbbc0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vwwqh\" (UID: \"c0695fb2-0f19-4495-91ba-ec07f29dbbc0\") " pod="openshift-monitoring/node-exporter-vwwqh"
Apr 16 20:30:33.303777 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.303666 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c0695fb2-0f19-4495-91ba-ec07f29dbbc0-node-exporter-wtmp\") pod \"node-exporter-vwwqh\" (UID: \"c0695fb2-0f19-4495-91ba-ec07f29dbbc0\") " pod="openshift-monitoring/node-exporter-vwwqh"
Apr 16 20:30:33.303777 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.303675 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c0695fb2-0f19-4495-91ba-ec07f29dbbc0-metrics-client-ca\") pod \"node-exporter-vwwqh\" (UID: \"c0695fb2-0f19-4495-91ba-ec07f29dbbc0\") " pod="openshift-monitoring/node-exporter-vwwqh"
Apr 16 20:30:33.303777 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.303693 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/390e92b4-466b-4c16-8ae4-95b92628be89-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-rffls\" (UID: \"390e92b4-466b-4c16-8ae4-95b92628be89\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rffls"
Apr 16 20:30:33.303777 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.303724 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c0695fb2-0f19-4495-91ba-ec07f29dbbc0-node-exporter-tls\") pod \"node-exporter-vwwqh\" (UID: \"c0695fb2-0f19-4495-91ba-ec07f29dbbc0\") " pod="openshift-monitoring/node-exporter-vwwqh"
Apr 16 20:30:33.303958 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:30:33.303789 2565 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 16 20:30:33.303958 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:30:33.303823 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0695fb2-0f19-4495-91ba-ec07f29dbbc0-node-exporter-tls podName:c0695fb2-0f19-4495-91ba-ec07f29dbbc0 nodeName:}" failed. No retries permitted until 2026-04-16 20:30:33.803811585 +0000 UTC m=+177.071554021 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/c0695fb2-0f19-4495-91ba-ec07f29dbbc0-node-exporter-tls") pod "node-exporter-vwwqh" (UID: "c0695fb2-0f19-4495-91ba-ec07f29dbbc0") : secret "node-exporter-tls" not found
Apr 16 20:30:33.303958 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.303935 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c0695fb2-0f19-4495-91ba-ec07f29dbbc0-node-exporter-wtmp\") pod \"node-exporter-vwwqh\" (UID: \"c0695fb2-0f19-4495-91ba-ec07f29dbbc0\") " pod="openshift-monitoring/node-exporter-vwwqh"
Apr 16 20:30:33.304491 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.304446 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c0695fb2-0f19-4495-91ba-ec07f29dbbc0-node-exporter-textfile\") pod \"node-exporter-vwwqh\" (UID: \"c0695fb2-0f19-4495-91ba-ec07f29dbbc0\") " pod="openshift-monitoring/node-exporter-vwwqh"
Apr 16 20:30:33.304608 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.304528 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c0695fb2-0f19-4495-91ba-ec07f29dbbc0-node-exporter-accelerators-collector-config\") pod \"node-exporter-vwwqh\" (UID: \"c0695fb2-0f19-4495-91ba-ec07f29dbbc0\") " pod="openshift-monitoring/node-exporter-vwwqh"
Apr 16 20:30:33.304608 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.304567 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/390e92b4-466b-4c16-8ae4-95b92628be89-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-rffls\" (UID: \"390e92b4-466b-4c16-8ae4-95b92628be89\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rffls"
Apr 16 20:30:33.304716 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.304645 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/390e92b4-466b-4c16-8ae4-95b92628be89-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-rffls\" (UID: \"390e92b4-466b-4c16-8ae4-95b92628be89\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rffls"
Apr 16 20:30:33.304716 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.304680 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c0695fb2-0f19-4495-91ba-ec07f29dbbc0-sys\") pod \"node-exporter-vwwqh\" (UID: \"c0695fb2-0f19-4495-91ba-ec07f29dbbc0\") " pod="openshift-monitoring/node-exporter-vwwqh"
Apr 16 20:30:33.304716 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.304707 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c0695fb2-0f19-4495-91ba-ec07f29dbbc0-root\") pod \"node-exporter-vwwqh\" (UID: \"c0695fb2-0f19-4495-91ba-ec07f29dbbc0\") " pod="openshift-monitoring/node-exporter-vwwqh"
Apr 16 20:30:33.304862 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.304732 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5lgbv\" (UniqueName: \"kubernetes.io/projected/390e92b4-466b-4c16-8ae4-95b92628be89-kube-api-access-5lgbv\") pod \"kube-state-metrics-69db897b98-rffls\" (UID: \"390e92b4-466b-4c16-8ae4-95b92628be89\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rffls"
Apr 16 20:30:33.305576 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.304966 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c0695fb2-0f19-4495-91ba-ec07f29dbbc0-sys\") pod \"node-exporter-vwwqh\" (UID: \"c0695fb2-0f19-4495-91ba-ec07f29dbbc0\") " pod="openshift-monitoring/node-exporter-vwwqh"
Apr 16 20:30:33.305576 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.305018 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c0695fb2-0f19-4495-91ba-ec07f29dbbc0-root\") pod \"node-exporter-vwwqh\" (UID: \"c0695fb2-0f19-4495-91ba-ec07f29dbbc0\") " pod="openshift-monitoring/node-exporter-vwwqh"
Apr 16 20:30:33.305576 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.305541 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/390e92b4-466b-4c16-8ae4-95b92628be89-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-rffls\" (UID: \"390e92b4-466b-4c16-8ae4-95b92628be89\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rffls"
Apr 16 20:30:33.306574 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.306536 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c0695fb2-0f19-4495-91ba-ec07f29dbbc0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vwwqh\" (UID: \"c0695fb2-0f19-4495-91ba-ec07f29dbbc0\") " pod="openshift-monitoring/node-exporter-vwwqh"
Apr 16 20:30:33.306935 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.306911 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/390e92b4-466b-4c16-8ae4-95b92628be89-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-rffls\" (UID: \"390e92b4-466b-4c16-8ae4-95b92628be89\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rffls"
Apr 16 20:30:33.314796 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.314755 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhlzr\" (UniqueName: \"kubernetes.io/projected/c0695fb2-0f19-4495-91ba-ec07f29dbbc0-kube-api-access-xhlzr\") pod \"node-exporter-vwwqh\" (UID: \"c0695fb2-0f19-4495-91ba-ec07f29dbbc0\") " pod="openshift-monitoring/node-exporter-vwwqh"
Apr 16 20:30:33.315674 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.315654 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lgbv\" (UniqueName: \"kubernetes.io/projected/390e92b4-466b-4c16-8ae4-95b92628be89-kube-api-access-5lgbv\") pod \"kube-state-metrics-69db897b98-rffls\" (UID: \"390e92b4-466b-4c16-8ae4-95b92628be89\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rffls"
Apr 16 20:30:33.809658 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.809629 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c0695fb2-0f19-4495-91ba-ec07f29dbbc0-node-exporter-tls\") pod \"node-exporter-vwwqh\" (UID: \"c0695fb2-0f19-4495-91ba-ec07f29dbbc0\") " pod="openshift-monitoring/node-exporter-vwwqh"
Apr 16 20:30:33.809828 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.809693 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/390e92b4-466b-4c16-8ae4-95b92628be89-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-rffls\" (UID: \"390e92b4-466b-4c16-8ae4-95b92628be89\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rffls"
Apr 16 20:30:33.811835 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.811808 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c0695fb2-0f19-4495-91ba-ec07f29dbbc0-node-exporter-tls\") pod \"node-exporter-vwwqh\" (UID: \"c0695fb2-0f19-4495-91ba-ec07f29dbbc0\") " pod="openshift-monitoring/node-exporter-vwwqh"
Apr 16 20:30:33.812065 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.812047 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/390e92b4-466b-4c16-8ae4-95b92628be89-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-rffls\" (UID: \"390e92b4-466b-4c16-8ae4-95b92628be89\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rffls"
Apr 16 20:30:33.992924 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:33.992887 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-vwwqh"
Apr 16 20:30:34.000771 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:30:34.000744 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0695fb2_0f19_4495_91ba_ec07f29dbbc0.slice/crio-eacca1c33b44a718b54892a251925aa497ad769e9093ab88b86d80a9e5b1d57a WatchSource:0}: Error finding container eacca1c33b44a718b54892a251925aa497ad769e9093ab88b86d80a9e5b1d57a: Status 404 returned error can't find the container with id eacca1c33b44a718b54892a251925aa497ad769e9093ab88b86d80a9e5b1d57a
Apr 16 20:30:34.006885 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.006865 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-rffls"
Apr 16 20:30:34.125258 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.125227 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-rffls"]
Apr 16 20:30:34.128351 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:30:34.128324 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod390e92b4_466b_4c16_8ae4_95b92628be89.slice/crio-9df9c522cb58df1ad51f43e907437bb083df6891e17444ebdcfa7ddb3bd2205a WatchSource:0}: Error finding container 9df9c522cb58df1ad51f43e907437bb083df6891e17444ebdcfa7ddb3bd2205a: Status 404 returned error can't find the container with id 9df9c522cb58df1ad51f43e907437bb083df6891e17444ebdcfa7ddb3bd2205a
Apr 16 20:30:34.132869 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.128916 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 20:30:34.136319 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.136303 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:30:34.138423 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.138402 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 16 20:30:34.138534 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.138424 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 16 20:30:34.138593 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.138540 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 16 20:30:34.138593 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.138544 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 16 20:30:34.138811 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.138792 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 16 20:30:34.138892 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.138838 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 16 20:30:34.138994 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.138978 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 16 20:30:34.139057 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.139047 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-ckdgm\""
Apr 16 20:30:34.139133 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.138796 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 16 20:30:34.139194 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.139174 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 16 20:30:34.145201 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.145180 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 20:30:34.308524 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.308501 2565 scope.go:117] "RemoveContainer" containerID="3632baddf4da6fa4e1511220a73e8f61dea41225dad077d666b1db3488d6dd66"
Apr 16 20:30:34.314224 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.314202 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6dfb82aa-444e-48aa-b144-b043d8b099f1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:30:34.314340 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.314251 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6dfb82aa-444e-48aa-b144-b043d8b099f1-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:30:34.314401 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.314330 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6dfb82aa-444e-48aa-b144-b043d8b099f1-config-out\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:30:34.314449 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.314394 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:30:34.314449 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.314427 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-config-volume\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:30:34.314547 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.314473 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:30:34.314547 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.314503 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6dfb82aa-444e-48aa-b144-b043d8b099f1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:30:34.314652 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.314556 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:30:34.314652 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.314606 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6dfb82aa-444e-48aa-b144-b043d8b099f1-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:30:34.314652 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.314644 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj868\" (UniqueName: \"kubernetes.io/projected/6dfb82aa-444e-48aa-b144-b043d8b099f1-kube-api-access-mj868\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:30:34.314758 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.314671 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:30:34.314758 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.314710 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:30:34.314758 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.314744 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-web-config\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:30:34.415166 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.415128 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:30:34.415350 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.415192 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:30:34.415350 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.415225 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-web-config\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:30:34.415350 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.415270 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6dfb82aa-444e-48aa-b144-b043d8b099f1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:30:34.415350 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.415322 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6dfb82aa-444e-48aa-b144-b043d8b099f1-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:30:34.415543 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.415350 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6dfb82aa-444e-48aa-b144-b043d8b099f1-config-out\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:30:34.415543 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.415380 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:30:34.415543 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.415423 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-config-volume\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:30:34.415543 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.415447 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:30:34.415543 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.415473 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6dfb82aa-444e-48aa-b144-b043d8b099f1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:30:34.415543 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.415522 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:30:34.415788 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:30:34.415668 2565 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found
Apr 16 20:30:34.415788 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:30:34.415752 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-secret-alertmanager-main-tls podName:6dfb82aa-444e-48aa-b144-b043d8b099f1 nodeName:}" failed. No retries permitted until 2026-04-16 20:30:34.915731966 +0000 UTC m=+178.183474412 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "6dfb82aa-444e-48aa-b144-b043d8b099f1") : secret "alertmanager-main-tls" not found
Apr 16 20:30:34.416217 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.416192 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6dfb82aa-444e-48aa-b144-b043d8b099f1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:30:34.418138 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.417199 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6dfb82aa-444e-48aa-b144-b043d8b099f1-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:30:34.418138 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.417296 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mj868\" (UniqueName: \"kubernetes.io/projected/6dfb82aa-444e-48aa-b144-b043d8b099f1-kube-api-access-mj868\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:30:34.418138 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.417676 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6dfb82aa-444e-48aa-b144-b043d8b099f1-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:30:34.418441 ip-10-0-139-150
kubenswrapper[2565]: I0416 20:30:34.418205 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6dfb82aa-444e-48aa-b144-b043d8b099f1-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:30:34.418816 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.418786 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:30:34.419828 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.419509 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:30:34.419828 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.419587 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6dfb82aa-444e-48aa-b144-b043d8b099f1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:30:34.419828 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.419638 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:30:34.419828 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.419732 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6dfb82aa-444e-48aa-b144-b043d8b099f1-config-out\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:30:34.420637 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.420614 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-web-config\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:30:34.421036 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.421011 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-config-volume\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:30:34.421633 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.421612 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:30:34.424822 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.424798 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj868\" (UniqueName: \"kubernetes.io/projected/6dfb82aa-444e-48aa-b144-b043d8b099f1-kube-api-access-mj868\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:30:34.822247 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.822204 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vwwqh" event={"ID":"c0695fb2-0f19-4495-91ba-ec07f29dbbc0","Type":"ContainerStarted","Data":"eacca1c33b44a718b54892a251925aa497ad769e9093ab88b86d80a9e5b1d57a"} Apr 16 20:30:34.824088 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.824066 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-shwkq_a53ef938-6712-4133-9657-41ecb93318cf/console-operator/2.log" Apr 16 20:30:34.824216 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.824154 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-shwkq" event={"ID":"a53ef938-6712-4133-9657-41ecb93318cf","Type":"ContainerStarted","Data":"cfb3654b51e16d680060933187dcb83a7520caf1a143130dac76e9812a629164"} Apr 16 20:30:34.824573 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.824529 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-shwkq" Apr 16 20:30:34.825549 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.825411 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-rffls" event={"ID":"390e92b4-466b-4c16-8ae4-95b92628be89","Type":"ContainerStarted","Data":"9df9c522cb58df1ad51f43e907437bb083df6891e17444ebdcfa7ddb3bd2205a"} Apr 16 20:30:34.839920 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.839863 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-shwkq" podStartSLOduration=42.152103216 podStartE2EDuration="44.839849543s" podCreationTimestamp="2026-04-16 20:29:50 +0000 UTC" firstStartedPulling="2026-04-16 20:29:51.38782611 +0000 UTC 
m=+134.655568542" lastFinishedPulling="2026-04-16 20:29:54.075572434 +0000 UTC m=+137.343314869" observedRunningTime="2026-04-16 20:30:34.839198423 +0000 UTC m=+178.106940891" watchObservedRunningTime="2026-04-16 20:30:34.839849543 +0000 UTC m=+178.107591999" Apr 16 20:30:34.922497 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.922464 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:30:34.925375 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:34.925353 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:30:35.000923 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:35.000894 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-shwkq" Apr 16 20:30:35.046206 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:35.046172 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:30:35.200213 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:35.200171 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 20:30:35.478718 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:30:35.478678 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dfb82aa_444e_48aa_b144_b043d8b099f1.slice/crio-15bf682d36511ec1b0ba660e6d840158602624145077a889159625d9dec34305 WatchSource:0}: Error finding container 15bf682d36511ec1b0ba660e6d840158602624145077a889159625d9dec34305: Status 404 returned error can't find the container with id 15bf682d36511ec1b0ba660e6d840158602624145077a889159625d9dec34305 Apr 16 20:30:35.830826 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:35.830741 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-rffls" event={"ID":"390e92b4-466b-4c16-8ae4-95b92628be89","Type":"ContainerStarted","Data":"54e77d4042441115428675dcaac0f6f1b7b8f0ebbe641074da6ffa119833b6f1"} Apr 16 20:30:35.830826 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:35.830783 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-rffls" event={"ID":"390e92b4-466b-4c16-8ae4-95b92628be89","Type":"ContainerStarted","Data":"9ea7550216049ab968a6d9e460d206887640e13d0bafbbc5bab6078d9bf1cc5c"} Apr 16 20:30:35.830826 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:35.830797 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-rffls" event={"ID":"390e92b4-466b-4c16-8ae4-95b92628be89","Type":"ContainerStarted","Data":"6edf585fbd2b3379e0fc7432e0c51bf2292eb26385680c08aa1c1c0b1d0ac23c"} Apr 16 20:30:35.832865 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:35.832805 2565 generic.go:358] "Generic (PLEG): 
container finished" podID="c0695fb2-0f19-4495-91ba-ec07f29dbbc0" containerID="ad3ff375c61420761ea79d0241aaeb59aae9743e1a15b99ebf7d4356a3921b56" exitCode=0 Apr 16 20:30:35.833004 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:35.832882 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vwwqh" event={"ID":"c0695fb2-0f19-4495-91ba-ec07f29dbbc0","Type":"ContainerDied","Data":"ad3ff375c61420761ea79d0241aaeb59aae9743e1a15b99ebf7d4356a3921b56"} Apr 16 20:30:35.834297 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:35.834196 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6dfb82aa-444e-48aa-b144-b043d8b099f1","Type":"ContainerStarted","Data":"15bf682d36511ec1b0ba660e6d840158602624145077a889159625d9dec34305"} Apr 16 20:30:35.869381 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:35.869330 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-rffls" podStartSLOduration=1.4814698929999999 podStartE2EDuration="2.869313825s" podCreationTimestamp="2026-04-16 20:30:33 +0000 UTC" firstStartedPulling="2026-04-16 20:30:34.134861761 +0000 UTC m=+177.402604192" lastFinishedPulling="2026-04-16 20:30:35.522705689 +0000 UTC m=+178.790448124" observedRunningTime="2026-04-16 20:30:35.850497207 +0000 UTC m=+179.118239661" watchObservedRunningTime="2026-04-16 20:30:35.869313825 +0000 UTC m=+179.137056280" Apr 16 20:30:36.085346 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.085240 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-7f76dcb489-d8kdg"] Apr 16 20:30:36.089285 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.089247 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7f76dcb489-d8kdg" Apr 16 20:30:36.092771 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.092732 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 16 20:30:36.093011 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.092977 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-w7mxs\"" Apr 16 20:30:36.093104 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.093038 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 16 20:30:36.093206 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.093188 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 16 20:30:36.093445 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.093404 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-d89o4d6rlujbt\"" Apr 16 20:30:36.093854 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.093828 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 16 20:30:36.094083 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.094061 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 16 20:30:36.095857 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.095736 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7f76dcb489-d8kdg"] Apr 16 20:30:36.136390 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.136369 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/34913715-f62a-4682-9a2d-a76391be2411-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7f76dcb489-d8kdg\" (UID: \"34913715-f62a-4682-9a2d-a76391be2411\") " pod="openshift-monitoring/thanos-querier-7f76dcb489-d8kdg" Apr 16 20:30:36.136534 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.136399 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/34913715-f62a-4682-9a2d-a76391be2411-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7f76dcb489-d8kdg\" (UID: \"34913715-f62a-4682-9a2d-a76391be2411\") " pod="openshift-monitoring/thanos-querier-7f76dcb489-d8kdg" Apr 16 20:30:36.136534 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.136425 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v69j\" (UniqueName: \"kubernetes.io/projected/34913715-f62a-4682-9a2d-a76391be2411-kube-api-access-5v69j\") pod \"thanos-querier-7f76dcb489-d8kdg\" (UID: \"34913715-f62a-4682-9a2d-a76391be2411\") " pod="openshift-monitoring/thanos-querier-7f76dcb489-d8kdg" Apr 16 20:30:36.136534 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.136468 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/34913715-f62a-4682-9a2d-a76391be2411-metrics-client-ca\") pod \"thanos-querier-7f76dcb489-d8kdg\" (UID: \"34913715-f62a-4682-9a2d-a76391be2411\") " pod="openshift-monitoring/thanos-querier-7f76dcb489-d8kdg" Apr 16 20:30:36.136695 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.136538 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/34913715-f62a-4682-9a2d-a76391be2411-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7f76dcb489-d8kdg\" (UID: \"34913715-f62a-4682-9a2d-a76391be2411\") " pod="openshift-monitoring/thanos-querier-7f76dcb489-d8kdg" Apr 16 20:30:36.136695 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.136568 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/34913715-f62a-4682-9a2d-a76391be2411-secret-grpc-tls\") pod \"thanos-querier-7f76dcb489-d8kdg\" (UID: \"34913715-f62a-4682-9a2d-a76391be2411\") " pod="openshift-monitoring/thanos-querier-7f76dcb489-d8kdg" Apr 16 20:30:36.136695 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.136595 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/34913715-f62a-4682-9a2d-a76391be2411-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7f76dcb489-d8kdg\" (UID: \"34913715-f62a-4682-9a2d-a76391be2411\") " pod="openshift-monitoring/thanos-querier-7f76dcb489-d8kdg" Apr 16 20:30:36.136695 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.136641 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/34913715-f62a-4682-9a2d-a76391be2411-secret-thanos-querier-tls\") pod \"thanos-querier-7f76dcb489-d8kdg\" (UID: \"34913715-f62a-4682-9a2d-a76391be2411\") " pod="openshift-monitoring/thanos-querier-7f76dcb489-d8kdg" Apr 16 20:30:36.237659 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.237623 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: 
\"kubernetes.io/secret/34913715-f62a-4682-9a2d-a76391be2411-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7f76dcb489-d8kdg\" (UID: \"34913715-f62a-4682-9a2d-a76391be2411\") " pod="openshift-monitoring/thanos-querier-7f76dcb489-d8kdg" Apr 16 20:30:36.237659 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.237660 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/34913715-f62a-4682-9a2d-a76391be2411-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7f76dcb489-d8kdg\" (UID: \"34913715-f62a-4682-9a2d-a76391be2411\") " pod="openshift-monitoring/thanos-querier-7f76dcb489-d8kdg" Apr 16 20:30:36.238147 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.237683 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5v69j\" (UniqueName: \"kubernetes.io/projected/34913715-f62a-4682-9a2d-a76391be2411-kube-api-access-5v69j\") pod \"thanos-querier-7f76dcb489-d8kdg\" (UID: \"34913715-f62a-4682-9a2d-a76391be2411\") " pod="openshift-monitoring/thanos-querier-7f76dcb489-d8kdg" Apr 16 20:30:36.238147 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.237703 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/34913715-f62a-4682-9a2d-a76391be2411-metrics-client-ca\") pod \"thanos-querier-7f76dcb489-d8kdg\" (UID: \"34913715-f62a-4682-9a2d-a76391be2411\") " pod="openshift-monitoring/thanos-querier-7f76dcb489-d8kdg" Apr 16 20:30:36.238147 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.237722 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/34913715-f62a-4682-9a2d-a76391be2411-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7f76dcb489-d8kdg\" (UID: 
\"34913715-f62a-4682-9a2d-a76391be2411\") " pod="openshift-monitoring/thanos-querier-7f76dcb489-d8kdg" Apr 16 20:30:36.238147 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.237748 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/34913715-f62a-4682-9a2d-a76391be2411-secret-grpc-tls\") pod \"thanos-querier-7f76dcb489-d8kdg\" (UID: \"34913715-f62a-4682-9a2d-a76391be2411\") " pod="openshift-monitoring/thanos-querier-7f76dcb489-d8kdg" Apr 16 20:30:36.238147 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.237774 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/34913715-f62a-4682-9a2d-a76391be2411-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7f76dcb489-d8kdg\" (UID: \"34913715-f62a-4682-9a2d-a76391be2411\") " pod="openshift-monitoring/thanos-querier-7f76dcb489-d8kdg" Apr 16 20:30:36.238147 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.237801 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/34913715-f62a-4682-9a2d-a76391be2411-secret-thanos-querier-tls\") pod \"thanos-querier-7f76dcb489-d8kdg\" (UID: \"34913715-f62a-4682-9a2d-a76391be2411\") " pod="openshift-monitoring/thanos-querier-7f76dcb489-d8kdg" Apr 16 20:30:36.238522 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.238378 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/34913715-f62a-4682-9a2d-a76391be2411-metrics-client-ca\") pod \"thanos-querier-7f76dcb489-d8kdg\" (UID: \"34913715-f62a-4682-9a2d-a76391be2411\") " pod="openshift-monitoring/thanos-querier-7f76dcb489-d8kdg" Apr 16 20:30:36.241061 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.241031 2565 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/34913715-f62a-4682-9a2d-a76391be2411-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7f76dcb489-d8kdg\" (UID: \"34913715-f62a-4682-9a2d-a76391be2411\") " pod="openshift-monitoring/thanos-querier-7f76dcb489-d8kdg" Apr 16 20:30:36.241373 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.241352 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/34913715-f62a-4682-9a2d-a76391be2411-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7f76dcb489-d8kdg\" (UID: \"34913715-f62a-4682-9a2d-a76391be2411\") " pod="openshift-monitoring/thanos-querier-7f76dcb489-d8kdg" Apr 16 20:30:36.241421 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.241370 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/34913715-f62a-4682-9a2d-a76391be2411-secret-thanos-querier-tls\") pod \"thanos-querier-7f76dcb489-d8kdg\" (UID: \"34913715-f62a-4682-9a2d-a76391be2411\") " pod="openshift-monitoring/thanos-querier-7f76dcb489-d8kdg" Apr 16 20:30:36.241475 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.241447 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/34913715-f62a-4682-9a2d-a76391be2411-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7f76dcb489-d8kdg\" (UID: \"34913715-f62a-4682-9a2d-a76391be2411\") " pod="openshift-monitoring/thanos-querier-7f76dcb489-d8kdg" Apr 16 20:30:36.241522 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.241499 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: 
\"kubernetes.io/secret/34913715-f62a-4682-9a2d-a76391be2411-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7f76dcb489-d8kdg\" (UID: \"34913715-f62a-4682-9a2d-a76391be2411\") " pod="openshift-monitoring/thanos-querier-7f76dcb489-d8kdg" Apr 16 20:30:36.241585 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.241568 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/34913715-f62a-4682-9a2d-a76391be2411-secret-grpc-tls\") pod \"thanos-querier-7f76dcb489-d8kdg\" (UID: \"34913715-f62a-4682-9a2d-a76391be2411\") " pod="openshift-monitoring/thanos-querier-7f76dcb489-d8kdg" Apr 16 20:30:36.245578 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.245555 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v69j\" (UniqueName: \"kubernetes.io/projected/34913715-f62a-4682-9a2d-a76391be2411-kube-api-access-5v69j\") pod \"thanos-querier-7f76dcb489-d8kdg\" (UID: \"34913715-f62a-4682-9a2d-a76391be2411\") " pod="openshift-monitoring/thanos-querier-7f76dcb489-d8kdg" Apr 16 20:30:36.402701 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.402621 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7f76dcb489-d8kdg" Apr 16 20:30:36.722372 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.722340 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7f76dcb489-d8kdg"] Apr 16 20:30:36.726148 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:30:36.726122 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34913715_f62a_4682_9a2d_a76391be2411.slice/crio-a02bbc765e036d28a6e5555f55334e18639fe81e10084adba4a685c65793db8e WatchSource:0}: Error finding container a02bbc765e036d28a6e5555f55334e18639fe81e10084adba4a685c65793db8e: Status 404 returned error can't find the container with id a02bbc765e036d28a6e5555f55334e18639fe81e10084adba4a685c65793db8e Apr 16 20:30:36.842378 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.842320 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vwwqh" event={"ID":"c0695fb2-0f19-4495-91ba-ec07f29dbbc0","Type":"ContainerStarted","Data":"ad30f14608015f08603bb9bdfd507e63cb68f5ac47207e037b8852c263172d35"} Apr 16 20:30:36.842378 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.842361 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vwwqh" event={"ID":"c0695fb2-0f19-4495-91ba-ec07f29dbbc0","Type":"ContainerStarted","Data":"8155c273d07afc7e45e41ed37b4a97989802b3767cbe365bbd1f0217ef53cca0"} Apr 16 20:30:36.843685 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.843655 2565 generic.go:358] "Generic (PLEG): container finished" podID="6dfb82aa-444e-48aa-b144-b043d8b099f1" containerID="5488117389be3df45eb854726907ce7cbbcad8d24e29c8f9d7a72498ed36e24d" exitCode=0 Apr 16 20:30:36.843836 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.843699 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"6dfb82aa-444e-48aa-b144-b043d8b099f1","Type":"ContainerDied","Data":"5488117389be3df45eb854726907ce7cbbcad8d24e29c8f9d7a72498ed36e24d"} Apr 16 20:30:36.844854 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.844828 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f76dcb489-d8kdg" event={"ID":"34913715-f62a-4682-9a2d-a76391be2411","Type":"ContainerStarted","Data":"a02bbc765e036d28a6e5555f55334e18639fe81e10084adba4a685c65793db8e"} Apr 16 20:30:36.865968 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:36.865929 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-vwwqh" podStartSLOduration=2.969948029 podStartE2EDuration="3.86591831s" podCreationTimestamp="2026-04-16 20:30:33 +0000 UTC" firstStartedPulling="2026-04-16 20:30:34.00211169 +0000 UTC m=+177.269854121" lastFinishedPulling="2026-04-16 20:30:34.898081957 +0000 UTC m=+178.165824402" observedRunningTime="2026-04-16 20:30:36.864792096 +0000 UTC m=+180.132534560" watchObservedRunningTime="2026-04-16 20:30:36.86591831 +0000 UTC m=+180.133660763" Apr 16 20:30:37.467489 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:37.467459 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-5bdd65d46d-bpprw"] Apr 16 20:30:37.470862 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:37.470841 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-5bdd65d46d-bpprw" Apr 16 20:30:37.473631 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:37.473610 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 16 20:30:37.473745 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:37.473681 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 16 20:30:37.473745 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:37.473732 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-an5lmfq3k9kmt\"" Apr 16 20:30:37.473849 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:37.473840 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 16 20:30:37.473962 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:37.473944 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-knhls\"" Apr 16 20:30:37.474054 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:37.474034 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 20:30:37.481470 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:37.481426 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5bdd65d46d-bpprw"] Apr 16 20:30:37.550226 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:37.550194 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/501c40e3-b641-489e-b6a9-5314521662b0-secret-metrics-server-client-certs\") pod \"metrics-server-5bdd65d46d-bpprw\" (UID: 
\"501c40e3-b641-489e-b6a9-5314521662b0\") " pod="openshift-monitoring/metrics-server-5bdd65d46d-bpprw" Apr 16 20:30:37.550464 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:37.550306 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/501c40e3-b641-489e-b6a9-5314521662b0-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5bdd65d46d-bpprw\" (UID: \"501c40e3-b641-489e-b6a9-5314521662b0\") " pod="openshift-monitoring/metrics-server-5bdd65d46d-bpprw" Apr 16 20:30:37.550464 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:37.550353 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/501c40e3-b641-489e-b6a9-5314521662b0-metrics-server-audit-profiles\") pod \"metrics-server-5bdd65d46d-bpprw\" (UID: \"501c40e3-b641-489e-b6a9-5314521662b0\") " pod="openshift-monitoring/metrics-server-5bdd65d46d-bpprw" Apr 16 20:30:37.550464 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:37.550390 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/501c40e3-b641-489e-b6a9-5314521662b0-secret-metrics-server-tls\") pod \"metrics-server-5bdd65d46d-bpprw\" (UID: \"501c40e3-b641-489e-b6a9-5314521662b0\") " pod="openshift-monitoring/metrics-server-5bdd65d46d-bpprw" Apr 16 20:30:37.550464 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:37.550426 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr2bh\" (UniqueName: \"kubernetes.io/projected/501c40e3-b641-489e-b6a9-5314521662b0-kube-api-access-kr2bh\") pod \"metrics-server-5bdd65d46d-bpprw\" (UID: \"501c40e3-b641-489e-b6a9-5314521662b0\") " pod="openshift-monitoring/metrics-server-5bdd65d46d-bpprw" Apr 16 
20:30:37.550803 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:37.550486 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/501c40e3-b641-489e-b6a9-5314521662b0-audit-log\") pod \"metrics-server-5bdd65d46d-bpprw\" (UID: \"501c40e3-b641-489e-b6a9-5314521662b0\") " pod="openshift-monitoring/metrics-server-5bdd65d46d-bpprw" Apr 16 20:30:37.550803 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:37.550534 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/501c40e3-b641-489e-b6a9-5314521662b0-client-ca-bundle\") pod \"metrics-server-5bdd65d46d-bpprw\" (UID: \"501c40e3-b641-489e-b6a9-5314521662b0\") " pod="openshift-monitoring/metrics-server-5bdd65d46d-bpprw" Apr 16 20:30:37.651205 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:37.651170 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/501c40e3-b641-489e-b6a9-5314521662b0-metrics-server-audit-profiles\") pod \"metrics-server-5bdd65d46d-bpprw\" (UID: \"501c40e3-b641-489e-b6a9-5314521662b0\") " pod="openshift-monitoring/metrics-server-5bdd65d46d-bpprw" Apr 16 20:30:37.651392 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:37.651223 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/501c40e3-b641-489e-b6a9-5314521662b0-secret-metrics-server-tls\") pod \"metrics-server-5bdd65d46d-bpprw\" (UID: \"501c40e3-b641-489e-b6a9-5314521662b0\") " pod="openshift-monitoring/metrics-server-5bdd65d46d-bpprw" Apr 16 20:30:37.651392 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:37.651264 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kr2bh\" (UniqueName: 
\"kubernetes.io/projected/501c40e3-b641-489e-b6a9-5314521662b0-kube-api-access-kr2bh\") pod \"metrics-server-5bdd65d46d-bpprw\" (UID: \"501c40e3-b641-489e-b6a9-5314521662b0\") " pod="openshift-monitoring/metrics-server-5bdd65d46d-bpprw" Apr 16 20:30:37.651392 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:37.651323 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/501c40e3-b641-489e-b6a9-5314521662b0-audit-log\") pod \"metrics-server-5bdd65d46d-bpprw\" (UID: \"501c40e3-b641-489e-b6a9-5314521662b0\") " pod="openshift-monitoring/metrics-server-5bdd65d46d-bpprw" Apr 16 20:30:37.651392 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:37.651359 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/501c40e3-b641-489e-b6a9-5314521662b0-client-ca-bundle\") pod \"metrics-server-5bdd65d46d-bpprw\" (UID: \"501c40e3-b641-489e-b6a9-5314521662b0\") " pod="openshift-monitoring/metrics-server-5bdd65d46d-bpprw" Apr 16 20:30:37.651618 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:37.651416 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/501c40e3-b641-489e-b6a9-5314521662b0-secret-metrics-server-client-certs\") pod \"metrics-server-5bdd65d46d-bpprw\" (UID: \"501c40e3-b641-489e-b6a9-5314521662b0\") " pod="openshift-monitoring/metrics-server-5bdd65d46d-bpprw" Apr 16 20:30:37.652311 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:37.651955 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/501c40e3-b641-489e-b6a9-5314521662b0-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5bdd65d46d-bpprw\" (UID: \"501c40e3-b641-489e-b6a9-5314521662b0\") " 
pod="openshift-monitoring/metrics-server-5bdd65d46d-bpprw" Apr 16 20:30:37.652311 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:37.652006 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/501c40e3-b641-489e-b6a9-5314521662b0-audit-log\") pod \"metrics-server-5bdd65d46d-bpprw\" (UID: \"501c40e3-b641-489e-b6a9-5314521662b0\") " pod="openshift-monitoring/metrics-server-5bdd65d46d-bpprw" Apr 16 20:30:37.652674 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:37.652632 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/501c40e3-b641-489e-b6a9-5314521662b0-metrics-server-audit-profiles\") pod \"metrics-server-5bdd65d46d-bpprw\" (UID: \"501c40e3-b641-489e-b6a9-5314521662b0\") " pod="openshift-monitoring/metrics-server-5bdd65d46d-bpprw" Apr 16 20:30:37.652782 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:37.652737 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/501c40e3-b641-489e-b6a9-5314521662b0-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5bdd65d46d-bpprw\" (UID: \"501c40e3-b641-489e-b6a9-5314521662b0\") " pod="openshift-monitoring/metrics-server-5bdd65d46d-bpprw" Apr 16 20:30:37.654434 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:37.654411 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/501c40e3-b641-489e-b6a9-5314521662b0-secret-metrics-server-client-certs\") pod \"metrics-server-5bdd65d46d-bpprw\" (UID: \"501c40e3-b641-489e-b6a9-5314521662b0\") " pod="openshift-monitoring/metrics-server-5bdd65d46d-bpprw" Apr 16 20:30:37.654899 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:37.654876 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/501c40e3-b641-489e-b6a9-5314521662b0-secret-metrics-server-tls\") pod \"metrics-server-5bdd65d46d-bpprw\" (UID: \"501c40e3-b641-489e-b6a9-5314521662b0\") " pod="openshift-monitoring/metrics-server-5bdd65d46d-bpprw" Apr 16 20:30:37.655800 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:37.655778 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/501c40e3-b641-489e-b6a9-5314521662b0-client-ca-bundle\") pod \"metrics-server-5bdd65d46d-bpprw\" (UID: \"501c40e3-b641-489e-b6a9-5314521662b0\") " pod="openshift-monitoring/metrics-server-5bdd65d46d-bpprw" Apr 16 20:30:37.660600 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:37.660568 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr2bh\" (UniqueName: \"kubernetes.io/projected/501c40e3-b641-489e-b6a9-5314521662b0-kube-api-access-kr2bh\") pod \"metrics-server-5bdd65d46d-bpprw\" (UID: \"501c40e3-b641-489e-b6a9-5314521662b0\") " pod="openshift-monitoring/metrics-server-5bdd65d46d-bpprw" Apr 16 20:30:37.784003 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:37.783902 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-5bdd65d46d-bpprw" Apr 16 20:30:37.889210 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:37.888979 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-hn9tf"] Apr 16 20:30:37.893870 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:37.893848 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hn9tf" Apr 16 20:30:37.896784 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:37.896559 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-mn7nz\"" Apr 16 20:30:37.896784 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:37.896631 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 16 20:30:37.902404 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:37.902368 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-hn9tf"] Apr 16 20:30:37.935426 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:37.935401 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5bdd65d46d-bpprw"] Apr 16 20:30:37.938193 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:30:37.938164 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod501c40e3_b641_489e_b6a9_5314521662b0.slice/crio-614a748b9d32c2c08c2498de951945c8ec7df49107e5725fe228a0e830f9c917 WatchSource:0}: Error finding container 614a748b9d32c2c08c2498de951945c8ec7df49107e5725fe228a0e830f9c917: Status 404 returned error can't find the container with id 614a748b9d32c2c08c2498de951945c8ec7df49107e5725fe228a0e830f9c917 Apr 16 20:30:37.954291 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:37.954257 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/81f9c12b-ce8f-4485-bd78-ddb944272d6b-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-hn9tf\" (UID: \"81f9c12b-ce8f-4485-bd78-ddb944272d6b\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hn9tf" Apr 16 20:30:38.054785 ip-10-0-139-150 kubenswrapper[2565]: I0416 
20:30:38.054705 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/81f9c12b-ce8f-4485-bd78-ddb944272d6b-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-hn9tf\" (UID: \"81f9c12b-ce8f-4485-bd78-ddb944272d6b\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hn9tf" Apr 16 20:30:38.057233 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.057213 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/81f9c12b-ce8f-4485-bd78-ddb944272d6b-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-hn9tf\" (UID: \"81f9c12b-ce8f-4485-bd78-ddb944272d6b\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hn9tf" Apr 16 20:30:38.209060 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.209030 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hn9tf" Apr 16 20:30:38.282001 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.281967 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-775c5544d8-4gndf"] Apr 16 20:30:38.287226 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.287196 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-775c5544d8-4gndf" Apr 16 20:30:38.290039 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.290008 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 16 20:30:38.290402 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.290350 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 16 20:30:38.290760 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.290358 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-g2ph4\"" Apr 16 20:30:38.290760 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.290365 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 16 20:30:38.291240 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.291124 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 16 20:30:38.291607 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.291432 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 16 20:30:38.296499 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.296215 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 16 20:30:38.298180 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.298127 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-775c5544d8-4gndf"] Apr 16 20:30:38.357628 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.357565 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/e4a3994e-037a-4c38-bdcb-1fb57bfec1e8-federate-client-tls\") pod \"telemeter-client-775c5544d8-4gndf\" (UID: \"e4a3994e-037a-4c38-bdcb-1fb57bfec1e8\") " pod="openshift-monitoring/telemeter-client-775c5544d8-4gndf" Apr 16 20:30:38.357628 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.357601 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e4a3994e-037a-4c38-bdcb-1fb57bfec1e8-metrics-client-ca\") pod \"telemeter-client-775c5544d8-4gndf\" (UID: \"e4a3994e-037a-4c38-bdcb-1fb57bfec1e8\") " pod="openshift-monitoring/telemeter-client-775c5544d8-4gndf" Apr 16 20:30:38.357806 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.357632 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/e4a3994e-037a-4c38-bdcb-1fb57bfec1e8-secret-telemeter-client\") pod \"telemeter-client-775c5544d8-4gndf\" (UID: \"e4a3994e-037a-4c38-bdcb-1fb57bfec1e8\") " pod="openshift-monitoring/telemeter-client-775c5544d8-4gndf" Apr 16 20:30:38.357806 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.357696 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4a3994e-037a-4c38-bdcb-1fb57bfec1e8-telemeter-trusted-ca-bundle\") pod \"telemeter-client-775c5544d8-4gndf\" (UID: \"e4a3994e-037a-4c38-bdcb-1fb57bfec1e8\") " pod="openshift-monitoring/telemeter-client-775c5544d8-4gndf" Apr 16 20:30:38.357806 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.357737 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4a3994e-037a-4c38-bdcb-1fb57bfec1e8-serving-certs-ca-bundle\") pod 
\"telemeter-client-775c5544d8-4gndf\" (UID: \"e4a3994e-037a-4c38-bdcb-1fb57bfec1e8\") " pod="openshift-monitoring/telemeter-client-775c5544d8-4gndf" Apr 16 20:30:38.357902 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.357825 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhv6v\" (UniqueName: \"kubernetes.io/projected/e4a3994e-037a-4c38-bdcb-1fb57bfec1e8-kube-api-access-lhv6v\") pod \"telemeter-client-775c5544d8-4gndf\" (UID: \"e4a3994e-037a-4c38-bdcb-1fb57bfec1e8\") " pod="openshift-monitoring/telemeter-client-775c5544d8-4gndf" Apr 16 20:30:38.357902 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.357874 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/e4a3994e-037a-4c38-bdcb-1fb57bfec1e8-telemeter-client-tls\") pod \"telemeter-client-775c5544d8-4gndf\" (UID: \"e4a3994e-037a-4c38-bdcb-1fb57bfec1e8\") " pod="openshift-monitoring/telemeter-client-775c5544d8-4gndf" Apr 16 20:30:38.357902 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.357892 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e4a3994e-037a-4c38-bdcb-1fb57bfec1e8-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-775c5544d8-4gndf\" (UID: \"e4a3994e-037a-4c38-bdcb-1fb57bfec1e8\") " pod="openshift-monitoring/telemeter-client-775c5544d8-4gndf" Apr 16 20:30:38.458624 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.458594 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lhv6v\" (UniqueName: \"kubernetes.io/projected/e4a3994e-037a-4c38-bdcb-1fb57bfec1e8-kube-api-access-lhv6v\") pod \"telemeter-client-775c5544d8-4gndf\" (UID: \"e4a3994e-037a-4c38-bdcb-1fb57bfec1e8\") " 
pod="openshift-monitoring/telemeter-client-775c5544d8-4gndf" Apr 16 20:30:38.458776 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.458668 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/e4a3994e-037a-4c38-bdcb-1fb57bfec1e8-telemeter-client-tls\") pod \"telemeter-client-775c5544d8-4gndf\" (UID: \"e4a3994e-037a-4c38-bdcb-1fb57bfec1e8\") " pod="openshift-monitoring/telemeter-client-775c5544d8-4gndf" Apr 16 20:30:38.458776 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.458697 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e4a3994e-037a-4c38-bdcb-1fb57bfec1e8-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-775c5544d8-4gndf\" (UID: \"e4a3994e-037a-4c38-bdcb-1fb57bfec1e8\") " pod="openshift-monitoring/telemeter-client-775c5544d8-4gndf" Apr 16 20:30:38.458776 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.458766 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/e4a3994e-037a-4c38-bdcb-1fb57bfec1e8-federate-client-tls\") pod \"telemeter-client-775c5544d8-4gndf\" (UID: \"e4a3994e-037a-4c38-bdcb-1fb57bfec1e8\") " pod="openshift-monitoring/telemeter-client-775c5544d8-4gndf" Apr 16 20:30:38.458954 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.458798 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e4a3994e-037a-4c38-bdcb-1fb57bfec1e8-metrics-client-ca\") pod \"telemeter-client-775c5544d8-4gndf\" (UID: \"e4a3994e-037a-4c38-bdcb-1fb57bfec1e8\") " pod="openshift-monitoring/telemeter-client-775c5544d8-4gndf" Apr 16 20:30:38.458954 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.458829 2565 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/e4a3994e-037a-4c38-bdcb-1fb57bfec1e8-secret-telemeter-client\") pod \"telemeter-client-775c5544d8-4gndf\" (UID: \"e4a3994e-037a-4c38-bdcb-1fb57bfec1e8\") " pod="openshift-monitoring/telemeter-client-775c5544d8-4gndf" Apr 16 20:30:38.458954 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.458871 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4a3994e-037a-4c38-bdcb-1fb57bfec1e8-telemeter-trusted-ca-bundle\") pod \"telemeter-client-775c5544d8-4gndf\" (UID: \"e4a3994e-037a-4c38-bdcb-1fb57bfec1e8\") " pod="openshift-monitoring/telemeter-client-775c5544d8-4gndf" Apr 16 20:30:38.458954 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.458911 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4a3994e-037a-4c38-bdcb-1fb57bfec1e8-serving-certs-ca-bundle\") pod \"telemeter-client-775c5544d8-4gndf\" (UID: \"e4a3994e-037a-4c38-bdcb-1fb57bfec1e8\") " pod="openshift-monitoring/telemeter-client-775c5544d8-4gndf" Apr 16 20:30:38.459628 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.459599 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4a3994e-037a-4c38-bdcb-1fb57bfec1e8-serving-certs-ca-bundle\") pod \"telemeter-client-775c5544d8-4gndf\" (UID: \"e4a3994e-037a-4c38-bdcb-1fb57bfec1e8\") " pod="openshift-monitoring/telemeter-client-775c5544d8-4gndf" Apr 16 20:30:38.459729 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.459645 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e4a3994e-037a-4c38-bdcb-1fb57bfec1e8-metrics-client-ca\") pod \"telemeter-client-775c5544d8-4gndf\" (UID: 
\"e4a3994e-037a-4c38-bdcb-1fb57bfec1e8\") " pod="openshift-monitoring/telemeter-client-775c5544d8-4gndf" Apr 16 20:30:38.460060 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.460035 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4a3994e-037a-4c38-bdcb-1fb57bfec1e8-telemeter-trusted-ca-bundle\") pod \"telemeter-client-775c5544d8-4gndf\" (UID: \"e4a3994e-037a-4c38-bdcb-1fb57bfec1e8\") " pod="openshift-monitoring/telemeter-client-775c5544d8-4gndf" Apr 16 20:30:38.461493 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.461452 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e4a3994e-037a-4c38-bdcb-1fb57bfec1e8-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-775c5544d8-4gndf\" (UID: \"e4a3994e-037a-4c38-bdcb-1fb57bfec1e8\") " pod="openshift-monitoring/telemeter-client-775c5544d8-4gndf" Apr 16 20:30:38.461858 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.461834 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/e4a3994e-037a-4c38-bdcb-1fb57bfec1e8-federate-client-tls\") pod \"telemeter-client-775c5544d8-4gndf\" (UID: \"e4a3994e-037a-4c38-bdcb-1fb57bfec1e8\") " pod="openshift-monitoring/telemeter-client-775c5544d8-4gndf" Apr 16 20:30:38.462289 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.462253 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/e4a3994e-037a-4c38-bdcb-1fb57bfec1e8-secret-telemeter-client\") pod \"telemeter-client-775c5544d8-4gndf\" (UID: \"e4a3994e-037a-4c38-bdcb-1fb57bfec1e8\") " pod="openshift-monitoring/telemeter-client-775c5544d8-4gndf" Apr 16 20:30:38.462366 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.462256 2565 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/e4a3994e-037a-4c38-bdcb-1fb57bfec1e8-telemeter-client-tls\") pod \"telemeter-client-775c5544d8-4gndf\" (UID: \"e4a3994e-037a-4c38-bdcb-1fb57bfec1e8\") " pod="openshift-monitoring/telemeter-client-775c5544d8-4gndf" Apr 16 20:30:38.466396 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.466378 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhv6v\" (UniqueName: \"kubernetes.io/projected/e4a3994e-037a-4c38-bdcb-1fb57bfec1e8-kube-api-access-lhv6v\") pod \"telemeter-client-775c5544d8-4gndf\" (UID: \"e4a3994e-037a-4c38-bdcb-1fb57bfec1e8\") " pod="openshift-monitoring/telemeter-client-775c5544d8-4gndf" Apr 16 20:30:38.490657 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.490635 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-hn9tf"] Apr 16 20:30:38.493620 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:30:38.493599 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81f9c12b_ce8f_4485_bd78_ddb944272d6b.slice/crio-923d6eb43c18e403141ab702b735f1c8c90e77ace057480bbfb8b4b1d6bc075a WatchSource:0}: Error finding container 923d6eb43c18e403141ab702b735f1c8c90e77ace057480bbfb8b4b1d6bc075a: Status 404 returned error can't find the container with id 923d6eb43c18e403141ab702b735f1c8c90e77ace057480bbfb8b4b1d6bc075a Apr 16 20:30:38.601051 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.601031 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-775c5544d8-4gndf" Apr 16 20:30:38.782576 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.782548 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-775c5544d8-4gndf"] Apr 16 20:30:38.854534 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.854488 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5bdd65d46d-bpprw" event={"ID":"501c40e3-b641-489e-b6a9-5314521662b0","Type":"ContainerStarted","Data":"614a748b9d32c2c08c2498de951945c8ec7df49107e5725fe228a0e830f9c917"} Apr 16 20:30:38.857892 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.857856 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6dfb82aa-444e-48aa-b144-b043d8b099f1","Type":"ContainerStarted","Data":"b9ec15369521b6c5f61859777ee2f943f762759dc5304f9079c3a28bd6a6a5f9"} Apr 16 20:30:38.857998 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.857897 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6dfb82aa-444e-48aa-b144-b043d8b099f1","Type":"ContainerStarted","Data":"d0d017712e5bf193dd467406283e577771a0193fcb706b3c8d021487c76223b6"} Apr 16 20:30:38.857998 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.857912 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6dfb82aa-444e-48aa-b144-b043d8b099f1","Type":"ContainerStarted","Data":"dadad5e0c0eb91e82a54b9dba9234b550ee66b79b7e4c4cdf50e8dc715440e26"} Apr 16 20:30:38.857998 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.857925 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6dfb82aa-444e-48aa-b144-b043d8b099f1","Type":"ContainerStarted","Data":"739ec335c47112785d4e230a7f15bb02f8dc1fffad06b1df2e44af3d0cbdaff0"} Apr 16 20:30:38.857998 
ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.857938 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6dfb82aa-444e-48aa-b144-b043d8b099f1","Type":"ContainerStarted","Data":"232382c5680114f86dddf017bc601b089324dc6fb6f1c7c063e4c31a8146d344"} Apr 16 20:30:38.859093 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:38.859057 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hn9tf" event={"ID":"81f9c12b-ce8f-4485-bd78-ddb944272d6b","Type":"ContainerStarted","Data":"923d6eb43c18e403141ab702b735f1c8c90e77ace057480bbfb8b4b1d6bc075a"} Apr 16 20:30:39.331956 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.331924 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 20:30:39.336743 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.336718 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.339605 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.339578 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-7qv47\"" Apr 16 20:30:39.339749 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.339602 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 20:30:39.339749 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.339603 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 20:30:39.339749 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.339710 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 20:30:39.339850 ip-10-0-139-150 kubenswrapper[2565]: I0416 
20:30:39.339799 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 20:30:39.339850 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.339840 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 20:30:39.339976 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.339931 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 20:30:39.340038 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.340026 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 20:30:39.340136 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.340101 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-eaai3e5b500du\"" Apr 16 20:30:39.340136 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.340121 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 20:30:39.340339 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.340211 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 20:30:39.340735 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.340645 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 20:30:39.340829 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.340737 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 20:30:39.342844 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.342676 2565 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 20:30:39.348620 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.348597 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 20:30:39.368228 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.368205 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-web-config\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.368334 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.368249 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.368334 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.368309 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.368334 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.368330 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: 
\"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.368480 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.368368 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.368480 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.368415 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.368480 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.368472 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.368687 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.368509 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.368687 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.368539 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.368687 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.368577 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.368687 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.368621 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.368687 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.368648 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dppg5\" (UniqueName: \"kubernetes.io/projected/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-kube-api-access-dppg5\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.368687 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.368665 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.368687 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.368685 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.368947 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.368719 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.368947 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.368739 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.368947 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.368768 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-config\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.368947 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.368792 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-config-out\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.369158 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:30:39.369130 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4a3994e_037a_4c38_bdcb_1fb57bfec1e8.slice/crio-f8e811bceb0ed6bc3f1d9f05d70302fe38633dcb1c5f9cc2379b00c4677dfc97 WatchSource:0}: Error finding container f8e811bceb0ed6bc3f1d9f05d70302fe38633dcb1c5f9cc2379b00c4677dfc97: Status 404 returned error can't find the container with id f8e811bceb0ed6bc3f1d9f05d70302fe38633dcb1c5f9cc2379b00c4677dfc97 Apr 16 20:30:39.470260 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.470233 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.470260 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.470266 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.470504 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.470412 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: 
\"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.470504 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.470475 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.470616 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.470534 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.470616 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.470564 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.470616 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.470598 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.470780 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.470750 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.470839 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.470816 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.470897 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.470851 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dppg5\" (UniqueName: \"kubernetes.io/projected/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-kube-api-access-dppg5\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.470897 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.470880 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.470987 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.470907 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.470987 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.470936 2565 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.470987 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.470965 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.471127 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.471009 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-config\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.471127 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.471032 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-config-out\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.471127 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.471082 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-web-config\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.471269 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.471127 2565 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.471358 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.471316 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.473340 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.471451 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.473340 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.472000 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.473340 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.471999 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.473340 ip-10-0-139-150 
kubenswrapper[2565]: I0416 20:30:39.472162 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.473750 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.473728 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.474360 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.474334 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.475017 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.474979 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.475563 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.475538 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-config\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.476069 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.476024 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.476342 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.476320 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-config-out\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.476474 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.476453 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.476711 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.476690 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.477020 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.476968 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-web-config\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.477113 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.477092 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.477630 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.477613 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.477917 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.477887 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.479895 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.479876 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dppg5\" (UniqueName: \"kubernetes.io/projected/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-kube-api-access-dppg5\") pod \"prometheus-k8s-0\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.650944 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.650764 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:30:39.863689 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:39.863651 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-775c5544d8-4gndf" event={"ID":"e4a3994e-037a-4c38-bdcb-1fb57bfec1e8","Type":"ContainerStarted","Data":"f8e811bceb0ed6bc3f1d9f05d70302fe38633dcb1c5f9cc2379b00c4677dfc97"} Apr 16 20:30:40.153861 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:40.152525 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 20:30:40.158718 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:30:40.158685 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9cbe6a2_2f41_4e82_b07a_84e1361bfca1.slice/crio-16b4f60cde4e68c9e68c2aef8ef1cb222b1c53fdaec466aa5a0cd580e0867eac WatchSource:0}: Error finding container 16b4f60cde4e68c9e68c2aef8ef1cb222b1c53fdaec466aa5a0cd580e0867eac: Status 404 returned error can't find the container with id 16b4f60cde4e68c9e68c2aef8ef1cb222b1c53fdaec466aa5a0cd580e0867eac Apr 16 20:30:40.867683 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:40.867657 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hn9tf" event={"ID":"81f9c12b-ce8f-4485-bd78-ddb944272d6b","Type":"ContainerStarted","Data":"bf78bd558b85de93041d546feaeb5e9545a5df3331972aeb820a2d33d385e634"} Apr 16 20:30:40.868184 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:40.868160 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hn9tf" Apr 16 20:30:40.869447 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:40.869422 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5bdd65d46d-bpprw" 
event={"ID":"501c40e3-b641-489e-b6a9-5314521662b0","Type":"ContainerStarted","Data":"4afea005067c1c5d15f3b4e421528441890bb9d3f34a25bf40ab568edcae1770"} Apr 16 20:30:40.870916 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:40.870896 2565 generic.go:358] "Generic (PLEG): container finished" podID="c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" containerID="fdcdce4015d452d749462968da772236fe5f6e35a33c98b11aa4e424d5035ccf" exitCode=0 Apr 16 20:30:40.871011 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:40.870960 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1","Type":"ContainerDied","Data":"fdcdce4015d452d749462968da772236fe5f6e35a33c98b11aa4e424d5035ccf"} Apr 16 20:30:40.871011 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:40.870991 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1","Type":"ContainerStarted","Data":"16b4f60cde4e68c9e68c2aef8ef1cb222b1c53fdaec466aa5a0cd580e0867eac"} Apr 16 20:30:40.874035 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:40.873956 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hn9tf" Apr 16 20:30:40.874193 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:40.874173 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6dfb82aa-444e-48aa-b144-b043d8b099f1","Type":"ContainerStarted","Data":"e42102d03ecf27d8630c0000d20f3f264c49f3b45eb47d9f47addfec1cc131a0"} Apr 16 20:30:40.876885 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:40.876866 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f76dcb489-d8kdg" event={"ID":"34913715-f62a-4682-9a2d-a76391be2411","Type":"ContainerStarted","Data":"a044060be5b0267c83239eb06cebf260a7d05611ac01ebfac3d6e5cfe8490c54"} Apr 16 
20:30:40.876978 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:40.876889 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f76dcb489-d8kdg" event={"ID":"34913715-f62a-4682-9a2d-a76391be2411","Type":"ContainerStarted","Data":"cac04c2f458c17ee6c11ccc3dd6212316acee6a4f05583d2764e8e4b483d54c2"} Apr 16 20:30:40.876978 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:40.876904 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f76dcb489-d8kdg" event={"ID":"34913715-f62a-4682-9a2d-a76391be2411","Type":"ContainerStarted","Data":"4db1a5ac38ec497e1187be8488d2cd1961782cb5b1024a79cfaae5e0c2841c54"} Apr 16 20:30:40.876978 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:40.876913 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f76dcb489-d8kdg" event={"ID":"34913715-f62a-4682-9a2d-a76391be2411","Type":"ContainerStarted","Data":"f5dd2305371b8936fcfe22a96fa96d9cfd2cd9dd7327a0a82bdf7feb15ceaa2a"} Apr 16 20:30:40.876978 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:40.876921 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f76dcb489-d8kdg" event={"ID":"34913715-f62a-4682-9a2d-a76391be2411","Type":"ContainerStarted","Data":"5e44372bea49ace8921693469f915c124baa61b3b4fcbf17f2511dd9aa065921"} Apr 16 20:30:40.883343 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:40.883298 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hn9tf" podStartSLOduration=1.717962264 podStartE2EDuration="3.883262213s" podCreationTimestamp="2026-04-16 20:30:37 +0000 UTC" firstStartedPulling="2026-04-16 20:30:38.495332033 +0000 UTC m=+181.763074463" lastFinishedPulling="2026-04-16 20:30:40.660631978 +0000 UTC m=+183.928374412" observedRunningTime="2026-04-16 20:30:40.882650928 +0000 UTC m=+184.150393386" watchObservedRunningTime="2026-04-16 
20:30:40.883262213 +0000 UTC m=+184.151004659"
Apr 16 20:30:40.927548 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:40.927501 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.35447657 podStartE2EDuration="6.92748678s" podCreationTimestamp="2026-04-16 20:30:34 +0000 UTC" firstStartedPulling="2026-04-16 20:30:35.48066541 +0000 UTC m=+178.748407841" lastFinishedPulling="2026-04-16 20:30:40.053675608 +0000 UTC m=+183.321418051" observedRunningTime="2026-04-16 20:30:40.925125231 +0000 UTC m=+184.192867711" watchObservedRunningTime="2026-04-16 20:30:40.92748678 +0000 UTC m=+184.195229232"
Apr 16 20:30:40.972522 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:40.972474 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-5bdd65d46d-bpprw" podStartSLOduration=1.9321207120000001 podStartE2EDuration="3.972457999s" podCreationTimestamp="2026-04-16 20:30:37 +0000 UTC" firstStartedPulling="2026-04-16 20:30:37.940036138 +0000 UTC m=+181.207778569" lastFinishedPulling="2026-04-16 20:30:39.98037342 +0000 UTC m=+183.248115856" observedRunningTime="2026-04-16 20:30:40.972208626 +0000 UTC m=+184.239951080" watchObservedRunningTime="2026-04-16 20:30:40.972457999 +0000 UTC m=+184.240200454"
Apr 16 20:30:41.882771 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:41.882741 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f76dcb489-d8kdg" event={"ID":"34913715-f62a-4682-9a2d-a76391be2411","Type":"ContainerStarted","Data":"6f10a407097f63e1e03ffdeae03f29ea6496755b84ee39859f55bf0858cbbbd3"}
Apr 16 20:30:41.883121 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:41.882904 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-7f76dcb489-d8kdg"
Apr 16 20:30:41.884127 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:41.884087 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-775c5544d8-4gndf" event={"ID":"e4a3994e-037a-4c38-bdcb-1fb57bfec1e8","Type":"ContainerStarted","Data":"1b77accab482495e51f83cfe483c037cc4c46e6d71113238e55a5dca5cbf5c6f"}
Apr 16 20:30:41.903134 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:41.903084 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7f76dcb489-d8kdg" podStartSLOduration=2.650755436 podStartE2EDuration="5.903070695s" podCreationTimestamp="2026-04-16 20:30:36 +0000 UTC" firstStartedPulling="2026-04-16 20:30:36.728042428 +0000 UTC m=+179.995784860" lastFinishedPulling="2026-04-16 20:30:39.980357685 +0000 UTC m=+183.248100119" observedRunningTime="2026-04-16 20:30:41.901651839 +0000 UTC m=+185.169394293" watchObservedRunningTime="2026-04-16 20:30:41.903070695 +0000 UTC m=+185.170813174"
Apr 16 20:30:42.890229 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:42.890144 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-775c5544d8-4gndf" event={"ID":"e4a3994e-037a-4c38-bdcb-1fb57bfec1e8","Type":"ContainerStarted","Data":"88194cafbba4de420e004c4d5ac761f49eec5accb3bfec0763c0172952d26db8"}
Apr 16 20:30:42.890229 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:42.890203 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-775c5544d8-4gndf" event={"ID":"e4a3994e-037a-4c38-bdcb-1fb57bfec1e8","Type":"ContainerStarted","Data":"76263d6720f12764be4501e5cff82f6b486330ea81703b864e828c064f9890c9"}
Apr 16 20:30:42.915035 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:42.914318 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-775c5544d8-4gndf" podStartSLOduration=2.5225570680000002 podStartE2EDuration="4.914300558s" podCreationTimestamp="2026-04-16 20:30:38 +0000 UTC" firstStartedPulling="2026-04-16 20:30:39.370972989 +0000 UTC m=+182.638715423" lastFinishedPulling="2026-04-16 20:30:41.762716481 +0000 UTC m=+185.030458913" observedRunningTime="2026-04-16 20:30:42.912686051 +0000 UTC m=+186.180428505" watchObservedRunningTime="2026-04-16 20:30:42.914300558 +0000 UTC m=+186.182043003"
Apr 16 20:30:43.323594 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:43.323521 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-96894b6c-psxd6" podUID="fb910f93-043f-467a-88bc-ff78901b3eb4" containerName="registry" containerID="cri-o://2224d9a0dbd96d97ca6f2f82631638186367512d5e987b90fe2d5f7b90b69410" gracePeriod=30
Apr 16 20:30:43.780851 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:43.780829 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-96894b6c-psxd6"
Apr 16 20:30:43.820119 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:43.820081 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fb910f93-043f-467a-88bc-ff78901b3eb4-image-registry-private-configuration\") pod \"fb910f93-043f-467a-88bc-ff78901b3eb4\" (UID: \"fb910f93-043f-467a-88bc-ff78901b3eb4\") "
Apr 16 20:30:43.820229 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:43.820170 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fb910f93-043f-467a-88bc-ff78901b3eb4-installation-pull-secrets\") pod \"fb910f93-043f-467a-88bc-ff78901b3eb4\" (UID: \"fb910f93-043f-467a-88bc-ff78901b3eb4\") "
Apr 16 20:30:43.820229 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:43.820206 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhnqk\" (UniqueName: \"kubernetes.io/projected/fb910f93-043f-467a-88bc-ff78901b3eb4-kube-api-access-fhnqk\") pod \"fb910f93-043f-467a-88bc-ff78901b3eb4\" (UID: \"fb910f93-043f-467a-88bc-ff78901b3eb4\") "
Apr 16 20:30:43.820349 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:43.820238 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fb910f93-043f-467a-88bc-ff78901b3eb4-ca-trust-extracted\") pod \"fb910f93-043f-467a-88bc-ff78901b3eb4\" (UID: \"fb910f93-043f-467a-88bc-ff78901b3eb4\") "
Apr 16 20:30:43.820349 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:43.820296 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fb910f93-043f-467a-88bc-ff78901b3eb4-trusted-ca\") pod \"fb910f93-043f-467a-88bc-ff78901b3eb4\" (UID: \"fb910f93-043f-467a-88bc-ff78901b3eb4\") "
Apr 16 20:30:43.820469 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:43.820358 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fb910f93-043f-467a-88bc-ff78901b3eb4-registry-tls\") pod \"fb910f93-043f-467a-88bc-ff78901b3eb4\" (UID: \"fb910f93-043f-467a-88bc-ff78901b3eb4\") "
Apr 16 20:30:43.820469 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:43.820397 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fb910f93-043f-467a-88bc-ff78901b3eb4-registry-certificates\") pod \"fb910f93-043f-467a-88bc-ff78901b3eb4\" (UID: \"fb910f93-043f-467a-88bc-ff78901b3eb4\") "
Apr 16 20:30:43.820469 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:43.820453 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fb910f93-043f-467a-88bc-ff78901b3eb4-bound-sa-token\") pod \"fb910f93-043f-467a-88bc-ff78901b3eb4\" (UID: \"fb910f93-043f-467a-88bc-ff78901b3eb4\") "
Apr 16 20:30:43.821507 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:43.821177 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb910f93-043f-467a-88bc-ff78901b3eb4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "fb910f93-043f-467a-88bc-ff78901b3eb4" (UID: "fb910f93-043f-467a-88bc-ff78901b3eb4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:30:43.822393 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:43.822307 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb910f93-043f-467a-88bc-ff78901b3eb4-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "fb910f93-043f-467a-88bc-ff78901b3eb4" (UID: "fb910f93-043f-467a-88bc-ff78901b3eb4"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:30:43.823000 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:43.822972 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb910f93-043f-467a-88bc-ff78901b3eb4-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "fb910f93-043f-467a-88bc-ff78901b3eb4" (UID: "fb910f93-043f-467a-88bc-ff78901b3eb4"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:30:43.823098 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:43.823072 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb910f93-043f-467a-88bc-ff78901b3eb4-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "fb910f93-043f-467a-88bc-ff78901b3eb4" (UID: "fb910f93-043f-467a-88bc-ff78901b3eb4"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:30:43.823414 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:43.823388 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb910f93-043f-467a-88bc-ff78901b3eb4-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "fb910f93-043f-467a-88bc-ff78901b3eb4" (UID: "fb910f93-043f-467a-88bc-ff78901b3eb4"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:30:43.824689 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:43.824661 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb910f93-043f-467a-88bc-ff78901b3eb4-kube-api-access-fhnqk" (OuterVolumeSpecName: "kube-api-access-fhnqk") pod "fb910f93-043f-467a-88bc-ff78901b3eb4" (UID: "fb910f93-043f-467a-88bc-ff78901b3eb4"). InnerVolumeSpecName "kube-api-access-fhnqk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:30:43.825199 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:43.825164 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb910f93-043f-467a-88bc-ff78901b3eb4-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "fb910f93-043f-467a-88bc-ff78901b3eb4" (UID: "fb910f93-043f-467a-88bc-ff78901b3eb4"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:30:43.835563 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:43.835539 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb910f93-043f-467a-88bc-ff78901b3eb4-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "fb910f93-043f-467a-88bc-ff78901b3eb4" (UID: "fb910f93-043f-467a-88bc-ff78901b3eb4"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:30:43.898543 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:43.898472 2565 generic.go:358] "Generic (PLEG): container finished" podID="fb910f93-043f-467a-88bc-ff78901b3eb4" containerID="2224d9a0dbd96d97ca6f2f82631638186367512d5e987b90fe2d5f7b90b69410" exitCode=0
Apr 16 20:30:43.898944 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:43.898562 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-96894b6c-psxd6" event={"ID":"fb910f93-043f-467a-88bc-ff78901b3eb4","Type":"ContainerDied","Data":"2224d9a0dbd96d97ca6f2f82631638186367512d5e987b90fe2d5f7b90b69410"}
Apr 16 20:30:43.898944 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:43.898590 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-96894b6c-psxd6" event={"ID":"fb910f93-043f-467a-88bc-ff78901b3eb4","Type":"ContainerDied","Data":"f0668f8f1be1c40aeb3e94b0ac6568e5c9ae281b200259220d53bba78e3615e7"}
Apr 16 20:30:43.898944 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:43.898631 2565 scope.go:117] "RemoveContainer" containerID="2224d9a0dbd96d97ca6f2f82631638186367512d5e987b90fe2d5f7b90b69410"
Apr 16 20:30:43.898944 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:43.898769 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-96894b6c-psxd6"
Apr 16 20:30:43.905962 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:43.905816 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1","Type":"ContainerStarted","Data":"bb1f627b5a7f64b98b09aed90e98a78a955c3a4b978bb69a57a8b089888dc774"}
Apr 16 20:30:43.905962 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:43.905852 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1","Type":"ContainerStarted","Data":"da1f9c211d6c48096cc51096d2acff6a2973b99c0a72b2070251695d95224071"}
Apr 16 20:30:43.905962 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:43.905865 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1","Type":"ContainerStarted","Data":"2cf5bae9b94d5bab4456c4313f23f13f3683bb800074cf32ea30e5859efb67bd"}
Apr 16 20:30:43.917644 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:43.917407 2565 scope.go:117] "RemoveContainer" containerID="2224d9a0dbd96d97ca6f2f82631638186367512d5e987b90fe2d5f7b90b69410"
Apr 16 20:30:43.920310 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:30:43.918979 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2224d9a0dbd96d97ca6f2f82631638186367512d5e987b90fe2d5f7b90b69410\": container with ID starting with 2224d9a0dbd96d97ca6f2f82631638186367512d5e987b90fe2d5f7b90b69410 not found: ID does not exist" containerID="2224d9a0dbd96d97ca6f2f82631638186367512d5e987b90fe2d5f7b90b69410"
Apr 16 20:30:43.920310 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:43.919019 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2224d9a0dbd96d97ca6f2f82631638186367512d5e987b90fe2d5f7b90b69410"} err="failed to get container status \"2224d9a0dbd96d97ca6f2f82631638186367512d5e987b90fe2d5f7b90b69410\": rpc error: code = NotFound desc = could not find container \"2224d9a0dbd96d97ca6f2f82631638186367512d5e987b90fe2d5f7b90b69410\": container with ID starting with 2224d9a0dbd96d97ca6f2f82631638186367512d5e987b90fe2d5f7b90b69410 not found: ID does not exist"
Apr 16 20:30:43.922956 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:43.921391 2565 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fb910f93-043f-467a-88bc-ff78901b3eb4-installation-pull-secrets\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\""
Apr 16 20:30:43.922956 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:43.921414 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fhnqk\" (UniqueName: \"kubernetes.io/projected/fb910f93-043f-467a-88bc-ff78901b3eb4-kube-api-access-fhnqk\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\""
Apr 16 20:30:43.922956 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:43.921432 2565 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fb910f93-043f-467a-88bc-ff78901b3eb4-ca-trust-extracted\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\""
Apr 16 20:30:43.922956 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:43.921446 2565 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fb910f93-043f-467a-88bc-ff78901b3eb4-trusted-ca\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\""
Apr 16 20:30:43.922956 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:43.921460 2565 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fb910f93-043f-467a-88bc-ff78901b3eb4-registry-tls\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\""
Apr 16 20:30:43.922956 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:43.921474 2565 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fb910f93-043f-467a-88bc-ff78901b3eb4-registry-certificates\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\""
Apr 16 20:30:43.922956 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:43.921541 2565 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fb910f93-043f-467a-88bc-ff78901b3eb4-bound-sa-token\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\""
Apr 16 20:30:43.922956 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:43.921557 2565 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fb910f93-043f-467a-88bc-ff78901b3eb4-image-registry-private-configuration\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\""
Apr 16 20:30:43.934203 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:43.931014 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-96894b6c-psxd6"]
Apr 16 20:30:43.937252 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:43.937224 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-96894b6c-psxd6"]
Apr 16 20:30:44.911395 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:44.911359 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1","Type":"ContainerStarted","Data":"0a14ce85c742f235487cc76398562f67db05b986bcbccd3b4b0bef6f839d9744"}
Apr 16 20:30:44.911395 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:44.911394 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1","Type":"ContainerStarted","Data":"8f3e71fd8ca5b2353c8fe462479d0a9649fbbccbd14e1e77bd639fc0ca03d2b7"}
Apr 16 20:30:44.911856 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:44.911403 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1","Type":"ContainerStarted","Data":"2613c92047f3ad072943a21336e74359ab9a3fc791ecfceb45b786dac36fbcc2"}
Apr 16 20:30:44.938739 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:44.938694 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.115149938 podStartE2EDuration="5.938679373s" podCreationTimestamp="2026-04-16 20:30:39 +0000 UTC" firstStartedPulling="2026-04-16 20:30:40.872352606 +0000 UTC m=+184.140095042" lastFinishedPulling="2026-04-16 20:30:43.695882036 +0000 UTC m=+186.963624477" observedRunningTime="2026-04-16 20:30:44.935959194 +0000 UTC m=+188.203701648" watchObservedRunningTime="2026-04-16 20:30:44.938679373 +0000 UTC m=+188.206421826"
Apr 16 20:30:45.312507 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:45.312430 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb910f93-043f-467a-88bc-ff78901b3eb4" path="/var/lib/kubelet/pods/fb910f93-043f-467a-88bc-ff78901b3eb4/volumes"
Apr 16 20:30:47.897402 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:47.897373 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7f76dcb489-d8kdg"
Apr 16 20:30:49.651459 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:49.651418 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:30:57.784197 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:57.784164 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-5bdd65d46d-bpprw"
Apr 16 20:30:57.784569 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:30:57.784213 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-5bdd65d46d-bpprw"
Apr 16 20:31:04.974932 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:04.974897 2565 generic.go:358] "Generic (PLEG): container finished" podID="60ac5c46-c967-4000-bd8b-4f0c90324ecb" containerID="e20b52ae41c0d3cf835a6b7d875fd8f6bc190244ff13e879d44d708696eef505" exitCode=0
Apr 16 20:31:04.975360 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:04.974947 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-l9zgr" event={"ID":"60ac5c46-c967-4000-bd8b-4f0c90324ecb","Type":"ContainerDied","Data":"e20b52ae41c0d3cf835a6b7d875fd8f6bc190244ff13e879d44d708696eef505"}
Apr 16 20:31:04.975360 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:04.975322 2565 scope.go:117] "RemoveContainer" containerID="e20b52ae41c0d3cf835a6b7d875fd8f6bc190244ff13e879d44d708696eef505"
Apr 16 20:31:05.534145 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:05.534114 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-bcshd_394dcb43-4d46-4c81-bcad-73d0aadfc01c/serve-healthcheck-canary/0.log"
Apr 16 20:31:05.979055 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:05.979025 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-l9zgr" event={"ID":"60ac5c46-c967-4000-bd8b-4f0c90324ecb","Type":"ContainerStarted","Data":"f50385f759d562c1a731a379688829c5e2e7b42ea6863776b07cc543dc9baf9c"}
Apr 16 20:31:17.797704 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:17.797666 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-5bdd65d46d-bpprw"
Apr 16 20:31:17.805533 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:17.805508 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-5bdd65d46d-bpprw"
Apr 16 20:31:39.652009 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:39.651977 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:31:39.672065 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:39.672039 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:31:40.103404 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:40.103381 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:31:47.961474 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:47.961440 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90b993f2-207d-4894-bbdf-e2219dbf690b-metrics-certs\") pod \"network-metrics-daemon-jzdqd\" (UID: \"90b993f2-207d-4894-bbdf-e2219dbf690b\") " pod="openshift-multus/network-metrics-daemon-jzdqd"
Apr 16 20:31:47.963761 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:47.963727 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90b993f2-207d-4894-bbdf-e2219dbf690b-metrics-certs\") pod \"network-metrics-daemon-jzdqd\" (UID: \"90b993f2-207d-4894-bbdf-e2219dbf690b\") " pod="openshift-multus/network-metrics-daemon-jzdqd"
Apr 16 20:31:48.211259 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:48.211229 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-v8bd2\""
Apr 16 20:31:48.219358 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:48.219271 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jzdqd"
Apr 16 20:31:48.337435 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:48.337410 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jzdqd"]
Apr 16 20:31:48.340349 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:31:48.340320 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90b993f2_207d_4894_bbdf_e2219dbf690b.slice/crio-47dada99c5d98e1a5a1039d4674ed09efeb057a4d7667ce02f123305e432f8ea WatchSource:0}: Error finding container 47dada99c5d98e1a5a1039d4674ed09efeb057a4d7667ce02f123305e432f8ea: Status 404 returned error can't find the container with id 47dada99c5d98e1a5a1039d4674ed09efeb057a4d7667ce02f123305e432f8ea
Apr 16 20:31:49.118942 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:49.118903 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jzdqd" event={"ID":"90b993f2-207d-4894-bbdf-e2219dbf690b","Type":"ContainerStarted","Data":"47dada99c5d98e1a5a1039d4674ed09efeb057a4d7667ce02f123305e432f8ea"}
Apr 16 20:31:50.123562 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:50.123528 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jzdqd" event={"ID":"90b993f2-207d-4894-bbdf-e2219dbf690b","Type":"ContainerStarted","Data":"43d66ec90198f7a01cab2b96a9861497151ae662b6139d502c0c7846b456ad68"}
Apr 16 20:31:50.123903 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:50.123570 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jzdqd" event={"ID":"90b993f2-207d-4894-bbdf-e2219dbf690b","Type":"ContainerStarted","Data":"4ed83bb8f5894e72107ea8540379ed7f47a8acefb46024afb71b90c7cc88b768"}
Apr 16 20:31:50.138950 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:50.138901 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-jzdqd" podStartSLOduration=252.136192915 podStartE2EDuration="4m13.138887456s" podCreationTimestamp="2026-04-16 20:27:37 +0000 UTC" firstStartedPulling="2026-04-16 20:31:48.342641227 +0000 UTC m=+251.610383658" lastFinishedPulling="2026-04-16 20:31:49.345335749 +0000 UTC m=+252.613078199" observedRunningTime="2026-04-16 20:31:50.137237256 +0000 UTC m=+253.404979708" watchObservedRunningTime="2026-04-16 20:31:50.138887456 +0000 UTC m=+253.406629908"
Apr 16 20:31:53.423909 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:53.423876 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 20:31:53.424375 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:53.424349 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="6dfb82aa-444e-48aa-b144-b043d8b099f1" containerName="alertmanager" containerID="cri-o://232382c5680114f86dddf017bc601b089324dc6fb6f1c7c063e4c31a8146d344" gracePeriod=120
Apr 16 20:31:53.424455 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:53.424413 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="6dfb82aa-444e-48aa-b144-b043d8b099f1" containerName="kube-rbac-proxy-metric" containerID="cri-o://b9ec15369521b6c5f61859777ee2f943f762759dc5304f9079c3a28bd6a6a5f9" gracePeriod=120
Apr 16 20:31:53.424509 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:53.424454 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="6dfb82aa-444e-48aa-b144-b043d8b099f1" containerName="config-reloader" containerID="cri-o://739ec335c47112785d4e230a7f15bb02f8dc1fffad06b1df2e44af3d0cbdaff0" gracePeriod=120
Apr 16 20:31:53.424509 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:53.424433 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="6dfb82aa-444e-48aa-b144-b043d8b099f1" containerName="kube-rbac-proxy-web" containerID="cri-o://dadad5e0c0eb91e82a54b9dba9234b550ee66b79b7e4c4cdf50e8dc715440e26" gracePeriod=120
Apr 16 20:31:53.424599 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:53.424543 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="6dfb82aa-444e-48aa-b144-b043d8b099f1" containerName="prom-label-proxy" containerID="cri-o://e42102d03ecf27d8630c0000d20f3f264c49f3b45eb47d9f47addfec1cc131a0" gracePeriod=120
Apr 16 20:31:53.424697 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:53.424459 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="6dfb82aa-444e-48aa-b144-b043d8b099f1" containerName="kube-rbac-proxy" containerID="cri-o://d0d017712e5bf193dd467406283e577771a0193fcb706b3c8d021487c76223b6" gracePeriod=120
Apr 16 20:31:54.139934 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.139894 2565 generic.go:358] "Generic (PLEG): container finished" podID="6dfb82aa-444e-48aa-b144-b043d8b099f1" containerID="e42102d03ecf27d8630c0000d20f3f264c49f3b45eb47d9f47addfec1cc131a0" exitCode=0
Apr 16 20:31:54.139934 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.139919 2565 generic.go:358] "Generic (PLEG): container finished" podID="6dfb82aa-444e-48aa-b144-b043d8b099f1" containerID="d0d017712e5bf193dd467406283e577771a0193fcb706b3c8d021487c76223b6" exitCode=0
Apr 16 20:31:54.139934 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.139926 2565 generic.go:358] "Generic (PLEG): container finished" podID="6dfb82aa-444e-48aa-b144-b043d8b099f1" containerID="739ec335c47112785d4e230a7f15bb02f8dc1fffad06b1df2e44af3d0cbdaff0" exitCode=0
Apr 16 20:31:54.139934 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.139931 2565 generic.go:358] "Generic (PLEG): container finished" podID="6dfb82aa-444e-48aa-b144-b043d8b099f1" containerID="232382c5680114f86dddf017bc601b089324dc6fb6f1c7c063e4c31a8146d344" exitCode=0
Apr 16 20:31:54.140203 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.139949 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6dfb82aa-444e-48aa-b144-b043d8b099f1","Type":"ContainerDied","Data":"e42102d03ecf27d8630c0000d20f3f264c49f3b45eb47d9f47addfec1cc131a0"}
Apr 16 20:31:54.140203 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.139972 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6dfb82aa-444e-48aa-b144-b043d8b099f1","Type":"ContainerDied","Data":"d0d017712e5bf193dd467406283e577771a0193fcb706b3c8d021487c76223b6"}
Apr 16 20:31:54.140203 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.139993 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6dfb82aa-444e-48aa-b144-b043d8b099f1","Type":"ContainerDied","Data":"739ec335c47112785d4e230a7f15bb02f8dc1fffad06b1df2e44af3d0cbdaff0"}
Apr 16 20:31:54.140203 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.140002 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6dfb82aa-444e-48aa-b144-b043d8b099f1","Type":"ContainerDied","Data":"232382c5680114f86dddf017bc601b089324dc6fb6f1c7c063e4c31a8146d344"}
Apr 16 20:31:54.665648 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.665625 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:54.822442 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.822357 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6dfb82aa-444e-48aa-b144-b043d8b099f1-alertmanager-main-db\") pod \"6dfb82aa-444e-48aa-b144-b043d8b099f1\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") "
Apr 16 20:31:54.822442 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.822396 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-secret-alertmanager-kube-rbac-proxy-web\") pod \"6dfb82aa-444e-48aa-b144-b043d8b099f1\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") "
Apr 16 20:31:54.822442 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.822414 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6dfb82aa-444e-48aa-b144-b043d8b099f1-metrics-client-ca\") pod \"6dfb82aa-444e-48aa-b144-b043d8b099f1\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") "
Apr 16 20:31:54.822695 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.822444 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-secret-alertmanager-kube-rbac-proxy\") pod \"6dfb82aa-444e-48aa-b144-b043d8b099f1\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") "
Apr 16 20:31:54.822695 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.822562 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6dfb82aa-444e-48aa-b144-b043d8b099f1-alertmanager-trusted-ca-bundle\") pod \"6dfb82aa-444e-48aa-b144-b043d8b099f1\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") "
Apr 16 20:31:54.822695 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.822602 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-secret-alertmanager-main-tls\") pod \"6dfb82aa-444e-48aa-b144-b043d8b099f1\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") "
Apr 16 20:31:54.822695 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.822639 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-config-volume\") pod \"6dfb82aa-444e-48aa-b144-b043d8b099f1\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") "
Apr 16 20:31:54.822695 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.822678 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6dfb82aa-444e-48aa-b144-b043d8b099f1-tls-assets\") pod \"6dfb82aa-444e-48aa-b144-b043d8b099f1\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") "
Apr 16 20:31:54.822944 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.822735 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dfb82aa-444e-48aa-b144-b043d8b099f1-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "6dfb82aa-444e-48aa-b144-b043d8b099f1" (UID: "6dfb82aa-444e-48aa-b144-b043d8b099f1"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:31:54.822944 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.822742 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-cluster-tls-config\") pod \"6dfb82aa-444e-48aa-b144-b043d8b099f1\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") "
Apr 16 20:31:54.822944 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.822768 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"6dfb82aa-444e-48aa-b144-b043d8b099f1\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") "
Apr 16 20:31:54.822944 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.822798 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6dfb82aa-444e-48aa-b144-b043d8b099f1-config-out\") pod \"6dfb82aa-444e-48aa-b144-b043d8b099f1\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") "
Apr 16 20:31:54.822944 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.822822 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mj868\" (UniqueName: \"kubernetes.io/projected/6dfb82aa-444e-48aa-b144-b043d8b099f1-kube-api-access-mj868\") pod \"6dfb82aa-444e-48aa-b144-b043d8b099f1\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") "
Apr 16 20:31:54.822944 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.822868 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-web-config\") pod \"6dfb82aa-444e-48aa-b144-b043d8b099f1\" (UID: \"6dfb82aa-444e-48aa-b144-b043d8b099f1\") "
Apr 16 20:31:54.822944 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.822877 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dfb82aa-444e-48aa-b144-b043d8b099f1-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "6dfb82aa-444e-48aa-b144-b043d8b099f1" (UID: "6dfb82aa-444e-48aa-b144-b043d8b099f1"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:31:54.823325 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.822990 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dfb82aa-444e-48aa-b144-b043d8b099f1-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "6dfb82aa-444e-48aa-b144-b043d8b099f1" (UID: "6dfb82aa-444e-48aa-b144-b043d8b099f1"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:31:54.823325 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.823154 2565 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6dfb82aa-444e-48aa-b144-b043d8b099f1-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\""
Apr 16 20:31:54.823325 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.823173 2565 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6dfb82aa-444e-48aa-b144-b043d8b099f1-alertmanager-main-db\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\""
Apr 16 20:31:54.823325 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.823187 2565 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6dfb82aa-444e-48aa-b144-b043d8b099f1-metrics-client-ca\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\""
Apr 16 20:31:54.825452 ip-10-0-139-150
kubenswrapper[2565]: I0416 20:31:54.825301 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "6dfb82aa-444e-48aa-b144-b043d8b099f1" (UID: "6dfb82aa-444e-48aa-b144-b043d8b099f1"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:31:54.825452 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.825335 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "6dfb82aa-444e-48aa-b144-b043d8b099f1" (UID: "6dfb82aa-444e-48aa-b144-b043d8b099f1"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:31:54.825641 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.825489 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "6dfb82aa-444e-48aa-b144-b043d8b099f1" (UID: "6dfb82aa-444e-48aa-b144-b043d8b099f1"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:31:54.825641 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.825504 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "6dfb82aa-444e-48aa-b144-b043d8b099f1" (UID: "6dfb82aa-444e-48aa-b144-b043d8b099f1"). InnerVolumeSpecName "secret-alertmanager-main-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:31:54.825641 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.825553 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-config-volume" (OuterVolumeSpecName: "config-volume") pod "6dfb82aa-444e-48aa-b144-b043d8b099f1" (UID: "6dfb82aa-444e-48aa-b144-b043d8b099f1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:31:54.826046 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.826014 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dfb82aa-444e-48aa-b144-b043d8b099f1-kube-api-access-mj868" (OuterVolumeSpecName: "kube-api-access-mj868") pod "6dfb82aa-444e-48aa-b144-b043d8b099f1" (UID: "6dfb82aa-444e-48aa-b144-b043d8b099f1"). InnerVolumeSpecName "kube-api-access-mj868". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:31:54.826546 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.826515 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dfb82aa-444e-48aa-b144-b043d8b099f1-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "6dfb82aa-444e-48aa-b144-b043d8b099f1" (UID: "6dfb82aa-444e-48aa-b144-b043d8b099f1"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:31:54.827229 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.827206 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dfb82aa-444e-48aa-b144-b043d8b099f1-config-out" (OuterVolumeSpecName: "config-out") pod "6dfb82aa-444e-48aa-b144-b043d8b099f1" (UID: "6dfb82aa-444e-48aa-b144-b043d8b099f1"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:31:54.830668 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.830647 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "6dfb82aa-444e-48aa-b144-b043d8b099f1" (UID: "6dfb82aa-444e-48aa-b144-b043d8b099f1"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:31:54.837603 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.837579 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-web-config" (OuterVolumeSpecName: "web-config") pod "6dfb82aa-444e-48aa-b144-b043d8b099f1" (UID: "6dfb82aa-444e-48aa-b144-b043d8b099f1"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:31:54.924185 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.924155 2565 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-web-config\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\"" Apr 16 20:31:54.924185 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.924180 2565 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\"" Apr 16 20:31:54.924349 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.924191 2565 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\"" Apr 16 20:31:54.924349 
ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.924200 2565 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-secret-alertmanager-main-tls\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\"" Apr 16 20:31:54.924349 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.924209 2565 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-config-volume\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\"" Apr 16 20:31:54.924349 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.924218 2565 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6dfb82aa-444e-48aa-b144-b043d8b099f1-tls-assets\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\"" Apr 16 20:31:54.924349 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.924227 2565 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-cluster-tls-config\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\"" Apr 16 20:31:54.924349 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.924236 2565 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6dfb82aa-444e-48aa-b144-b043d8b099f1-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\"" Apr 16 20:31:54.924349 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.924245 2565 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6dfb82aa-444e-48aa-b144-b043d8b099f1-config-out\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\"" Apr 16 20:31:54.924349 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:54.924254 2565 
reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mj868\" (UniqueName: \"kubernetes.io/projected/6dfb82aa-444e-48aa-b144-b043d8b099f1-kube-api-access-mj868\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\"" Apr 16 20:31:55.146198 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.146117 2565 generic.go:358] "Generic (PLEG): container finished" podID="6dfb82aa-444e-48aa-b144-b043d8b099f1" containerID="b9ec15369521b6c5f61859777ee2f943f762759dc5304f9079c3a28bd6a6a5f9" exitCode=0 Apr 16 20:31:55.146198 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.146141 2565 generic.go:358] "Generic (PLEG): container finished" podID="6dfb82aa-444e-48aa-b144-b043d8b099f1" containerID="dadad5e0c0eb91e82a54b9dba9234b550ee66b79b7e4c4cdf50e8dc715440e26" exitCode=0 Apr 16 20:31:55.146397 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.146201 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6dfb82aa-444e-48aa-b144-b043d8b099f1","Type":"ContainerDied","Data":"b9ec15369521b6c5f61859777ee2f943f762759dc5304f9079c3a28bd6a6a5f9"} Apr 16 20:31:55.146397 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.146236 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:31:55.146397 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.146252 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6dfb82aa-444e-48aa-b144-b043d8b099f1","Type":"ContainerDied","Data":"dadad5e0c0eb91e82a54b9dba9234b550ee66b79b7e4c4cdf50e8dc715440e26"} Apr 16 20:31:55.146397 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.146265 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6dfb82aa-444e-48aa-b144-b043d8b099f1","Type":"ContainerDied","Data":"15bf682d36511ec1b0ba660e6d840158602624145077a889159625d9dec34305"} Apr 16 20:31:55.146397 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.146296 2565 scope.go:117] "RemoveContainer" containerID="e42102d03ecf27d8630c0000d20f3f264c49f3b45eb47d9f47addfec1cc131a0" Apr 16 20:31:55.154304 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.154270 2565 scope.go:117] "RemoveContainer" containerID="b9ec15369521b6c5f61859777ee2f943f762759dc5304f9079c3a28bd6a6a5f9" Apr 16 20:31:55.160694 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.160679 2565 scope.go:117] "RemoveContainer" containerID="d0d017712e5bf193dd467406283e577771a0193fcb706b3c8d021487c76223b6" Apr 16 20:31:55.166631 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.166615 2565 scope.go:117] "RemoveContainer" containerID="dadad5e0c0eb91e82a54b9dba9234b550ee66b79b7e4c4cdf50e8dc715440e26" Apr 16 20:31:55.170871 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.170845 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 20:31:55.173636 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.173615 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 20:31:55.174362 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.174338 2565 
scope.go:117] "RemoveContainer" containerID="739ec335c47112785d4e230a7f15bb02f8dc1fffad06b1df2e44af3d0cbdaff0" Apr 16 20:31:55.182514 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.181484 2565 scope.go:117] "RemoveContainer" containerID="232382c5680114f86dddf017bc601b089324dc6fb6f1c7c063e4c31a8146d344" Apr 16 20:31:55.189247 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.189230 2565 scope.go:117] "RemoveContainer" containerID="5488117389be3df45eb854726907ce7cbbcad8d24e29c8f9d7a72498ed36e24d" Apr 16 20:31:55.195820 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.195797 2565 scope.go:117] "RemoveContainer" containerID="e42102d03ecf27d8630c0000d20f3f264c49f3b45eb47d9f47addfec1cc131a0" Apr 16 20:31:55.196216 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:31:55.196155 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e42102d03ecf27d8630c0000d20f3f264c49f3b45eb47d9f47addfec1cc131a0\": container with ID starting with e42102d03ecf27d8630c0000d20f3f264c49f3b45eb47d9f47addfec1cc131a0 not found: ID does not exist" containerID="e42102d03ecf27d8630c0000d20f3f264c49f3b45eb47d9f47addfec1cc131a0" Apr 16 20:31:55.196390 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.196193 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e42102d03ecf27d8630c0000d20f3f264c49f3b45eb47d9f47addfec1cc131a0"} err="failed to get container status \"e42102d03ecf27d8630c0000d20f3f264c49f3b45eb47d9f47addfec1cc131a0\": rpc error: code = NotFound desc = could not find container \"e42102d03ecf27d8630c0000d20f3f264c49f3b45eb47d9f47addfec1cc131a0\": container with ID starting with e42102d03ecf27d8630c0000d20f3f264c49f3b45eb47d9f47addfec1cc131a0 not found: ID does not exist" Apr 16 20:31:55.196390 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.196252 2565 scope.go:117] "RemoveContainer" containerID="b9ec15369521b6c5f61859777ee2f943f762759dc5304f9079c3a28bd6a6a5f9" 
Apr 16 20:31:55.196626 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:31:55.196602 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9ec15369521b6c5f61859777ee2f943f762759dc5304f9079c3a28bd6a6a5f9\": container with ID starting with b9ec15369521b6c5f61859777ee2f943f762759dc5304f9079c3a28bd6a6a5f9 not found: ID does not exist" containerID="b9ec15369521b6c5f61859777ee2f943f762759dc5304f9079c3a28bd6a6a5f9" Apr 16 20:31:55.196717 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.196634 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9ec15369521b6c5f61859777ee2f943f762759dc5304f9079c3a28bd6a6a5f9"} err="failed to get container status \"b9ec15369521b6c5f61859777ee2f943f762759dc5304f9079c3a28bd6a6a5f9\": rpc error: code = NotFound desc = could not find container \"b9ec15369521b6c5f61859777ee2f943f762759dc5304f9079c3a28bd6a6a5f9\": container with ID starting with b9ec15369521b6c5f61859777ee2f943f762759dc5304f9079c3a28bd6a6a5f9 not found: ID does not exist" Apr 16 20:31:55.196717 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.196656 2565 scope.go:117] "RemoveContainer" containerID="d0d017712e5bf193dd467406283e577771a0193fcb706b3c8d021487c76223b6" Apr 16 20:31:55.196995 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:31:55.196974 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0d017712e5bf193dd467406283e577771a0193fcb706b3c8d021487c76223b6\": container with ID starting with d0d017712e5bf193dd467406283e577771a0193fcb706b3c8d021487c76223b6 not found: ID does not exist" containerID="d0d017712e5bf193dd467406283e577771a0193fcb706b3c8d021487c76223b6" Apr 16 20:31:55.197071 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.197003 2565 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d0d017712e5bf193dd467406283e577771a0193fcb706b3c8d021487c76223b6"} err="failed to get container status \"d0d017712e5bf193dd467406283e577771a0193fcb706b3c8d021487c76223b6\": rpc error: code = NotFound desc = could not find container \"d0d017712e5bf193dd467406283e577771a0193fcb706b3c8d021487c76223b6\": container with ID starting with d0d017712e5bf193dd467406283e577771a0193fcb706b3c8d021487c76223b6 not found: ID does not exist" Apr 16 20:31:55.197071 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.197020 2565 scope.go:117] "RemoveContainer" containerID="dadad5e0c0eb91e82a54b9dba9234b550ee66b79b7e4c4cdf50e8dc715440e26" Apr 16 20:31:55.197322 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:31:55.197303 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dadad5e0c0eb91e82a54b9dba9234b550ee66b79b7e4c4cdf50e8dc715440e26\": container with ID starting with dadad5e0c0eb91e82a54b9dba9234b550ee66b79b7e4c4cdf50e8dc715440e26 not found: ID does not exist" containerID="dadad5e0c0eb91e82a54b9dba9234b550ee66b79b7e4c4cdf50e8dc715440e26" Apr 16 20:31:55.197436 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.197328 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dadad5e0c0eb91e82a54b9dba9234b550ee66b79b7e4c4cdf50e8dc715440e26"} err="failed to get container status \"dadad5e0c0eb91e82a54b9dba9234b550ee66b79b7e4c4cdf50e8dc715440e26\": rpc error: code = NotFound desc = could not find container \"dadad5e0c0eb91e82a54b9dba9234b550ee66b79b7e4c4cdf50e8dc715440e26\": container with ID starting with dadad5e0c0eb91e82a54b9dba9234b550ee66b79b7e4c4cdf50e8dc715440e26 not found: ID does not exist" Apr 16 20:31:55.197436 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.197347 2565 scope.go:117] "RemoveContainer" containerID="739ec335c47112785d4e230a7f15bb02f8dc1fffad06b1df2e44af3d0cbdaff0" Apr 16 20:31:55.197642 ip-10-0-139-150 
kubenswrapper[2565]: E0416 20:31:55.197626 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"739ec335c47112785d4e230a7f15bb02f8dc1fffad06b1df2e44af3d0cbdaff0\": container with ID starting with 739ec335c47112785d4e230a7f15bb02f8dc1fffad06b1df2e44af3d0cbdaff0 not found: ID does not exist" containerID="739ec335c47112785d4e230a7f15bb02f8dc1fffad06b1df2e44af3d0cbdaff0" Apr 16 20:31:55.197694 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.197648 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"739ec335c47112785d4e230a7f15bb02f8dc1fffad06b1df2e44af3d0cbdaff0"} err="failed to get container status \"739ec335c47112785d4e230a7f15bb02f8dc1fffad06b1df2e44af3d0cbdaff0\": rpc error: code = NotFound desc = could not find container \"739ec335c47112785d4e230a7f15bb02f8dc1fffad06b1df2e44af3d0cbdaff0\": container with ID starting with 739ec335c47112785d4e230a7f15bb02f8dc1fffad06b1df2e44af3d0cbdaff0 not found: ID does not exist" Apr 16 20:31:55.197694 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.197664 2565 scope.go:117] "RemoveContainer" containerID="232382c5680114f86dddf017bc601b089324dc6fb6f1c7c063e4c31a8146d344" Apr 16 20:31:55.197835 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.197700 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 20:31:55.197911 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:31:55.197895 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"232382c5680114f86dddf017bc601b089324dc6fb6f1c7c063e4c31a8146d344\": container with ID starting with 232382c5680114f86dddf017bc601b089324dc6fb6f1c7c063e4c31a8146d344 not found: ID does not exist" containerID="232382c5680114f86dddf017bc601b089324dc6fb6f1c7c063e4c31a8146d344" Apr 16 20:31:55.197949 ip-10-0-139-150 kubenswrapper[2565]: I0416 
20:31:55.197915 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"232382c5680114f86dddf017bc601b089324dc6fb6f1c7c063e4c31a8146d344"} err="failed to get container status \"232382c5680114f86dddf017bc601b089324dc6fb6f1c7c063e4c31a8146d344\": rpc error: code = NotFound desc = could not find container \"232382c5680114f86dddf017bc601b089324dc6fb6f1c7c063e4c31a8146d344\": container with ID starting with 232382c5680114f86dddf017bc601b089324dc6fb6f1c7c063e4c31a8146d344 not found: ID does not exist" Apr 16 20:31:55.197949 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.197937 2565 scope.go:117] "RemoveContainer" containerID="5488117389be3df45eb854726907ce7cbbcad8d24e29c8f9d7a72498ed36e24d" Apr 16 20:31:55.198068 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.198051 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6dfb82aa-444e-48aa-b144-b043d8b099f1" containerName="alertmanager" Apr 16 20:31:55.198117 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.198070 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dfb82aa-444e-48aa-b144-b043d8b099f1" containerName="alertmanager" Apr 16 20:31:55.198117 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.198084 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6dfb82aa-444e-48aa-b144-b043d8b099f1" containerName="kube-rbac-proxy-web" Apr 16 20:31:55.198117 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.198093 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dfb82aa-444e-48aa-b144-b043d8b099f1" containerName="kube-rbac-proxy-web" Apr 16 20:31:55.198117 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.198107 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb910f93-043f-467a-88bc-ff78901b3eb4" containerName="registry" Apr 16 20:31:55.198117 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.198113 2565 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="fb910f93-043f-467a-88bc-ff78901b3eb4" containerName="registry" Apr 16 20:31:55.198328 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.198137 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6dfb82aa-444e-48aa-b144-b043d8b099f1" containerName="config-reloader" Apr 16 20:31:55.198328 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.198149 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dfb82aa-444e-48aa-b144-b043d8b099f1" containerName="config-reloader" Apr 16 20:31:55.198328 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:31:55.198132 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5488117389be3df45eb854726907ce7cbbcad8d24e29c8f9d7a72498ed36e24d\": container with ID starting with 5488117389be3df45eb854726907ce7cbbcad8d24e29c8f9d7a72498ed36e24d not found: ID does not exist" containerID="5488117389be3df45eb854726907ce7cbbcad8d24e29c8f9d7a72498ed36e24d" Apr 16 20:31:55.198328 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.198168 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5488117389be3df45eb854726907ce7cbbcad8d24e29c8f9d7a72498ed36e24d"} err="failed to get container status \"5488117389be3df45eb854726907ce7cbbcad8d24e29c8f9d7a72498ed36e24d\": rpc error: code = NotFound desc = could not find container \"5488117389be3df45eb854726907ce7cbbcad8d24e29c8f9d7a72498ed36e24d\": container with ID starting with 5488117389be3df45eb854726907ce7cbbcad8d24e29c8f9d7a72498ed36e24d not found: ID does not exist" Apr 16 20:31:55.198328 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.198184 2565 scope.go:117] "RemoveContainer" containerID="e42102d03ecf27d8630c0000d20f3f264c49f3b45eb47d9f47addfec1cc131a0" Apr 16 20:31:55.198328 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.198156 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="6dfb82aa-444e-48aa-b144-b043d8b099f1" containerName="kube-rbac-proxy" Apr 16 20:31:55.198328 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.198216 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dfb82aa-444e-48aa-b144-b043d8b099f1" containerName="kube-rbac-proxy" Apr 16 20:31:55.198328 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.198235 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6dfb82aa-444e-48aa-b144-b043d8b099f1" containerName="prom-label-proxy" Apr 16 20:31:55.198328 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.198244 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dfb82aa-444e-48aa-b144-b043d8b099f1" containerName="prom-label-proxy" Apr 16 20:31:55.198328 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.198262 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6dfb82aa-444e-48aa-b144-b043d8b099f1" containerName="init-config-reloader" Apr 16 20:31:55.198328 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.198268 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dfb82aa-444e-48aa-b144-b043d8b099f1" containerName="init-config-reloader" Apr 16 20:31:55.198328 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.198298 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6dfb82aa-444e-48aa-b144-b043d8b099f1" containerName="kube-rbac-proxy-metric" Apr 16 20:31:55.198328 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.198305 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dfb82aa-444e-48aa-b144-b043d8b099f1" containerName="kube-rbac-proxy-metric" Apr 16 20:31:55.198877 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.198394 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="6dfb82aa-444e-48aa-b144-b043d8b099f1" containerName="config-reloader" Apr 16 20:31:55.198877 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.198407 2565 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="6dfb82aa-444e-48aa-b144-b043d8b099f1" containerName="kube-rbac-proxy-metric"
Apr 16 20:31:55.198877 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.198417 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="6dfb82aa-444e-48aa-b144-b043d8b099f1" containerName="alertmanager"
Apr 16 20:31:55.198877 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.198428 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="6dfb82aa-444e-48aa-b144-b043d8b099f1" containerName="kube-rbac-proxy-web"
Apr 16 20:31:55.198877 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.198436 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="6dfb82aa-444e-48aa-b144-b043d8b099f1" containerName="kube-rbac-proxy"
Apr 16 20:31:55.198877 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.198442 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="6dfb82aa-444e-48aa-b144-b043d8b099f1" containerName="prom-label-proxy"
Apr 16 20:31:55.198877 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.198451 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb910f93-043f-467a-88bc-ff78901b3eb4" containerName="registry"
Apr 16 20:31:55.198877 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.198450 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e42102d03ecf27d8630c0000d20f3f264c49f3b45eb47d9f47addfec1cc131a0"} err="failed to get container status \"e42102d03ecf27d8630c0000d20f3f264c49f3b45eb47d9f47addfec1cc131a0\": rpc error: code = NotFound desc = could not find container \"e42102d03ecf27d8630c0000d20f3f264c49f3b45eb47d9f47addfec1cc131a0\": container with ID starting with e42102d03ecf27d8630c0000d20f3f264c49f3b45eb47d9f47addfec1cc131a0 not found: ID does not exist"
Apr 16 20:31:55.198877 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.198473 2565 scope.go:117] "RemoveContainer" containerID="b9ec15369521b6c5f61859777ee2f943f762759dc5304f9079c3a28bd6a6a5f9"
Apr 16 20:31:55.198877 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.198704 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9ec15369521b6c5f61859777ee2f943f762759dc5304f9079c3a28bd6a6a5f9"} err="failed to get container status \"b9ec15369521b6c5f61859777ee2f943f762759dc5304f9079c3a28bd6a6a5f9\": rpc error: code = NotFound desc = could not find container \"b9ec15369521b6c5f61859777ee2f943f762759dc5304f9079c3a28bd6a6a5f9\": container with ID starting with b9ec15369521b6c5f61859777ee2f943f762759dc5304f9079c3a28bd6a6a5f9 not found: ID does not exist"
Apr 16 20:31:55.198877 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.198721 2565 scope.go:117] "RemoveContainer" containerID="d0d017712e5bf193dd467406283e577771a0193fcb706b3c8d021487c76223b6"
Apr 16 20:31:55.199370 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.198993 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0d017712e5bf193dd467406283e577771a0193fcb706b3c8d021487c76223b6"} err="failed to get container status \"d0d017712e5bf193dd467406283e577771a0193fcb706b3c8d021487c76223b6\": rpc error: code = NotFound desc = could not find container \"d0d017712e5bf193dd467406283e577771a0193fcb706b3c8d021487c76223b6\": container with ID starting with d0d017712e5bf193dd467406283e577771a0193fcb706b3c8d021487c76223b6 not found: ID does not exist"
Apr 16 20:31:55.199370 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.199018 2565 scope.go:117] "RemoveContainer" containerID="dadad5e0c0eb91e82a54b9dba9234b550ee66b79b7e4c4cdf50e8dc715440e26"
Apr 16 20:31:55.199370 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.199298 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dadad5e0c0eb91e82a54b9dba9234b550ee66b79b7e4c4cdf50e8dc715440e26"} err="failed to get container status \"dadad5e0c0eb91e82a54b9dba9234b550ee66b79b7e4c4cdf50e8dc715440e26\": rpc error: code = NotFound desc = could not find container \"dadad5e0c0eb91e82a54b9dba9234b550ee66b79b7e4c4cdf50e8dc715440e26\": container with ID starting with dadad5e0c0eb91e82a54b9dba9234b550ee66b79b7e4c4cdf50e8dc715440e26 not found: ID does not exist"
Apr 16 20:31:55.199370 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.199319 2565 scope.go:117] "RemoveContainer" containerID="739ec335c47112785d4e230a7f15bb02f8dc1fffad06b1df2e44af3d0cbdaff0"
Apr 16 20:31:55.199564 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.199546 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"739ec335c47112785d4e230a7f15bb02f8dc1fffad06b1df2e44af3d0cbdaff0"} err="failed to get container status \"739ec335c47112785d4e230a7f15bb02f8dc1fffad06b1df2e44af3d0cbdaff0\": rpc error: code = NotFound desc = could not find container \"739ec335c47112785d4e230a7f15bb02f8dc1fffad06b1df2e44af3d0cbdaff0\": container with ID starting with 739ec335c47112785d4e230a7f15bb02f8dc1fffad06b1df2e44af3d0cbdaff0 not found: ID does not exist"
Apr 16 20:31:55.199600 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.199566 2565 scope.go:117] "RemoveContainer" containerID="232382c5680114f86dddf017bc601b089324dc6fb6f1c7c063e4c31a8146d344"
Apr 16 20:31:55.199773 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.199754 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"232382c5680114f86dddf017bc601b089324dc6fb6f1c7c063e4c31a8146d344"} err="failed to get container status \"232382c5680114f86dddf017bc601b089324dc6fb6f1c7c063e4c31a8146d344\": rpc error: code = NotFound desc = could not find container \"232382c5680114f86dddf017bc601b089324dc6fb6f1c7c063e4c31a8146d344\": container with ID starting with 232382c5680114f86dddf017bc601b089324dc6fb6f1c7c063e4c31a8146d344 not found: ID does not exist"
Apr 16 20:31:55.199817 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.199774 2565 scope.go:117] "RemoveContainer" containerID="5488117389be3df45eb854726907ce7cbbcad8d24e29c8f9d7a72498ed36e24d"
Apr 16 20:31:55.199949 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.199934 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5488117389be3df45eb854726907ce7cbbcad8d24e29c8f9d7a72498ed36e24d"} err="failed to get container status \"5488117389be3df45eb854726907ce7cbbcad8d24e29c8f9d7a72498ed36e24d\": rpc error: code = NotFound desc = could not find container \"5488117389be3df45eb854726907ce7cbbcad8d24e29c8f9d7a72498ed36e24d\": container with ID starting with 5488117389be3df45eb854726907ce7cbbcad8d24e29c8f9d7a72498ed36e24d not found: ID does not exist"
Apr 16 20:31:55.203820 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.203803 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.206161 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.206141 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 16 20:31:55.206253 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.206160 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 16 20:31:55.206253 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.206150 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 16 20:31:55.206375 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.206261 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 16 20:31:55.206575 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.206559 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 16 20:31:55.206646 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.206618 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-ckdgm\""
Apr 16 20:31:55.206742 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.206714 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 16 20:31:55.206742 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.206714 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 16 20:31:55.206893 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.206806 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 16 20:31:55.211642 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.211621 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 16 20:31:55.214413 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.214392 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 20:31:55.311631 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.311576 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dfb82aa-444e-48aa-b144-b043d8b099f1" path="/var/lib/kubelet/pods/6dfb82aa-444e-48aa-b144-b043d8b099f1/volumes"
Apr 16 20:31:55.328189 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.328168 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d5b60c80-add7-47a6-96f9-55ecb755c5d5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.328257 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.328198 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d5b60c80-add7-47a6-96f9-55ecb755c5d5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.328257 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.328217 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d5b60c80-add7-47a6-96f9-55ecb755c5d5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.328257 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.328235 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d5b60c80-add7-47a6-96f9-55ecb755c5d5-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.328452 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.328343 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d5b60c80-add7-47a6-96f9-55ecb755c5d5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.328452 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.328381 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d5b60c80-add7-47a6-96f9-55ecb755c5d5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.328452 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.328411 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d5b60c80-add7-47a6-96f9-55ecb755c5d5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.328567 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.328457 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d5b60c80-add7-47a6-96f9-55ecb755c5d5-config-out\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.328567 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.328482 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx4wl\" (UniqueName: \"kubernetes.io/projected/d5b60c80-add7-47a6-96f9-55ecb755c5d5-kube-api-access-qx4wl\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.328567 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.328507 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d5b60c80-add7-47a6-96f9-55ecb755c5d5-config-volume\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.328567 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.328532 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d5b60c80-add7-47a6-96f9-55ecb755c5d5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.328567 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.328556 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5b60c80-add7-47a6-96f9-55ecb755c5d5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.328715 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.328576 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d5b60c80-add7-47a6-96f9-55ecb755c5d5-web-config\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.429039 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.429013 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d5b60c80-add7-47a6-96f9-55ecb755c5d5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.429164 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.429054 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d5b60c80-add7-47a6-96f9-55ecb755c5d5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.429164 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.429075 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d5b60c80-add7-47a6-96f9-55ecb755c5d5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.429164 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.429092 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d5b60c80-add7-47a6-96f9-55ecb755c5d5-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.429164 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.429116 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d5b60c80-add7-47a6-96f9-55ecb755c5d5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.429445 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.429249 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d5b60c80-add7-47a6-96f9-55ecb755c5d5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.429445 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.429318 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d5b60c80-add7-47a6-96f9-55ecb755c5d5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.429445 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.429390 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d5b60c80-add7-47a6-96f9-55ecb755c5d5-config-out\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.429445 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.429421 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qx4wl\" (UniqueName: \"kubernetes.io/projected/d5b60c80-add7-47a6-96f9-55ecb755c5d5-kube-api-access-qx4wl\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.429638 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.429468 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d5b60c80-add7-47a6-96f9-55ecb755c5d5-config-volume\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.429638 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.429517 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d5b60c80-add7-47a6-96f9-55ecb755c5d5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.429638 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.429538 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d5b60c80-add7-47a6-96f9-55ecb755c5d5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.429638 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.429568 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5b60c80-add7-47a6-96f9-55ecb755c5d5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.429638 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.429616 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d5b60c80-add7-47a6-96f9-55ecb755c5d5-web-config\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.431217 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.430564 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d5b60c80-add7-47a6-96f9-55ecb755c5d5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.432146 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.432121 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d5b60c80-add7-47a6-96f9-55ecb755c5d5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.432235 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.432153 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d5b60c80-add7-47a6-96f9-55ecb755c5d5-config-out\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.432235 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.432159 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d5b60c80-add7-47a6-96f9-55ecb755c5d5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.432375 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.432261 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d5b60c80-add7-47a6-96f9-55ecb755c5d5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.432467 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.432443 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d5b60c80-add7-47a6-96f9-55ecb755c5d5-web-config\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.432578 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.432507 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d5b60c80-add7-47a6-96f9-55ecb755c5d5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.432578 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.432549 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d5b60c80-add7-47a6-96f9-55ecb755c5d5-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.432812 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.432795 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5b60c80-add7-47a6-96f9-55ecb755c5d5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.433479 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.433463 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d5b60c80-add7-47a6-96f9-55ecb755c5d5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.434059 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.434043 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d5b60c80-add7-47a6-96f9-55ecb755c5d5-config-volume\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.437810 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.437792 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx4wl\" (UniqueName: \"kubernetes.io/projected/d5b60c80-add7-47a6-96f9-55ecb755c5d5-kube-api-access-qx4wl\") pod \"alertmanager-main-0\" (UID: \"d5b60c80-add7-47a6-96f9-55ecb755c5d5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.514289 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.514257 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:31:55.637187 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:55.637160 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 20:31:55.639114 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:31:55.639082 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5b60c80_add7_47a6_96f9_55ecb755c5d5.slice/crio-c89dc6351cdc5e717ea1352ef682c36fee0d64091e2ba30a82d05e21afbf7d7c WatchSource:0}: Error finding container c89dc6351cdc5e717ea1352ef682c36fee0d64091e2ba30a82d05e21afbf7d7c: Status 404 returned error can't find the container with id c89dc6351cdc5e717ea1352ef682c36fee0d64091e2ba30a82d05e21afbf7d7c
Apr 16 20:31:56.150982 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:56.150888 2565 generic.go:358] "Generic (PLEG): container finished" podID="d5b60c80-add7-47a6-96f9-55ecb755c5d5" containerID="13de00ef02a5bdc7f496e99034d2b4251fea0ea6f36bf92e5bfa19d25c80467d" exitCode=0
Apr 16 20:31:56.150982 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:56.150929 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d5b60c80-add7-47a6-96f9-55ecb755c5d5","Type":"ContainerDied","Data":"13de00ef02a5bdc7f496e99034d2b4251fea0ea6f36bf92e5bfa19d25c80467d"}
Apr 16 20:31:56.150982 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:56.150949 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d5b60c80-add7-47a6-96f9-55ecb755c5d5","Type":"ContainerStarted","Data":"c89dc6351cdc5e717ea1352ef682c36fee0d64091e2ba30a82d05e21afbf7d7c"}
Apr 16 20:31:57.157202 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:57.157166 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d5b60c80-add7-47a6-96f9-55ecb755c5d5","Type":"ContainerStarted","Data":"3a0f1362a09667ff700816caa04c4b4d05a8cdefe37f372b2a086ed8403e6465"}
Apr 16 20:31:57.157202 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:57.157205 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d5b60c80-add7-47a6-96f9-55ecb755c5d5","Type":"ContainerStarted","Data":"28a69d6295602508ac1cd26ec7de1957e049f72b9cfc0f2b47f1c313420a2990"}
Apr 16 20:31:57.157595 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:57.157215 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d5b60c80-add7-47a6-96f9-55ecb755c5d5","Type":"ContainerStarted","Data":"9addd4408f29d7848c0b106b91910a8372c143e8477e0ca8575ba14dc6e64f15"}
Apr 16 20:31:57.157595 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:57.157224 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d5b60c80-add7-47a6-96f9-55ecb755c5d5","Type":"ContainerStarted","Data":"511e927310b58c85366f20dd05a07b6624315594b8144eccae0883ae93753b44"}
Apr 16 20:31:57.157595 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:57.157241 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d5b60c80-add7-47a6-96f9-55ecb755c5d5","Type":"ContainerStarted","Data":"c663180aa24a62e7f3296fd29d34ff1e8f7819e663ae743e3db8db2e77537cd5"}
Apr 16 20:31:57.157595 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:57.157248 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d5b60c80-add7-47a6-96f9-55ecb755c5d5","Type":"ContainerStarted","Data":"33841de1214237a9d8accdff7996e59e4742c2e4d78d7fc0d330bd9520036b00"}
Apr 16 20:31:57.184683 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:57.184630 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.184610286 podStartE2EDuration="2.184610286s" podCreationTimestamp="2026-04-16 20:31:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:31:57.182025943 +0000 UTC m=+260.449768473" watchObservedRunningTime="2026-04-16 20:31:57.184610286 +0000 UTC m=+260.452352740"
Apr 16 20:31:57.646783 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:57.646689 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 20:31:57.647158 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:57.647124 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" containerName="prometheus" containerID="cri-o://2cf5bae9b94d5bab4456c4313f23f13f3683bb800074cf32ea30e5859efb67bd" gracePeriod=600
Apr 16 20:31:57.647336 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:57.647154 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" containerName="thanos-sidecar" containerID="cri-o://bb1f627b5a7f64b98b09aed90e98a78a955c3a4b978bb69a57a8b089888dc774" gracePeriod=600
Apr 16 20:31:57.647336 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:57.647181 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" containerName="kube-rbac-proxy-web" containerID="cri-o://2613c92047f3ad072943a21336e74359ab9a3fc791ecfceb45b786dac36fbcc2" gracePeriod=600
Apr 16 20:31:57.647336 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:57.647188 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" containerName="kube-rbac-proxy-thanos" containerID="cri-o://0a14ce85c742f235487cc76398562f67db05b986bcbccd3b4b0bef6f839d9744" gracePeriod=600
Apr 16 20:31:57.647336 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:57.647150 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" containerName="kube-rbac-proxy" containerID="cri-o://8f3e71fd8ca5b2353c8fe462479d0a9649fbbccbd14e1e77bd639fc0ca03d2b7" gracePeriod=600
Apr 16 20:31:57.647569 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:57.647333 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" containerName="config-reloader" containerID="cri-o://da1f9c211d6c48096cc51096d2acff6a2973b99c0a72b2070251695d95224071" gracePeriod=600
Apr 16 20:31:57.899293 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:57.899205 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:31:58.056424 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.056389 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-secret-metrics-client-certs\") pod \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") "
Apr 16 20:31:58.056602 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.056445 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-configmap-metrics-client-ca\") pod \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") "
Apr 16 20:31:58.056602 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.056468 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-secret-kube-rbac-proxy\") pod \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") "
Apr 16 20:31:58.056602 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.056483 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-config\") pod \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") "
Apr 16 20:31:58.056776 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.056613 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-prometheus-trusted-ca-bundle\") pod \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") "
Apr 16 20:31:58.056776 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.056657 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-configmap-serving-certs-ca-bundle\") pod \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") "
Apr 16 20:31:58.056776 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.056702 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") "
Apr 16 20:31:58.056776 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.056728 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-configmap-kubelet-serving-ca-bundle\") pod \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") "
Apr 16 20:31:58.056975 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.056779 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-config-out\") pod \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") "
Apr 16 20:31:58.056975 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.056812 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-prometheus-k8s-rulefiles-0\") pod \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") "
Apr 16 20:31:58.056975 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.056856 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-tls-assets\") pod \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") "
Apr 16 20:31:58.056975 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.056882 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dppg5\" (UniqueName: \"kubernetes.io/projected/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-kube-api-access-dppg5\") pod \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") "
Apr 16 20:31:58.056975 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.056909 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-prometheus-k8s-db\") pod \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") "
Apr 16 20:31:58.056975 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.056940 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-thanos-prometheus-http-client-file\") pod \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") "
Apr 16 20:31:58.056975 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.056946 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" (UID: "c9cbe6a2-2f41-4e82-b07a-84e1361bfca1"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:31:58.056975 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.056959 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" (UID: "c9cbe6a2-2f41-4e82-b07a-84e1361bfca1"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:31:58.057401 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.056982 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") "
Apr 16 20:31:58.057401 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.057267 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" (UID: "c9cbe6a2-2f41-4e82-b07a-84e1361bfca1"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:31:58.058359 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.058294 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" (UID: "c9cbe6a2-2f41-4e82-b07a-84e1361bfca1").
InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:31:58.058732 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.058708 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" (UID: "c9cbe6a2-2f41-4e82-b07a-84e1361bfca1"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:31:58.059187 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.059153 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" (UID: "c9cbe6a2-2f41-4e82-b07a-84e1361bfca1"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:31:58.059311 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.059209 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-secret-grpc-tls\") pod \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " Apr 16 20:31:58.059391 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.059312 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-web-config\") pod \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " Apr 16 20:31:58.059391 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.059347 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-secret-prometheus-k8s-tls\") pod \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\" (UID: \"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1\") " Apr 16 20:31:58.059491 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.059400 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-kube-api-access-dppg5" (OuterVolumeSpecName: "kube-api-access-dppg5") pod "c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" (UID: "c9cbe6a2-2f41-4e82-b07a-84e1361bfca1"). InnerVolumeSpecName "kube-api-access-dppg5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:31:58.059491 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.059457 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-config" (OuterVolumeSpecName: "config") pod "c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" (UID: "c9cbe6a2-2f41-4e82-b07a-84e1361bfca1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:31:58.059655 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.059628 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dppg5\" (UniqueName: \"kubernetes.io/projected/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-kube-api-access-dppg5\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\"" Apr 16 20:31:58.059767 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.059658 2565 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-prometheus-k8s-db\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\"" Apr 16 20:31:58.059767 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.059662 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" (UID: "c9cbe6a2-2f41-4e82-b07a-84e1361bfca1"). InnerVolumeSpecName "secret-metrics-client-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:31:58.059767 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.059674 2565 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\"" Apr 16 20:31:58.059767 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.059709 2565 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-configmap-metrics-client-ca\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\"" Apr 16 20:31:58.059767 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.059726 2565 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-config\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\"" Apr 16 20:31:58.059767 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.059741 2565 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-prometheus-trusted-ca-bundle\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\"" Apr 16 20:31:58.059767 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.059742 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" (UID: "c9cbe6a2-2f41-4e82-b07a-84e1361bfca1"). InnerVolumeSpecName "secret-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:31:58.059767 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.059756 2565 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\"" Apr 16 20:31:58.060233 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.059773 2565 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\"" Apr 16 20:31:58.060233 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.060146 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" (UID: "c9cbe6a2-2f41-4e82-b07a-84e1361bfca1"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:31:58.060635 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.060608 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" (UID: "c9cbe6a2-2f41-4e82-b07a-84e1361bfca1"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:31:58.060830 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.060807 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-config-out" (OuterVolumeSpecName: "config-out") pod "c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" (UID: "c9cbe6a2-2f41-4e82-b07a-84e1361bfca1"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:31:58.061359 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.061338 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" (UID: "c9cbe6a2-2f41-4e82-b07a-84e1361bfca1"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:31:58.061471 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.061451 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" (UID: "c9cbe6a2-2f41-4e82-b07a-84e1361bfca1"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:31:58.061684 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.061662 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" (UID: "c9cbe6a2-2f41-4e82-b07a-84e1361bfca1"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:31:58.061755 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.061742 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" (UID: "c9cbe6a2-2f41-4e82-b07a-84e1361bfca1"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:31:58.071611 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.071593 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-web-config" (OuterVolumeSpecName: "web-config") pod "c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" (UID: "c9cbe6a2-2f41-4e82-b07a-84e1361bfca1"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:31:58.160265 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.160236 2565 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-secret-kube-rbac-proxy\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\"" Apr 16 20:31:58.160265 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.160263 2565 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\"" Apr 16 20:31:58.160737 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.160298 2565 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-config-out\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\"" Apr 16 20:31:58.160737 ip-10-0-139-150 
kubenswrapper[2565]: I0416 20:31:58.160308 2565 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\"" Apr 16 20:31:58.160737 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.160317 2565 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-tls-assets\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\"" Apr 16 20:31:58.160737 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.160326 2565 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-thanos-prometheus-http-client-file\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\"" Apr 16 20:31:58.160737 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.160335 2565 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-secret-grpc-tls\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\"" Apr 16 20:31:58.160737 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.160343 2565 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-web-config\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\"" Apr 16 20:31:58.160737 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.160351 2565 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-secret-prometheus-k8s-tls\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\"" Apr 16 20:31:58.160737 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.160364 2565 
reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1-secret-metrics-client-certs\") on node \"ip-10-0-139-150.ec2.internal\" DevicePath \"\"" Apr 16 20:31:58.162742 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.162717 2565 generic.go:358] "Generic (PLEG): container finished" podID="c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" containerID="0a14ce85c742f235487cc76398562f67db05b986bcbccd3b4b0bef6f839d9744" exitCode=0 Apr 16 20:31:58.162742 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.162741 2565 generic.go:358] "Generic (PLEG): container finished" podID="c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" containerID="8f3e71fd8ca5b2353c8fe462479d0a9649fbbccbd14e1e77bd639fc0ca03d2b7" exitCode=0 Apr 16 20:31:58.162874 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.162747 2565 generic.go:358] "Generic (PLEG): container finished" podID="c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" containerID="2613c92047f3ad072943a21336e74359ab9a3fc791ecfceb45b786dac36fbcc2" exitCode=0 Apr 16 20:31:58.162874 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.162753 2565 generic.go:358] "Generic (PLEG): container finished" podID="c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" containerID="bb1f627b5a7f64b98b09aed90e98a78a955c3a4b978bb69a57a8b089888dc774" exitCode=0 Apr 16 20:31:58.162874 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.162758 2565 generic.go:358] "Generic (PLEG): container finished" podID="c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" containerID="da1f9c211d6c48096cc51096d2acff6a2973b99c0a72b2070251695d95224071" exitCode=0 Apr 16 20:31:58.162874 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.162764 2565 generic.go:358] "Generic (PLEG): container finished" podID="c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" containerID="2cf5bae9b94d5bab4456c4313f23f13f3683bb800074cf32ea30e5859efb67bd" exitCode=0 Apr 16 20:31:58.162874 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.162801 2565 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1","Type":"ContainerDied","Data":"0a14ce85c742f235487cc76398562f67db05b986bcbccd3b4b0bef6f839d9744"} Apr 16 20:31:58.162874 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.162825 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.162874 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.162836 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1","Type":"ContainerDied","Data":"8f3e71fd8ca5b2353c8fe462479d0a9649fbbccbd14e1e77bd639fc0ca03d2b7"} Apr 16 20:31:58.162874 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.162850 2565 scope.go:117] "RemoveContainer" containerID="0a14ce85c742f235487cc76398562f67db05b986bcbccd3b4b0bef6f839d9744" Apr 16 20:31:58.163179 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.162908 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1","Type":"ContainerDied","Data":"2613c92047f3ad072943a21336e74359ab9a3fc791ecfceb45b786dac36fbcc2"} Apr 16 20:31:58.163179 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.162923 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1","Type":"ContainerDied","Data":"bb1f627b5a7f64b98b09aed90e98a78a955c3a4b978bb69a57a8b089888dc774"} Apr 16 20:31:58.163179 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.162935 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1","Type":"ContainerDied","Data":"da1f9c211d6c48096cc51096d2acff6a2973b99c0a72b2070251695d95224071"} Apr 16 20:31:58.163179 
ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.162944 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1","Type":"ContainerDied","Data":"2cf5bae9b94d5bab4456c4313f23f13f3683bb800074cf32ea30e5859efb67bd"} Apr 16 20:31:58.163179 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.162955 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c9cbe6a2-2f41-4e82-b07a-84e1361bfca1","Type":"ContainerDied","Data":"16b4f60cde4e68c9e68c2aef8ef1cb222b1c53fdaec466aa5a0cd580e0867eac"} Apr 16 20:31:58.173753 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.173729 2565 scope.go:117] "RemoveContainer" containerID="8f3e71fd8ca5b2353c8fe462479d0a9649fbbccbd14e1e77bd639fc0ca03d2b7" Apr 16 20:31:58.180695 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.180676 2565 scope.go:117] "RemoveContainer" containerID="2613c92047f3ad072943a21336e74359ab9a3fc791ecfceb45b786dac36fbcc2" Apr 16 20:31:58.187826 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.187805 2565 scope.go:117] "RemoveContainer" containerID="bb1f627b5a7f64b98b09aed90e98a78a955c3a4b978bb69a57a8b089888dc774" Apr 16 20:31:58.188118 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.188097 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 20:31:58.193530 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.193501 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 20:31:58.194971 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.194957 2565 scope.go:117] "RemoveContainer" containerID="da1f9c211d6c48096cc51096d2acff6a2973b99c0a72b2070251695d95224071" Apr 16 20:31:58.201307 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.201292 2565 scope.go:117] "RemoveContainer" containerID="2cf5bae9b94d5bab4456c4313f23f13f3683bb800074cf32ea30e5859efb67bd" Apr 16 
20:31:58.208088 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.208070 2565 scope.go:117] "RemoveContainer" containerID="fdcdce4015d452d749462968da772236fe5f6e35a33c98b11aa4e424d5035ccf" Apr 16 20:31:58.214063 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.214036 2565 scope.go:117] "RemoveContainer" containerID="0a14ce85c742f235487cc76398562f67db05b986bcbccd3b4b0bef6f839d9744" Apr 16 20:31:58.214460 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:31:58.214436 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a14ce85c742f235487cc76398562f67db05b986bcbccd3b4b0bef6f839d9744\": container with ID starting with 0a14ce85c742f235487cc76398562f67db05b986bcbccd3b4b0bef6f839d9744 not found: ID does not exist" containerID="0a14ce85c742f235487cc76398562f67db05b986bcbccd3b4b0bef6f839d9744" Apr 16 20:31:58.214622 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.214471 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a14ce85c742f235487cc76398562f67db05b986bcbccd3b4b0bef6f839d9744"} err="failed to get container status \"0a14ce85c742f235487cc76398562f67db05b986bcbccd3b4b0bef6f839d9744\": rpc error: code = NotFound desc = could not find container \"0a14ce85c742f235487cc76398562f67db05b986bcbccd3b4b0bef6f839d9744\": container with ID starting with 0a14ce85c742f235487cc76398562f67db05b986bcbccd3b4b0bef6f839d9744 not found: ID does not exist" Apr 16 20:31:58.214622 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.214493 2565 scope.go:117] "RemoveContainer" containerID="8f3e71fd8ca5b2353c8fe462479d0a9649fbbccbd14e1e77bd639fc0ca03d2b7" Apr 16 20:31:58.214867 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:31:58.214751 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f3e71fd8ca5b2353c8fe462479d0a9649fbbccbd14e1e77bd639fc0ca03d2b7\": container with ID starting with 
8f3e71fd8ca5b2353c8fe462479d0a9649fbbccbd14e1e77bd639fc0ca03d2b7 not found: ID does not exist" containerID="8f3e71fd8ca5b2353c8fe462479d0a9649fbbccbd14e1e77bd639fc0ca03d2b7" Apr 16 20:31:58.214867 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.214783 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f3e71fd8ca5b2353c8fe462479d0a9649fbbccbd14e1e77bd639fc0ca03d2b7"} err="failed to get container status \"8f3e71fd8ca5b2353c8fe462479d0a9649fbbccbd14e1e77bd639fc0ca03d2b7\": rpc error: code = NotFound desc = could not find container \"8f3e71fd8ca5b2353c8fe462479d0a9649fbbccbd14e1e77bd639fc0ca03d2b7\": container with ID starting with 8f3e71fd8ca5b2353c8fe462479d0a9649fbbccbd14e1e77bd639fc0ca03d2b7 not found: ID does not exist" Apr 16 20:31:58.214867 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.214805 2565 scope.go:117] "RemoveContainer" containerID="2613c92047f3ad072943a21336e74359ab9a3fc791ecfceb45b786dac36fbcc2" Apr 16 20:31:58.215202 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:31:58.215125 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2613c92047f3ad072943a21336e74359ab9a3fc791ecfceb45b786dac36fbcc2\": container with ID starting with 2613c92047f3ad072943a21336e74359ab9a3fc791ecfceb45b786dac36fbcc2 not found: ID does not exist" containerID="2613c92047f3ad072943a21336e74359ab9a3fc791ecfceb45b786dac36fbcc2" Apr 16 20:31:58.215202 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.215159 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2613c92047f3ad072943a21336e74359ab9a3fc791ecfceb45b786dac36fbcc2"} err="failed to get container status \"2613c92047f3ad072943a21336e74359ab9a3fc791ecfceb45b786dac36fbcc2\": rpc error: code = NotFound desc = could not find container \"2613c92047f3ad072943a21336e74359ab9a3fc791ecfceb45b786dac36fbcc2\": container with ID starting with 
2613c92047f3ad072943a21336e74359ab9a3fc791ecfceb45b786dac36fbcc2 not found: ID does not exist" Apr 16 20:31:58.215409 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.215197 2565 scope.go:117] "RemoveContainer" containerID="bb1f627b5a7f64b98b09aed90e98a78a955c3a4b978bb69a57a8b089888dc774" Apr 16 20:31:58.215592 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:31:58.215575 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb1f627b5a7f64b98b09aed90e98a78a955c3a4b978bb69a57a8b089888dc774\": container with ID starting with bb1f627b5a7f64b98b09aed90e98a78a955c3a4b978bb69a57a8b089888dc774 not found: ID does not exist" containerID="bb1f627b5a7f64b98b09aed90e98a78a955c3a4b978bb69a57a8b089888dc774" Apr 16 20:31:58.215667 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.215598 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb1f627b5a7f64b98b09aed90e98a78a955c3a4b978bb69a57a8b089888dc774"} err="failed to get container status \"bb1f627b5a7f64b98b09aed90e98a78a955c3a4b978bb69a57a8b089888dc774\": rpc error: code = NotFound desc = could not find container \"bb1f627b5a7f64b98b09aed90e98a78a955c3a4b978bb69a57a8b089888dc774\": container with ID starting with bb1f627b5a7f64b98b09aed90e98a78a955c3a4b978bb69a57a8b089888dc774 not found: ID does not exist" Apr 16 20:31:58.215667 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.215612 2565 scope.go:117] "RemoveContainer" containerID="da1f9c211d6c48096cc51096d2acff6a2973b99c0a72b2070251695d95224071" Apr 16 20:31:58.215850 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:31:58.215833 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da1f9c211d6c48096cc51096d2acff6a2973b99c0a72b2070251695d95224071\": container with ID starting with da1f9c211d6c48096cc51096d2acff6a2973b99c0a72b2070251695d95224071 not found: ID does not exist" 
containerID="da1f9c211d6c48096cc51096d2acff6a2973b99c0a72b2070251695d95224071"
Apr 16 20:31:58.215900 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.215855 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da1f9c211d6c48096cc51096d2acff6a2973b99c0a72b2070251695d95224071"} err="failed to get container status \"da1f9c211d6c48096cc51096d2acff6a2973b99c0a72b2070251695d95224071\": rpc error: code = NotFound desc = could not find container \"da1f9c211d6c48096cc51096d2acff6a2973b99c0a72b2070251695d95224071\": container with ID starting with da1f9c211d6c48096cc51096d2acff6a2973b99c0a72b2070251695d95224071 not found: ID does not exist"
Apr 16 20:31:58.215900 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.215870 2565 scope.go:117] "RemoveContainer" containerID="2cf5bae9b94d5bab4456c4313f23f13f3683bb800074cf32ea30e5859efb67bd"
Apr 16 20:31:58.216106 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:31:58.216087 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cf5bae9b94d5bab4456c4313f23f13f3683bb800074cf32ea30e5859efb67bd\": container with ID starting with 2cf5bae9b94d5bab4456c4313f23f13f3683bb800074cf32ea30e5859efb67bd not found: ID does not exist" containerID="2cf5bae9b94d5bab4456c4313f23f13f3683bb800074cf32ea30e5859efb67bd"
Apr 16 20:31:58.216173 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.216114 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cf5bae9b94d5bab4456c4313f23f13f3683bb800074cf32ea30e5859efb67bd"} err="failed to get container status \"2cf5bae9b94d5bab4456c4313f23f13f3683bb800074cf32ea30e5859efb67bd\": rpc error: code = NotFound desc = could not find container \"2cf5bae9b94d5bab4456c4313f23f13f3683bb800074cf32ea30e5859efb67bd\": container with ID starting with 2cf5bae9b94d5bab4456c4313f23f13f3683bb800074cf32ea30e5859efb67bd not found: ID does not exist"
Apr 16 20:31:58.216173 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.216135 2565 scope.go:117] "RemoveContainer" containerID="fdcdce4015d452d749462968da772236fe5f6e35a33c98b11aa4e424d5035ccf"
Apr 16 20:31:58.216405 ip-10-0-139-150 kubenswrapper[2565]: E0416 20:31:58.216388 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdcdce4015d452d749462968da772236fe5f6e35a33c98b11aa4e424d5035ccf\": container with ID starting with fdcdce4015d452d749462968da772236fe5f6e35a33c98b11aa4e424d5035ccf not found: ID does not exist" containerID="fdcdce4015d452d749462968da772236fe5f6e35a33c98b11aa4e424d5035ccf"
Apr 16 20:31:58.216483 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.216413 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdcdce4015d452d749462968da772236fe5f6e35a33c98b11aa4e424d5035ccf"} err="failed to get container status \"fdcdce4015d452d749462968da772236fe5f6e35a33c98b11aa4e424d5035ccf\": rpc error: code = NotFound desc = could not find container \"fdcdce4015d452d749462968da772236fe5f6e35a33c98b11aa4e424d5035ccf\": container with ID starting with fdcdce4015d452d749462968da772236fe5f6e35a33c98b11aa4e424d5035ccf not found: ID does not exist"
Apr 16 20:31:58.216483 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.216431 2565 scope.go:117] "RemoveContainer" containerID="0a14ce85c742f235487cc76398562f67db05b986bcbccd3b4b0bef6f839d9744"
Apr 16 20:31:58.216641 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.216623 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a14ce85c742f235487cc76398562f67db05b986bcbccd3b4b0bef6f839d9744"} err="failed to get container status \"0a14ce85c742f235487cc76398562f67db05b986bcbccd3b4b0bef6f839d9744\": rpc error: code = NotFound desc = could not find container \"0a14ce85c742f235487cc76398562f67db05b986bcbccd3b4b0bef6f839d9744\": container with ID starting with 0a14ce85c742f235487cc76398562f67db05b986bcbccd3b4b0bef6f839d9744 not found: ID does not exist"
Apr 16 20:31:58.216683 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.216644 2565 scope.go:117] "RemoveContainer" containerID="8f3e71fd8ca5b2353c8fe462479d0a9649fbbccbd14e1e77bd639fc0ca03d2b7"
Apr 16 20:31:58.216805 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.216786 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 20:31:58.216871 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.216855 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f3e71fd8ca5b2353c8fe462479d0a9649fbbccbd14e1e77bd639fc0ca03d2b7"} err="failed to get container status \"8f3e71fd8ca5b2353c8fe462479d0a9649fbbccbd14e1e77bd639fc0ca03d2b7\": rpc error: code = NotFound desc = could not find container \"8f3e71fd8ca5b2353c8fe462479d0a9649fbbccbd14e1e77bd639fc0ca03d2b7\": container with ID starting with 8f3e71fd8ca5b2353c8fe462479d0a9649fbbccbd14e1e77bd639fc0ca03d2b7 not found: ID does not exist"
Apr 16 20:31:58.216911 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.216872 2565 scope.go:117] "RemoveContainer" containerID="2613c92047f3ad072943a21336e74359ab9a3fc791ecfceb45b786dac36fbcc2"
Apr 16 20:31:58.217111 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.217091 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2613c92047f3ad072943a21336e74359ab9a3fc791ecfceb45b786dac36fbcc2"} err="failed to get container status \"2613c92047f3ad072943a21336e74359ab9a3fc791ecfceb45b786dac36fbcc2\": rpc error: code = NotFound desc = could not find container \"2613c92047f3ad072943a21336e74359ab9a3fc791ecfceb45b786dac36fbcc2\": container with ID starting with 2613c92047f3ad072943a21336e74359ab9a3fc791ecfceb45b786dac36fbcc2 not found: ID does not exist"
Apr 16 20:31:58.217111 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.217111 2565 scope.go:117] "RemoveContainer" containerID="bb1f627b5a7f64b98b09aed90e98a78a955c3a4b978bb69a57a8b089888dc774"
Apr 16 20:31:58.217224 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.217190 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" containerName="kube-rbac-proxy-thanos"
Apr 16 20:31:58.217224 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.217204 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" containerName="kube-rbac-proxy-thanos"
Apr 16 20:31:58.217224 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.217218 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" containerName="config-reloader"
Apr 16 20:31:58.217358 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.217226 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" containerName="config-reloader"
Apr 16 20:31:58.217358 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.217238 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" containerName="prometheus"
Apr 16 20:31:58.217358 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.217247 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" containerName="prometheus"
Apr 16 20:31:58.217358 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.217260 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" containerName="thanos-sidecar"
Apr 16 20:31:58.217358 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.217268 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" containerName="thanos-sidecar"
Apr 16 20:31:58.217358 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.217306 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" containerName="kube-rbac-proxy"
Apr 16 20:31:58.217358 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.217316 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" containerName="kube-rbac-proxy"
Apr 16 20:31:58.217358 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.217322 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb1f627b5a7f64b98b09aed90e98a78a955c3a4b978bb69a57a8b089888dc774"} err="failed to get container status \"bb1f627b5a7f64b98b09aed90e98a78a955c3a4b978bb69a57a8b089888dc774\": rpc error: code = NotFound desc = could not find container \"bb1f627b5a7f64b98b09aed90e98a78a955c3a4b978bb69a57a8b089888dc774\": container with ID starting with bb1f627b5a7f64b98b09aed90e98a78a955c3a4b978bb69a57a8b089888dc774 not found: ID does not exist"
Apr 16 20:31:58.217358 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.217335 2565 scope.go:117] "RemoveContainer" containerID="da1f9c211d6c48096cc51096d2acff6a2973b99c0a72b2070251695d95224071"
Apr 16 20:31:58.217358 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.217339 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" containerName="init-config-reloader"
Apr 16 20:31:58.217358 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.217348 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" containerName="init-config-reloader"
Apr 16 20:31:58.217358 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.217360 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" containerName="kube-rbac-proxy-web"
Apr 16 20:31:58.218000 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.217368 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" containerName="kube-rbac-proxy-web"
Apr 16 20:31:58.218000 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.217447 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" containerName="prometheus"
Apr 16 20:31:58.218000 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.217461 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" containerName="kube-rbac-proxy-web"
Apr 16 20:31:58.218000 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.217471 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" containerName="config-reloader"
Apr 16 20:31:58.218000 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.217481 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" containerName="thanos-sidecar"
Apr 16 20:31:58.218000 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.217492 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" containerName="kube-rbac-proxy"
Apr 16 20:31:58.218000 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.217500 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" containerName="kube-rbac-proxy-thanos"
Apr 16 20:31:58.218000 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.217550 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da1f9c211d6c48096cc51096d2acff6a2973b99c0a72b2070251695d95224071"} err="failed to get container status \"da1f9c211d6c48096cc51096d2acff6a2973b99c0a72b2070251695d95224071\": rpc error: code = NotFound desc = could not find container \"da1f9c211d6c48096cc51096d2acff6a2973b99c0a72b2070251695d95224071\": container with ID starting with da1f9c211d6c48096cc51096d2acff6a2973b99c0a72b2070251695d95224071 not found: ID does not exist"
Apr 16 20:31:58.218000 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.217563 2565 scope.go:117] "RemoveContainer" containerID="2cf5bae9b94d5bab4456c4313f23f13f3683bb800074cf32ea30e5859efb67bd"
Apr 16 20:31:58.218000 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.217792 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cf5bae9b94d5bab4456c4313f23f13f3683bb800074cf32ea30e5859efb67bd"} err="failed to get container status \"2cf5bae9b94d5bab4456c4313f23f13f3683bb800074cf32ea30e5859efb67bd\": rpc error: code = NotFound desc = could not find container \"2cf5bae9b94d5bab4456c4313f23f13f3683bb800074cf32ea30e5859efb67bd\": container with ID starting with 2cf5bae9b94d5bab4456c4313f23f13f3683bb800074cf32ea30e5859efb67bd not found: ID does not exist"
Apr 16 20:31:58.218000 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.217814 2565 scope.go:117] "RemoveContainer" containerID="fdcdce4015d452d749462968da772236fe5f6e35a33c98b11aa4e424d5035ccf"
Apr 16 20:31:58.218443 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.218058 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdcdce4015d452d749462968da772236fe5f6e35a33c98b11aa4e424d5035ccf"} err="failed to get container status \"fdcdce4015d452d749462968da772236fe5f6e35a33c98b11aa4e424d5035ccf\": rpc error: code = NotFound desc = could not find container \"fdcdce4015d452d749462968da772236fe5f6e35a33c98b11aa4e424d5035ccf\": container with ID starting with fdcdce4015d452d749462968da772236fe5f6e35a33c98b11aa4e424d5035ccf not found: ID does not exist"
Apr 16 20:31:58.218443 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.218074 2565 scope.go:117] "RemoveContainer" containerID="0a14ce85c742f235487cc76398562f67db05b986bcbccd3b4b0bef6f839d9744"
Apr 16 20:31:58.218443 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.218294 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a14ce85c742f235487cc76398562f67db05b986bcbccd3b4b0bef6f839d9744"} err="failed to get container status \"0a14ce85c742f235487cc76398562f67db05b986bcbccd3b4b0bef6f839d9744\": rpc error: code = NotFound desc = could not find container \"0a14ce85c742f235487cc76398562f67db05b986bcbccd3b4b0bef6f839d9744\": container with ID starting with 0a14ce85c742f235487cc76398562f67db05b986bcbccd3b4b0bef6f839d9744 not found: ID does not exist"
Apr 16 20:31:58.218443 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.218310 2565 scope.go:117] "RemoveContainer" containerID="8f3e71fd8ca5b2353c8fe462479d0a9649fbbccbd14e1e77bd639fc0ca03d2b7"
Apr 16 20:31:58.218589 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.218520 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f3e71fd8ca5b2353c8fe462479d0a9649fbbccbd14e1e77bd639fc0ca03d2b7"} err="failed to get container status \"8f3e71fd8ca5b2353c8fe462479d0a9649fbbccbd14e1e77bd639fc0ca03d2b7\": rpc error: code = NotFound desc = could not find container \"8f3e71fd8ca5b2353c8fe462479d0a9649fbbccbd14e1e77bd639fc0ca03d2b7\": container with ID starting with 8f3e71fd8ca5b2353c8fe462479d0a9649fbbccbd14e1e77bd639fc0ca03d2b7 not found: ID does not exist"
Apr 16 20:31:58.218589 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.218551 2565 scope.go:117] "RemoveContainer" containerID="2613c92047f3ad072943a21336e74359ab9a3fc791ecfceb45b786dac36fbcc2"
Apr 16 20:31:58.218768 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.218748 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2613c92047f3ad072943a21336e74359ab9a3fc791ecfceb45b786dac36fbcc2"} err="failed to get container status \"2613c92047f3ad072943a21336e74359ab9a3fc791ecfceb45b786dac36fbcc2\": rpc error: code = NotFound desc = could not find container \"2613c92047f3ad072943a21336e74359ab9a3fc791ecfceb45b786dac36fbcc2\": container with ID starting with 2613c92047f3ad072943a21336e74359ab9a3fc791ecfceb45b786dac36fbcc2 not found: ID does not exist"
Apr 16 20:31:58.218817 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.218768 2565 scope.go:117] "RemoveContainer" containerID="bb1f627b5a7f64b98b09aed90e98a78a955c3a4b978bb69a57a8b089888dc774"
Apr 16 20:31:58.218962 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.218943 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb1f627b5a7f64b98b09aed90e98a78a955c3a4b978bb69a57a8b089888dc774"} err="failed to get container status \"bb1f627b5a7f64b98b09aed90e98a78a955c3a4b978bb69a57a8b089888dc774\": rpc error: code = NotFound desc = could not find container \"bb1f627b5a7f64b98b09aed90e98a78a955c3a4b978bb69a57a8b089888dc774\": container with ID starting with bb1f627b5a7f64b98b09aed90e98a78a955c3a4b978bb69a57a8b089888dc774 not found: ID does not exist"
Apr 16 20:31:58.219002 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.218964 2565 scope.go:117] "RemoveContainer" containerID="da1f9c211d6c48096cc51096d2acff6a2973b99c0a72b2070251695d95224071"
Apr 16 20:31:58.219153 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.219132 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da1f9c211d6c48096cc51096d2acff6a2973b99c0a72b2070251695d95224071"} err="failed to get container status \"da1f9c211d6c48096cc51096d2acff6a2973b99c0a72b2070251695d95224071\": rpc error: code = NotFound desc = could not find container \"da1f9c211d6c48096cc51096d2acff6a2973b99c0a72b2070251695d95224071\": container with ID starting with da1f9c211d6c48096cc51096d2acff6a2973b99c0a72b2070251695d95224071 not found: ID does not exist"
Apr 16 20:31:58.219214 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.219154 2565 scope.go:117] "RemoveContainer" containerID="2cf5bae9b94d5bab4456c4313f23f13f3683bb800074cf32ea30e5859efb67bd"
Apr 16 20:31:58.219389 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.219370 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cf5bae9b94d5bab4456c4313f23f13f3683bb800074cf32ea30e5859efb67bd"} err="failed to get container status \"2cf5bae9b94d5bab4456c4313f23f13f3683bb800074cf32ea30e5859efb67bd\": rpc error: code = NotFound desc = could not find container \"2cf5bae9b94d5bab4456c4313f23f13f3683bb800074cf32ea30e5859efb67bd\": container with ID starting with 2cf5bae9b94d5bab4456c4313f23f13f3683bb800074cf32ea30e5859efb67bd not found: ID does not exist"
Apr 16 20:31:58.219469 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.219392 2565 scope.go:117] "RemoveContainer" containerID="fdcdce4015d452d749462968da772236fe5f6e35a33c98b11aa4e424d5035ccf"
Apr 16 20:31:58.219587 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.219570 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdcdce4015d452d749462968da772236fe5f6e35a33c98b11aa4e424d5035ccf"} err="failed to get container status \"fdcdce4015d452d749462968da772236fe5f6e35a33c98b11aa4e424d5035ccf\": rpc error: code = NotFound desc = could not find container \"fdcdce4015d452d749462968da772236fe5f6e35a33c98b11aa4e424d5035ccf\": container with ID starting with fdcdce4015d452d749462968da772236fe5f6e35a33c98b11aa4e424d5035ccf not found: ID does not exist"
Apr 16 20:31:58.219631 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.219589 2565 scope.go:117] "RemoveContainer" containerID="0a14ce85c742f235487cc76398562f67db05b986bcbccd3b4b0bef6f839d9744"
Apr 16 20:31:58.219767 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.219749 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a14ce85c742f235487cc76398562f67db05b986bcbccd3b4b0bef6f839d9744"} err="failed to get container status \"0a14ce85c742f235487cc76398562f67db05b986bcbccd3b4b0bef6f839d9744\": rpc error: code = NotFound desc = could not find container \"0a14ce85c742f235487cc76398562f67db05b986bcbccd3b4b0bef6f839d9744\": container with ID starting with 0a14ce85c742f235487cc76398562f67db05b986bcbccd3b4b0bef6f839d9744 not found: ID does not exist"
Apr 16 20:31:58.219833 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.219768 2565 scope.go:117] "RemoveContainer" containerID="8f3e71fd8ca5b2353c8fe462479d0a9649fbbccbd14e1e77bd639fc0ca03d2b7"
Apr 16 20:31:58.220011 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.219992 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f3e71fd8ca5b2353c8fe462479d0a9649fbbccbd14e1e77bd639fc0ca03d2b7"} err="failed to get container status \"8f3e71fd8ca5b2353c8fe462479d0a9649fbbccbd14e1e77bd639fc0ca03d2b7\": rpc error: code = NotFound desc = could not find container \"8f3e71fd8ca5b2353c8fe462479d0a9649fbbccbd14e1e77bd639fc0ca03d2b7\": container with ID starting with 8f3e71fd8ca5b2353c8fe462479d0a9649fbbccbd14e1e77bd639fc0ca03d2b7 not found: ID does not exist"
Apr 16 20:31:58.220011 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.220010 2565 scope.go:117] "RemoveContainer" containerID="2613c92047f3ad072943a21336e74359ab9a3fc791ecfceb45b786dac36fbcc2"
Apr 16 20:31:58.220222 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.220206 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2613c92047f3ad072943a21336e74359ab9a3fc791ecfceb45b786dac36fbcc2"} err="failed to get container status \"2613c92047f3ad072943a21336e74359ab9a3fc791ecfceb45b786dac36fbcc2\": rpc error: code = NotFound desc = could not find container \"2613c92047f3ad072943a21336e74359ab9a3fc791ecfceb45b786dac36fbcc2\": container with ID starting with 2613c92047f3ad072943a21336e74359ab9a3fc791ecfceb45b786dac36fbcc2 not found: ID does not exist"
Apr 16 20:31:58.220294 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.220222 2565 scope.go:117] "RemoveContainer" containerID="bb1f627b5a7f64b98b09aed90e98a78a955c3a4b978bb69a57a8b089888dc774"
Apr 16 20:31:58.220421 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.220406 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb1f627b5a7f64b98b09aed90e98a78a955c3a4b978bb69a57a8b089888dc774"} err="failed to get container status \"bb1f627b5a7f64b98b09aed90e98a78a955c3a4b978bb69a57a8b089888dc774\": rpc error: code = NotFound desc = could not find container \"bb1f627b5a7f64b98b09aed90e98a78a955c3a4b978bb69a57a8b089888dc774\": container with ID starting with bb1f627b5a7f64b98b09aed90e98a78a955c3a4b978bb69a57a8b089888dc774 not found: ID does not exist"
Apr 16 20:31:58.220461 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.220420 2565 scope.go:117] "RemoveContainer" containerID="da1f9c211d6c48096cc51096d2acff6a2973b99c0a72b2070251695d95224071"
Apr 16 20:31:58.220611 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.220592 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da1f9c211d6c48096cc51096d2acff6a2973b99c0a72b2070251695d95224071"} err="failed to get container status \"da1f9c211d6c48096cc51096d2acff6a2973b99c0a72b2070251695d95224071\": rpc error: code = NotFound desc = could not find container \"da1f9c211d6c48096cc51096d2acff6a2973b99c0a72b2070251695d95224071\": container with ID starting with da1f9c211d6c48096cc51096d2acff6a2973b99c0a72b2070251695d95224071 not found: ID does not exist"
Apr 16 20:31:58.220664 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.220613 2565 scope.go:117] "RemoveContainer" containerID="2cf5bae9b94d5bab4456c4313f23f13f3683bb800074cf32ea30e5859efb67bd"
Apr 16 20:31:58.220792 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.220775 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cf5bae9b94d5bab4456c4313f23f13f3683bb800074cf32ea30e5859efb67bd"} err="failed to get container status \"2cf5bae9b94d5bab4456c4313f23f13f3683bb800074cf32ea30e5859efb67bd\": rpc error: code = NotFound desc = could not find container \"2cf5bae9b94d5bab4456c4313f23f13f3683bb800074cf32ea30e5859efb67bd\": container with ID starting with 2cf5bae9b94d5bab4456c4313f23f13f3683bb800074cf32ea30e5859efb67bd not found: ID does not exist"
Apr 16 20:31:58.220840 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.220792 2565 scope.go:117] "RemoveContainer" containerID="fdcdce4015d452d749462968da772236fe5f6e35a33c98b11aa4e424d5035ccf"
Apr 16 20:31:58.220970 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.220954 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdcdce4015d452d749462968da772236fe5f6e35a33c98b11aa4e424d5035ccf"} err="failed to get container status \"fdcdce4015d452d749462968da772236fe5f6e35a33c98b11aa4e424d5035ccf\": rpc error: code = NotFound desc = could not find container \"fdcdce4015d452d749462968da772236fe5f6e35a33c98b11aa4e424d5035ccf\": container with ID starting with fdcdce4015d452d749462968da772236fe5f6e35a33c98b11aa4e424d5035ccf not found: ID does not exist"
Apr 16 20:31:58.221007 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.220970 2565 scope.go:117] "RemoveContainer" containerID="0a14ce85c742f235487cc76398562f67db05b986bcbccd3b4b0bef6f839d9744"
Apr 16 20:31:58.221124 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.221104 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a14ce85c742f235487cc76398562f67db05b986bcbccd3b4b0bef6f839d9744"} err="failed to get container status \"0a14ce85c742f235487cc76398562f67db05b986bcbccd3b4b0bef6f839d9744\": rpc error: code = NotFound desc = could not find container \"0a14ce85c742f235487cc76398562f67db05b986bcbccd3b4b0bef6f839d9744\": container with ID starting with 0a14ce85c742f235487cc76398562f67db05b986bcbccd3b4b0bef6f839d9744 not found: ID does not exist"
Apr 16 20:31:58.221170 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.221127 2565 scope.go:117] "RemoveContainer" containerID="8f3e71fd8ca5b2353c8fe462479d0a9649fbbccbd14e1e77bd639fc0ca03d2b7"
Apr 16 20:31:58.221389 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.221373 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f3e71fd8ca5b2353c8fe462479d0a9649fbbccbd14e1e77bd639fc0ca03d2b7"} err="failed to get container status \"8f3e71fd8ca5b2353c8fe462479d0a9649fbbccbd14e1e77bd639fc0ca03d2b7\": rpc error: code = NotFound desc = could not find container \"8f3e71fd8ca5b2353c8fe462479d0a9649fbbccbd14e1e77bd639fc0ca03d2b7\": container with ID starting with 8f3e71fd8ca5b2353c8fe462479d0a9649fbbccbd14e1e77bd639fc0ca03d2b7 not found: ID does not exist"
Apr 16 20:31:58.221389 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.221389 2565 scope.go:117] "RemoveContainer" containerID="2613c92047f3ad072943a21336e74359ab9a3fc791ecfceb45b786dac36fbcc2"
Apr 16 20:31:58.221596 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.221579 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2613c92047f3ad072943a21336e74359ab9a3fc791ecfceb45b786dac36fbcc2"} err="failed to get container status \"2613c92047f3ad072943a21336e74359ab9a3fc791ecfceb45b786dac36fbcc2\": rpc error: code = NotFound desc = could not find container \"2613c92047f3ad072943a21336e74359ab9a3fc791ecfceb45b786dac36fbcc2\": container with ID starting with 2613c92047f3ad072943a21336e74359ab9a3fc791ecfceb45b786dac36fbcc2 not found: ID does not exist"
Apr 16 20:31:58.221645 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.221596 2565 scope.go:117] "RemoveContainer" containerID="bb1f627b5a7f64b98b09aed90e98a78a955c3a4b978bb69a57a8b089888dc774"
Apr 16 20:31:58.221801 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.221785 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb1f627b5a7f64b98b09aed90e98a78a955c3a4b978bb69a57a8b089888dc774"} err="failed to get container status \"bb1f627b5a7f64b98b09aed90e98a78a955c3a4b978bb69a57a8b089888dc774\": rpc error: code = NotFound desc = could not find container \"bb1f627b5a7f64b98b09aed90e98a78a955c3a4b978bb69a57a8b089888dc774\": container with ID starting with bb1f627b5a7f64b98b09aed90e98a78a955c3a4b978bb69a57a8b089888dc774 not found: ID does not exist"
Apr 16 20:31:58.221801 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.221801 2565 scope.go:117] "RemoveContainer" containerID="da1f9c211d6c48096cc51096d2acff6a2973b99c0a72b2070251695d95224071"
Apr 16 20:31:58.221993 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.221976 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da1f9c211d6c48096cc51096d2acff6a2973b99c0a72b2070251695d95224071"} err="failed to get container status \"da1f9c211d6c48096cc51096d2acff6a2973b99c0a72b2070251695d95224071\": rpc error: code = NotFound desc = could not find container \"da1f9c211d6c48096cc51096d2acff6a2973b99c0a72b2070251695d95224071\": container with ID starting with da1f9c211d6c48096cc51096d2acff6a2973b99c0a72b2070251695d95224071 not found: ID does not exist"
Apr 16 20:31:58.222037 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.221994 2565 scope.go:117] "RemoveContainer" containerID="2cf5bae9b94d5bab4456c4313f23f13f3683bb800074cf32ea30e5859efb67bd"
Apr 16 20:31:58.222184 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.222167 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cf5bae9b94d5bab4456c4313f23f13f3683bb800074cf32ea30e5859efb67bd"} err="failed to get container status \"2cf5bae9b94d5bab4456c4313f23f13f3683bb800074cf32ea30e5859efb67bd\": rpc error: code = NotFound desc = could not find container \"2cf5bae9b94d5bab4456c4313f23f13f3683bb800074cf32ea30e5859efb67bd\": container with ID starting with 2cf5bae9b94d5bab4456c4313f23f13f3683bb800074cf32ea30e5859efb67bd not found: ID does not exist"
Apr 16 20:31:58.222184 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.222184 2565 scope.go:117] "RemoveContainer" containerID="fdcdce4015d452d749462968da772236fe5f6e35a33c98b11aa4e424d5035ccf"
Apr 16 20:31:58.222388 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.222367 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdcdce4015d452d749462968da772236fe5f6e35a33c98b11aa4e424d5035ccf"} err="failed to get container status \"fdcdce4015d452d749462968da772236fe5f6e35a33c98b11aa4e424d5035ccf\": rpc error: code = NotFound desc = could not find container \"fdcdce4015d452d749462968da772236fe5f6e35a33c98b11aa4e424d5035ccf\": container with ID starting with fdcdce4015d452d749462968da772236fe5f6e35a33c98b11aa4e424d5035ccf not found: ID does not exist"
Apr 16 20:31:58.222388 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.222387 2565 scope.go:117] "RemoveContainer" containerID="0a14ce85c742f235487cc76398562f67db05b986bcbccd3b4b0bef6f839d9744"
Apr 16 20:31:58.222660 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.222639 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a14ce85c742f235487cc76398562f67db05b986bcbccd3b4b0bef6f839d9744"} err="failed to get container status \"0a14ce85c742f235487cc76398562f67db05b986bcbccd3b4b0bef6f839d9744\": rpc error: code = NotFound desc = could not find container \"0a14ce85c742f235487cc76398562f67db05b986bcbccd3b4b0bef6f839d9744\": container with ID starting with 0a14ce85c742f235487cc76398562f67db05b986bcbccd3b4b0bef6f839d9744 not found: ID does not exist"
Apr 16 20:31:58.222660 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.222657 2565 scope.go:117] "RemoveContainer" containerID="8f3e71fd8ca5b2353c8fe462479d0a9649fbbccbd14e1e77bd639fc0ca03d2b7"
Apr 16 20:31:58.222825 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.222807 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f3e71fd8ca5b2353c8fe462479d0a9649fbbccbd14e1e77bd639fc0ca03d2b7"} err="failed to get container status \"8f3e71fd8ca5b2353c8fe462479d0a9649fbbccbd14e1e77bd639fc0ca03d2b7\": rpc error: code = NotFound desc = could not find container \"8f3e71fd8ca5b2353c8fe462479d0a9649fbbccbd14e1e77bd639fc0ca03d2b7\": container with ID starting with 8f3e71fd8ca5b2353c8fe462479d0a9649fbbccbd14e1e77bd639fc0ca03d2b7 not found: ID does not exist"
Apr 16 20:31:58.222866 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.222828 2565 scope.go:117] "RemoveContainer" containerID="2613c92047f3ad072943a21336e74359ab9a3fc791ecfceb45b786dac36fbcc2"
Apr 16 20:31:58.223008 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.222993 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:31:58.223046 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.223020 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2613c92047f3ad072943a21336e74359ab9a3fc791ecfceb45b786dac36fbcc2"} err="failed to get container status \"2613c92047f3ad072943a21336e74359ab9a3fc791ecfceb45b786dac36fbcc2\": rpc error: code = NotFound desc = could not find container \"2613c92047f3ad072943a21336e74359ab9a3fc791ecfceb45b786dac36fbcc2\": container with ID starting with 2613c92047f3ad072943a21336e74359ab9a3fc791ecfceb45b786dac36fbcc2 not found: ID does not exist"
Apr 16 20:31:58.223046 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.223034 2565 scope.go:117] "RemoveContainer" containerID="bb1f627b5a7f64b98b09aed90e98a78a955c3a4b978bb69a57a8b089888dc774"
Apr 16 20:31:58.223246 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.223214 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb1f627b5a7f64b98b09aed90e98a78a955c3a4b978bb69a57a8b089888dc774"} err="failed to get container status \"bb1f627b5a7f64b98b09aed90e98a78a955c3a4b978bb69a57a8b089888dc774\": rpc error: code = NotFound desc = could not find container \"bb1f627b5a7f64b98b09aed90e98a78a955c3a4b978bb69a57a8b089888dc774\": container with ID starting with bb1f627b5a7f64b98b09aed90e98a78a955c3a4b978bb69a57a8b089888dc774 not found: ID does not exist"
Apr 16 20:31:58.223246 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.223241 2565 scope.go:117] "RemoveContainer" containerID="da1f9c211d6c48096cc51096d2acff6a2973b99c0a72b2070251695d95224071"
Apr 16 20:31:58.223598 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.223574 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da1f9c211d6c48096cc51096d2acff6a2973b99c0a72b2070251695d95224071"} err="failed to get container status \"da1f9c211d6c48096cc51096d2acff6a2973b99c0a72b2070251695d95224071\": rpc error: code = NotFound desc = could not find container \"da1f9c211d6c48096cc51096d2acff6a2973b99c0a72b2070251695d95224071\": container with ID starting with da1f9c211d6c48096cc51096d2acff6a2973b99c0a72b2070251695d95224071 not found: ID does not exist"
Apr 16 20:31:58.223598 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.223598 2565 scope.go:117] "RemoveContainer" containerID="2cf5bae9b94d5bab4456c4313f23f13f3683bb800074cf32ea30e5859efb67bd"
Apr 16 20:31:58.223835 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.223812 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cf5bae9b94d5bab4456c4313f23f13f3683bb800074cf32ea30e5859efb67bd"} err="failed to get container status \"2cf5bae9b94d5bab4456c4313f23f13f3683bb800074cf32ea30e5859efb67bd\": rpc error: code = NotFound desc = could not find container \"2cf5bae9b94d5bab4456c4313f23f13f3683bb800074cf32ea30e5859efb67bd\": container with ID starting with 2cf5bae9b94d5bab4456c4313f23f13f3683bb800074cf32ea30e5859efb67bd not found: ID does not exist"
Apr 16 20:31:58.223883 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.223839 2565 scope.go:117] "RemoveContainer" containerID="fdcdce4015d452d749462968da772236fe5f6e35a33c98b11aa4e424d5035ccf"
Apr 16 20:31:58.224061 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.224043 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdcdce4015d452d749462968da772236fe5f6e35a33c98b11aa4e424d5035ccf"} err="failed to get container status \"fdcdce4015d452d749462968da772236fe5f6e35a33c98b11aa4e424d5035ccf\": rpc error: code = NotFound desc = could not find container \"fdcdce4015d452d749462968da772236fe5f6e35a33c98b11aa4e424d5035ccf\": container with ID starting with fdcdce4015d452d749462968da772236fe5f6e35a33c98b11aa4e424d5035ccf not found: ID does not exist"
Apr 16 20:31:58.225784 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.225764 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 16 20:31:58.225854 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.225793 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 16 20:31:58.225854 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.225823 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 16 20:31:58.225854 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.225764 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 16 20:31:58.226041 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.226019 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 16 20:31:58.226135 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.226055 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 16 20:31:58.226135 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.226055 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 16 20:31:58.226135 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.226115 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-7qv47\""
Apr 16 20:31:58.226135 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.226123 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 16 20:31:58.226411 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.226392 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 16 20:31:58.226489 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.226424 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-eaai3e5b500du\""
Apr 16 20:31:58.226489 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.226440 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 16 20:31:58.237342 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.237322 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 16 20:31:58.238110 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.238085 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 20:31:58.239070 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.239057 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 16 20:31:58.362451 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.362420 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6628500b-21a7-4cc1-ab81-ae7532e675f1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:31:58.362451 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.362457 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName:
\"kubernetes.io/secret/6628500b-21a7-4cc1-ab81-ae7532e675f1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.362696 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.362478 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6628500b-21a7-4cc1-ab81-ae7532e675f1-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.362696 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.362561 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6628500b-21a7-4cc1-ab81-ae7532e675f1-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.362696 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.362592 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6628500b-21a7-4cc1-ab81-ae7532e675f1-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.362696 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.362614 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6628500b-21a7-4cc1-ab81-ae7532e675f1-web-config\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.362696 ip-10-0-139-150 
kubenswrapper[2565]: I0416 20:31:58.362640 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6628500b-21a7-4cc1-ab81-ae7532e675f1-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.362945 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.362712 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6628500b-21a7-4cc1-ab81-ae7532e675f1-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.362945 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.362745 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6628500b-21a7-4cc1-ab81-ae7532e675f1-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.362945 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.362774 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6628500b-21a7-4cc1-ab81-ae7532e675f1-config\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.362945 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.362798 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6628500b-21a7-4cc1-ab81-ae7532e675f1-secret-grpc-tls\") pod 
\"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.362945 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.362835 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6628500b-21a7-4cc1-ab81-ae7532e675f1-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.362945 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.362863 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6628500b-21a7-4cc1-ab81-ae7532e675f1-config-out\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.362945 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.362890 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6628500b-21a7-4cc1-ab81-ae7532e675f1-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.362945 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.362915 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6628500b-21a7-4cc1-ab81-ae7532e675f1-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.362945 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.362941 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6628500b-21a7-4cc1-ab81-ae7532e675f1-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.363208 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.362967 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnkf4\" (UniqueName: \"kubernetes.io/projected/6628500b-21a7-4cc1-ab81-ae7532e675f1-kube-api-access-qnkf4\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.363208 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.363006 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6628500b-21a7-4cc1-ab81-ae7532e675f1-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.463892 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.463819 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6628500b-21a7-4cc1-ab81-ae7532e675f1-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.463892 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.463855 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6628500b-21a7-4cc1-ab81-ae7532e675f1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.463892 ip-10-0-139-150 
kubenswrapper[2565]: I0416 20:31:58.463873 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6628500b-21a7-4cc1-ab81-ae7532e675f1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.464119 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.464078 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6628500b-21a7-4cc1-ab81-ae7532e675f1-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.464172 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.464129 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6628500b-21a7-4cc1-ab81-ae7532e675f1-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.464172 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.464159 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6628500b-21a7-4cc1-ab81-ae7532e675f1-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.464264 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.464190 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6628500b-21a7-4cc1-ab81-ae7532e675f1-web-config\") pod \"prometheus-k8s-0\" (UID: 
\"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.464264 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.464227 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6628500b-21a7-4cc1-ab81-ae7532e675f1-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.464400 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.464263 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6628500b-21a7-4cc1-ab81-ae7532e675f1-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.464400 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.464312 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6628500b-21a7-4cc1-ab81-ae7532e675f1-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.464400 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.464341 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6628500b-21a7-4cc1-ab81-ae7532e675f1-config\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.464400 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.464365 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/6628500b-21a7-4cc1-ab81-ae7532e675f1-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.464600 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.464401 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6628500b-21a7-4cc1-ab81-ae7532e675f1-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.465238 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.465210 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6628500b-21a7-4cc1-ab81-ae7532e675f1-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.465378 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.464406 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6628500b-21a7-4cc1-ab81-ae7532e675f1-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.465378 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.465309 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6628500b-21a7-4cc1-ab81-ae7532e675f1-config-out\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.465378 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.465344 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/6628500b-21a7-4cc1-ab81-ae7532e675f1-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.465378 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.465373 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6628500b-21a7-4cc1-ab81-ae7532e675f1-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.465578 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.465407 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6628500b-21a7-4cc1-ab81-ae7532e675f1-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.465578 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.465439 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qnkf4\" (UniqueName: \"kubernetes.io/projected/6628500b-21a7-4cc1-ab81-ae7532e675f1-kube-api-access-qnkf4\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.467919 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.467212 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6628500b-21a7-4cc1-ab81-ae7532e675f1-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.467919 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.467247 2565 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6628500b-21a7-4cc1-ab81-ae7532e675f1-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.467919 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.467215 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6628500b-21a7-4cc1-ab81-ae7532e675f1-config\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.467919 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.467215 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6628500b-21a7-4cc1-ab81-ae7532e675f1-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.467919 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.467614 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6628500b-21a7-4cc1-ab81-ae7532e675f1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.468236 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.467919 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6628500b-21a7-4cc1-ab81-ae7532e675f1-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.468236 
ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.467995 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6628500b-21a7-4cc1-ab81-ae7532e675f1-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.468236 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.468063 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6628500b-21a7-4cc1-ab81-ae7532e675f1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.468236 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.468227 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6628500b-21a7-4cc1-ab81-ae7532e675f1-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.468490 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.468468 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6628500b-21a7-4cc1-ab81-ae7532e675f1-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.468662 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.468644 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6628500b-21a7-4cc1-ab81-ae7532e675f1-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.469029 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.469009 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6628500b-21a7-4cc1-ab81-ae7532e675f1-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.469402 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.469388 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6628500b-21a7-4cc1-ab81-ae7532e675f1-web-config\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.470137 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.470120 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6628500b-21a7-4cc1-ab81-ae7532e675f1-config-out\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.470704 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.470687 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6628500b-21a7-4cc1-ab81-ae7532e675f1-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.473419 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.473400 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnkf4\" (UniqueName: \"kubernetes.io/projected/6628500b-21a7-4cc1-ab81-ae7532e675f1-kube-api-access-qnkf4\") pod \"prometheus-k8s-0\" (UID: \"6628500b-21a7-4cc1-ab81-ae7532e675f1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 
16 20:31:58.535467 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.535435 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 20:31:58.664126 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:58.664033 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 20:31:58.666153 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:31:58.666123 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6628500b_21a7_4cc1_ab81_ae7532e675f1.slice/crio-944ee8d64ac3b3456aad668110e70617a2261c3182c62d6f06113a6ad0d2da66 WatchSource:0}: Error finding container 944ee8d64ac3b3456aad668110e70617a2261c3182c62d6f06113a6ad0d2da66: Status 404 returned error can't find the container with id 944ee8d64ac3b3456aad668110e70617a2261c3182c62d6f06113a6ad0d2da66 Apr 16 20:31:59.167177 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:59.167143 2565 generic.go:358] "Generic (PLEG): container finished" podID="6628500b-21a7-4cc1-ab81-ae7532e675f1" containerID="c67a0595148ee61ccbc8ff0b2dcb125a6a8261b90217151dad704ab728bd30da" exitCode=0 Apr 16 20:31:59.167592 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:59.167234 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6628500b-21a7-4cc1-ab81-ae7532e675f1","Type":"ContainerDied","Data":"c67a0595148ee61ccbc8ff0b2dcb125a6a8261b90217151dad704ab728bd30da"} Apr 16 20:31:59.167592 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:59.167296 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6628500b-21a7-4cc1-ab81-ae7532e675f1","Type":"ContainerStarted","Data":"944ee8d64ac3b3456aad668110e70617a2261c3182c62d6f06113a6ad0d2da66"} Apr 16 20:31:59.314141 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:31:59.314113 2565 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="c9cbe6a2-2f41-4e82-b07a-84e1361bfca1" path="/var/lib/kubelet/pods/c9cbe6a2-2f41-4e82-b07a-84e1361bfca1/volumes" Apr 16 20:32:00.173537 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:32:00.173509 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6628500b-21a7-4cc1-ab81-ae7532e675f1","Type":"ContainerStarted","Data":"361fd7c3ef7b901ef92ab907132a7126bb39bd90c55579860f9327530d272c0c"} Apr 16 20:32:00.173873 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:32:00.173542 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6628500b-21a7-4cc1-ab81-ae7532e675f1","Type":"ContainerStarted","Data":"a0583f6c51f21084ff5fab34e6a54d10118b30c3025d03407e0f9deb28283829"} Apr 16 20:32:00.173873 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:32:00.173557 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6628500b-21a7-4cc1-ab81-ae7532e675f1","Type":"ContainerStarted","Data":"bb932d58f3e09ef611ee552a7d0598795a967388e3b92840b2e42a72dbefba95"} Apr 16 20:32:00.173873 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:32:00.173568 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6628500b-21a7-4cc1-ab81-ae7532e675f1","Type":"ContainerStarted","Data":"6da47214af4bb9e7841693408fc91943c5da6db4efae434798f677e0c8441e50"} Apr 16 20:32:00.173873 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:32:00.173578 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6628500b-21a7-4cc1-ab81-ae7532e675f1","Type":"ContainerStarted","Data":"b029629ba7b2da5d2be22345b703e39d3f6ec8790d9c7202018a6662729da66e"} Apr 16 20:32:00.173873 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:32:00.173586 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"6628500b-21a7-4cc1-ab81-ae7532e675f1","Type":"ContainerStarted","Data":"1eef3e273486b1b6ec73f999147c5e043040e10b954ba5c669c445a1302b200a"}
Apr 16 20:32:00.199294 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:32:00.199236 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.199219451 podStartE2EDuration="2.199219451s" podCreationTimestamp="2026-04-16 20:31:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:32:00.197739025 +0000 UTC m=+263.465481514" watchObservedRunningTime="2026-04-16 20:32:00.199219451 +0000 UTC m=+263.466961903"
Apr 16 20:32:03.536226 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:32:03.536194 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:32:34.813174 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:32:34.813136 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-8qxb8"]
Apr 16 20:32:34.816796 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:32:34.816774 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8qxb8"
Apr 16 20:32:34.819260 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:32:34.819242 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 20:32:34.823068 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:32:34.823048 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8qxb8"]
Apr 16 20:32:34.863062 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:32:34.863033 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1b7b087b-9564-4949-a98f-ccf2cec67801-dbus\") pod \"global-pull-secret-syncer-8qxb8\" (UID: \"1b7b087b-9564-4949-a98f-ccf2cec67801\") " pod="kube-system/global-pull-secret-syncer-8qxb8"
Apr 16 20:32:34.863216 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:32:34.863083 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1b7b087b-9564-4949-a98f-ccf2cec67801-kubelet-config\") pod \"global-pull-secret-syncer-8qxb8\" (UID: \"1b7b087b-9564-4949-a98f-ccf2cec67801\") " pod="kube-system/global-pull-secret-syncer-8qxb8"
Apr 16 20:32:34.863298 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:32:34.863243 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1b7b087b-9564-4949-a98f-ccf2cec67801-original-pull-secret\") pod \"global-pull-secret-syncer-8qxb8\" (UID: \"1b7b087b-9564-4949-a98f-ccf2cec67801\") " pod="kube-system/global-pull-secret-syncer-8qxb8"
Apr 16 20:32:34.964448 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:32:34.964404 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1b7b087b-9564-4949-a98f-ccf2cec67801-original-pull-secret\") pod \"global-pull-secret-syncer-8qxb8\" (UID: \"1b7b087b-9564-4949-a98f-ccf2cec67801\") " pod="kube-system/global-pull-secret-syncer-8qxb8"
Apr 16 20:32:34.964628 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:32:34.964471 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1b7b087b-9564-4949-a98f-ccf2cec67801-dbus\") pod \"global-pull-secret-syncer-8qxb8\" (UID: \"1b7b087b-9564-4949-a98f-ccf2cec67801\") " pod="kube-system/global-pull-secret-syncer-8qxb8"
Apr 16 20:32:34.964628 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:32:34.964529 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1b7b087b-9564-4949-a98f-ccf2cec67801-kubelet-config\") pod \"global-pull-secret-syncer-8qxb8\" (UID: \"1b7b087b-9564-4949-a98f-ccf2cec67801\") " pod="kube-system/global-pull-secret-syncer-8qxb8"
Apr 16 20:32:34.964628 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:32:34.964614 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1b7b087b-9564-4949-a98f-ccf2cec67801-kubelet-config\") pod \"global-pull-secret-syncer-8qxb8\" (UID: \"1b7b087b-9564-4949-a98f-ccf2cec67801\") " pod="kube-system/global-pull-secret-syncer-8qxb8"
Apr 16 20:32:34.964765 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:32:34.964662 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1b7b087b-9564-4949-a98f-ccf2cec67801-dbus\") pod \"global-pull-secret-syncer-8qxb8\" (UID: \"1b7b087b-9564-4949-a98f-ccf2cec67801\") " pod="kube-system/global-pull-secret-syncer-8qxb8"
Apr 16 20:32:34.966762 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:32:34.966743 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1b7b087b-9564-4949-a98f-ccf2cec67801-original-pull-secret\") pod \"global-pull-secret-syncer-8qxb8\" (UID: \"1b7b087b-9564-4949-a98f-ccf2cec67801\") " pod="kube-system/global-pull-secret-syncer-8qxb8"
Apr 16 20:32:35.127638 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:32:35.127546 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8qxb8"
Apr 16 20:32:35.243553 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:32:35.243523 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8qxb8"]
Apr 16 20:32:35.246570 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:32:35.246542 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b7b087b_9564_4949_a98f_ccf2cec67801.slice/crio-5f466c02925b3e9e6458b66a9bc29366d1f7689b3eac4aed439995f501b6a9bd WatchSource:0}: Error finding container 5f466c02925b3e9e6458b66a9bc29366d1f7689b3eac4aed439995f501b6a9bd: Status 404 returned error can't find the container with id 5f466c02925b3e9e6458b66a9bc29366d1f7689b3eac4aed439995f501b6a9bd
Apr 16 20:32:35.278443 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:32:35.278413 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8qxb8" event={"ID":"1b7b087b-9564-4949-a98f-ccf2cec67801","Type":"ContainerStarted","Data":"5f466c02925b3e9e6458b66a9bc29366d1f7689b3eac4aed439995f501b6a9bd"}
Apr 16 20:32:37.179337 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:32:37.179264 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-shwkq_a53ef938-6712-4133-9657-41ecb93318cf/console-operator/2.log"
Apr 16 20:32:37.181097 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:32:37.181049 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-shwkq_a53ef938-6712-4133-9657-41ecb93318cf/console-operator/2.log"
Apr 16 20:32:37.197096 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:32:37.197076 2565 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 20:32:39.294660 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:32:39.294618 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8qxb8" event={"ID":"1b7b087b-9564-4949-a98f-ccf2cec67801","Type":"ContainerStarted","Data":"42c6db686a49f642246a2078a5eab3b71eb111e51516c1a1316db08a86a620b1"}
Apr 16 20:32:39.309495 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:32:39.309446 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-8qxb8" podStartSLOduration=1.474850215 podStartE2EDuration="5.309430372s" podCreationTimestamp="2026-04-16 20:32:34 +0000 UTC" firstStartedPulling="2026-04-16 20:32:35.248143342 +0000 UTC m=+298.515885776" lastFinishedPulling="2026-04-16 20:32:39.082723497 +0000 UTC m=+302.350465933" observedRunningTime="2026-04-16 20:32:39.307991342 +0000 UTC m=+302.575733787" watchObservedRunningTime="2026-04-16 20:32:39.309430372 +0000 UTC m=+302.577172825"
Apr 16 20:32:58.536057 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:32:58.536016 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:32:58.551306 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:32:58.551264 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:32:59.366605 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:32:59.366576 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 20:33:38.784480 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:33:38.784402 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-pgnhm"]
Apr 16 20:33:38.787587 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:33:38.787571 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-pgnhm"
Apr 16 20:33:38.790020 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:33:38.789995 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 16 20:33:38.790134 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:33:38.790079 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 16 20:33:38.790134 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:33:38.790099 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-fvvtp\""
Apr 16 20:33:38.798195 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:33:38.798175 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-pgnhm"]
Apr 16 20:33:38.857742 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:33:38.857708 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9dcd3c31-4ab6-4bcf-b5d4-b97922621bd6-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-pgnhm\" (UID: \"9dcd3c31-4ab6-4bcf-b5d4-b97922621bd6\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-pgnhm"
Apr 16 20:33:38.857893 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:33:38.857756 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cvnf\" (UniqueName: \"kubernetes.io/projected/9dcd3c31-4ab6-4bcf-b5d4-b97922621bd6-kube-api-access-2cvnf\") pod \"cert-manager-cainjector-8966b78d4-pgnhm\" (UID: \"9dcd3c31-4ab6-4bcf-b5d4-b97922621bd6\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-pgnhm"
Apr 16 20:33:38.959044 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:33:38.959006 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2cvnf\" (UniqueName: \"kubernetes.io/projected/9dcd3c31-4ab6-4bcf-b5d4-b97922621bd6-kube-api-access-2cvnf\") pod \"cert-manager-cainjector-8966b78d4-pgnhm\" (UID: \"9dcd3c31-4ab6-4bcf-b5d4-b97922621bd6\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-pgnhm"
Apr 16 20:33:38.959208 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:33:38.959137 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9dcd3c31-4ab6-4bcf-b5d4-b97922621bd6-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-pgnhm\" (UID: \"9dcd3c31-4ab6-4bcf-b5d4-b97922621bd6\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-pgnhm"
Apr 16 20:33:38.972744 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:33:38.972716 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cvnf\" (UniqueName: \"kubernetes.io/projected/9dcd3c31-4ab6-4bcf-b5d4-b97922621bd6-kube-api-access-2cvnf\") pod \"cert-manager-cainjector-8966b78d4-pgnhm\" (UID: \"9dcd3c31-4ab6-4bcf-b5d4-b97922621bd6\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-pgnhm"
Apr 16 20:33:38.975234 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:33:38.975217 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9dcd3c31-4ab6-4bcf-b5d4-b97922621bd6-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-pgnhm\" (UID: \"9dcd3c31-4ab6-4bcf-b5d4-b97922621bd6\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-pgnhm"
Apr 16 20:33:39.108263 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:33:39.108178 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-pgnhm"
Apr 16 20:33:39.233626 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:33:39.233592 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-pgnhm"]
Apr 16 20:33:39.237342 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:33:39.237314 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9dcd3c31_4ab6_4bcf_b5d4_b97922621bd6.slice/crio-4397b29ce9d092fec16a7d37e1d914ff2337d667bc5e6275774954baa3366c01 WatchSource:0}: Error finding container 4397b29ce9d092fec16a7d37e1d914ff2337d667bc5e6275774954baa3366c01: Status 404 returned error can't find the container with id 4397b29ce9d092fec16a7d37e1d914ff2337d667bc5e6275774954baa3366c01
Apr 16 20:33:39.239037 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:33:39.239018 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 20:33:39.464542 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:33:39.464505 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-pgnhm" event={"ID":"9dcd3c31-4ab6-4bcf-b5d4-b97922621bd6","Type":"ContainerStarted","Data":"4397b29ce9d092fec16a7d37e1d914ff2337d667bc5e6275774954baa3366c01"}
Apr 16 20:33:43.479843 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:33:43.479809 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-pgnhm" event={"ID":"9dcd3c31-4ab6-4bcf-b5d4-b97922621bd6","Type":"ContainerStarted","Data":"0c0836cb0fb532664d0adf4c672bd7386da4d9d2e2034dda30a1fec72a19d008"}
Apr 16 20:33:50.461855 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:33:50.461772 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-pgnhm" podStartSLOduration=9.259047222 podStartE2EDuration="12.461756721s" podCreationTimestamp="2026-04-16 20:33:38 +0000 UTC" firstStartedPulling="2026-04-16 20:33:39.239173419 +0000 UTC m=+362.506915854" lastFinishedPulling="2026-04-16 20:33:42.441882922 +0000 UTC m=+365.709625353" observedRunningTime="2026-04-16 20:33:43.519077077 +0000 UTC m=+366.786819531" watchObservedRunningTime="2026-04-16 20:33:50.461756721 +0000 UTC m=+373.729499173"
Apr 16 20:33:50.462786 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:33:50.462764 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-llx7m"]
Apr 16 20:33:50.466033 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:33:50.466012 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-llx7m"
Apr 16 20:33:50.468650 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:33:50.468630 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-27fxt\""
Apr 16 20:33:50.468734 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:33:50.468651 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 16 20:33:50.469659 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:33:50.469637 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 16 20:33:50.475133 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:33:50.475113 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-llx7m"]
Apr 16 20:33:50.555626 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:33:50.555591 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgrrk\" (UniqueName: \"kubernetes.io/projected/0547fd10-3a65-4235-b38c-d0b80af720dc-kube-api-access-rgrrk\") pod \"openshift-lws-operator-bfc7f696d-llx7m\" (UID: \"0547fd10-3a65-4235-b38c-d0b80af720dc\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-llx7m"
Apr 16 20:33:50.555786 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:33:50.555642 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0547fd10-3a65-4235-b38c-d0b80af720dc-tmp\") pod \"openshift-lws-operator-bfc7f696d-llx7m\" (UID: \"0547fd10-3a65-4235-b38c-d0b80af720dc\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-llx7m"
Apr 16 20:33:50.656365 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:33:50.656332 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rgrrk\" (UniqueName: \"kubernetes.io/projected/0547fd10-3a65-4235-b38c-d0b80af720dc-kube-api-access-rgrrk\") pod \"openshift-lws-operator-bfc7f696d-llx7m\" (UID: \"0547fd10-3a65-4235-b38c-d0b80af720dc\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-llx7m"
Apr 16 20:33:50.656557 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:33:50.656393 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0547fd10-3a65-4235-b38c-d0b80af720dc-tmp\") pod \"openshift-lws-operator-bfc7f696d-llx7m\" (UID: \"0547fd10-3a65-4235-b38c-d0b80af720dc\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-llx7m"
Apr 16 20:33:50.656786 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:33:50.656766 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0547fd10-3a65-4235-b38c-d0b80af720dc-tmp\") pod \"openshift-lws-operator-bfc7f696d-llx7m\" (UID: \"0547fd10-3a65-4235-b38c-d0b80af720dc\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-llx7m"
Apr 16 20:33:50.665240 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:33:50.665200 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgrrk\" (UniqueName: \"kubernetes.io/projected/0547fd10-3a65-4235-b38c-d0b80af720dc-kube-api-access-rgrrk\") pod \"openshift-lws-operator-bfc7f696d-llx7m\" (UID: \"0547fd10-3a65-4235-b38c-d0b80af720dc\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-llx7m"
Apr 16 20:33:50.776166 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:33:50.776090 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-llx7m"
Apr 16 20:33:50.922848 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:33:50.922814 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-llx7m"]
Apr 16 20:33:50.926386 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:33:50.926359 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0547fd10_3a65_4235_b38c_d0b80af720dc.slice/crio-8a98abdfdd830d4ad26280b30ba9a5e4eb43b74e6dc81adb3dedd991d019f85d WatchSource:0}: Error finding container 8a98abdfdd830d4ad26280b30ba9a5e4eb43b74e6dc81adb3dedd991d019f85d: Status 404 returned error can't find the container with id 8a98abdfdd830d4ad26280b30ba9a5e4eb43b74e6dc81adb3dedd991d019f85d
Apr 16 20:33:51.505419 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:33:51.505389 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-llx7m" event={"ID":"0547fd10-3a65-4235-b38c-d0b80af720dc","Type":"ContainerStarted","Data":"8a98abdfdd830d4ad26280b30ba9a5e4eb43b74e6dc81adb3dedd991d019f85d"}
Apr 16 20:33:54.516680 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:33:54.516644 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-llx7m" event={"ID":"0547fd10-3a65-4235-b38c-d0b80af720dc","Type":"ContainerStarted","Data":"7431ba5372fd0aa1c1f6561b137aa9b4406bb03fbbe285243a533cbc6fe969fe"}
Apr 16 20:33:54.532832 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:33:54.532779 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-llx7m" podStartSLOduration=1.814185292 podStartE2EDuration="4.532762319s" podCreationTimestamp="2026-04-16 20:33:50 +0000 UTC" firstStartedPulling="2026-04-16 20:33:50.928208755 +0000 UTC m=+374.195951186" lastFinishedPulling="2026-04-16 20:33:53.646785783 +0000 UTC m=+376.914528213" observedRunningTime="2026-04-16 20:33:54.532314886 +0000 UTC m=+377.800057339" watchObservedRunningTime="2026-04-16 20:33:54.532762319 +0000 UTC m=+377.800504775"
Apr 16 20:34:11.709931 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:11.709890 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6cc777b675-bdjvc"]
Apr 16 20:34:11.782479 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:11.782448 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6cc777b675-bdjvc"]
Apr 16 20:34:11.782636 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:11.782566 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-bdjvc"
Apr 16 20:34:11.787137 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:11.787115 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 16 20:34:11.787652 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:11.787631 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 16 20:34:11.787652 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:11.787642 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 16 20:34:11.787808 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:11.787673 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-k9btq\""
Apr 16 20:34:11.787808 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:11.787677 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 16 20:34:11.927004 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:11.926969 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/557d6b85-3ae0-4346-8a5c-ac3d9fc92eb8-webhook-cert\") pod \"opendatahub-operator-controller-manager-6cc777b675-bdjvc\" (UID: \"557d6b85-3ae0-4346-8a5c-ac3d9fc92eb8\") " pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-bdjvc"
Apr 16 20:34:11.927004 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:11.927007 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl449\" (UniqueName: \"kubernetes.io/projected/557d6b85-3ae0-4346-8a5c-ac3d9fc92eb8-kube-api-access-xl449\") pod \"opendatahub-operator-controller-manager-6cc777b675-bdjvc\" (UID: \"557d6b85-3ae0-4346-8a5c-ac3d9fc92eb8\") " pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-bdjvc"
Apr 16 20:34:11.927231 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:11.927063 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/557d6b85-3ae0-4346-8a5c-ac3d9fc92eb8-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6cc777b675-bdjvc\" (UID: \"557d6b85-3ae0-4346-8a5c-ac3d9fc92eb8\") " pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-bdjvc"
Apr 16 20:34:12.028411 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:12.028324 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/557d6b85-3ae0-4346-8a5c-ac3d9fc92eb8-webhook-cert\") pod \"opendatahub-operator-controller-manager-6cc777b675-bdjvc\" (UID: \"557d6b85-3ae0-4346-8a5c-ac3d9fc92eb8\") " pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-bdjvc"
Apr 16 20:34:12.028411 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:12.028359 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xl449\" (UniqueName: \"kubernetes.io/projected/557d6b85-3ae0-4346-8a5c-ac3d9fc92eb8-kube-api-access-xl449\") pod \"opendatahub-operator-controller-manager-6cc777b675-bdjvc\" (UID: \"557d6b85-3ae0-4346-8a5c-ac3d9fc92eb8\") " pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-bdjvc"
Apr 16 20:34:12.028596 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:12.028436 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/557d6b85-3ae0-4346-8a5c-ac3d9fc92eb8-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6cc777b675-bdjvc\" (UID: \"557d6b85-3ae0-4346-8a5c-ac3d9fc92eb8\") " pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-bdjvc"
Apr 16 20:34:12.031012 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:12.030986 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/557d6b85-3ae0-4346-8a5c-ac3d9fc92eb8-webhook-cert\") pod \"opendatahub-operator-controller-manager-6cc777b675-bdjvc\" (UID: \"557d6b85-3ae0-4346-8a5c-ac3d9fc92eb8\") " pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-bdjvc"
Apr 16 20:34:12.031124 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:12.031054 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/557d6b85-3ae0-4346-8a5c-ac3d9fc92eb8-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6cc777b675-bdjvc\" (UID: \"557d6b85-3ae0-4346-8a5c-ac3d9fc92eb8\") " pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-bdjvc"
Apr 16 20:34:12.040577 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:12.040549 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl449\" (UniqueName: \"kubernetes.io/projected/557d6b85-3ae0-4346-8a5c-ac3d9fc92eb8-kube-api-access-xl449\") pod \"opendatahub-operator-controller-manager-6cc777b675-bdjvc\" (UID: \"557d6b85-3ae0-4346-8a5c-ac3d9fc92eb8\") " pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-bdjvc"
Apr 16 20:34:12.094293 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:12.094238 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-bdjvc"
Apr 16 20:34:12.229154 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:12.229126 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6cc777b675-bdjvc"]
Apr 16 20:34:12.231871 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:34:12.231839 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod557d6b85_3ae0_4346_8a5c_ac3d9fc92eb8.slice/crio-be78d0f11a0ee6fa379a47f8c30b95cbd1a76a2f49aa355e9dc1d42374c232e0 WatchSource:0}: Error finding container be78d0f11a0ee6fa379a47f8c30b95cbd1a76a2f49aa355e9dc1d42374c232e0: Status 404 returned error can't find the container with id be78d0f11a0ee6fa379a47f8c30b95cbd1a76a2f49aa355e9dc1d42374c232e0
Apr 16 20:34:12.570407 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:12.570374 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-bdjvc" event={"ID":"557d6b85-3ae0-4346-8a5c-ac3d9fc92eb8","Type":"ContainerStarted","Data":"be78d0f11a0ee6fa379a47f8c30b95cbd1a76a2f49aa355e9dc1d42374c232e0"}
Apr 16 20:34:15.585763 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:15.585722 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-bdjvc" event={"ID":"557d6b85-3ae0-4346-8a5c-ac3d9fc92eb8","Type":"ContainerStarted","Data":"c713a023dceda9b7e6796de0b5d676f52346e073d01a233754e46c74fbae4834"}
Apr 16 20:34:15.586228 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:15.585856 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-bdjvc"
Apr 16 20:34:15.613210 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:15.613160 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-bdjvc" podStartSLOduration=2.201902915 podStartE2EDuration="4.613149178s" podCreationTimestamp="2026-04-16 20:34:11 +0000 UTC" firstStartedPulling="2026-04-16 20:34:12.233761863 +0000 UTC m=+395.501504293" lastFinishedPulling="2026-04-16 20:34:14.645008118 +0000 UTC m=+397.912750556" observedRunningTime="2026-04-16 20:34:15.611851666 +0000 UTC m=+398.879594119" watchObservedRunningTime="2026-04-16 20:34:15.613149178 +0000 UTC m=+398.880891630"
Apr 16 20:34:17.519258 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:17.519227 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-7b555bff64-fwvlp"]
Apr 16 20:34:17.522693 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:17.522678 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7b555bff64-fwvlp"
Apr 16 20:34:17.525435 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:17.525405 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-drdrr\""
Apr 16 20:34:17.526457 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:17.526434 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 16 20:34:17.526564 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:17.526487 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 16 20:34:17.526564 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:17.526498 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 16 20:34:17.538017 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:17.537993 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7b555bff64-fwvlp"]
Apr 16 20:34:17.678478 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:17.678436 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/43212ae0-1f10-4f82-99c6-ddfe53cb697c-metrics-cert\") pod \"lws-controller-manager-7b555bff64-fwvlp\" (UID: \"43212ae0-1f10-4f82-99c6-ddfe53cb697c\") " pod="openshift-lws-operator/lws-controller-manager-7b555bff64-fwvlp"
Apr 16 20:34:17.678629 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:17.678531 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/43212ae0-1f10-4f82-99c6-ddfe53cb697c-manager-config\") pod \"lws-controller-manager-7b555bff64-fwvlp\" (UID: \"43212ae0-1f10-4f82-99c6-ddfe53cb697c\") " pod="openshift-lws-operator/lws-controller-manager-7b555bff64-fwvlp"
Apr 16 20:34:17.678629 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:17.678577 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-482tk\" (UniqueName: \"kubernetes.io/projected/43212ae0-1f10-4f82-99c6-ddfe53cb697c-kube-api-access-482tk\") pod \"lws-controller-manager-7b555bff64-fwvlp\" (UID: \"43212ae0-1f10-4f82-99c6-ddfe53cb697c\") " pod="openshift-lws-operator/lws-controller-manager-7b555bff64-fwvlp"
Apr 16 20:34:17.678715 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:17.678636 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43212ae0-1f10-4f82-99c6-ddfe53cb697c-cert\") pod \"lws-controller-manager-7b555bff64-fwvlp\" (UID: \"43212ae0-1f10-4f82-99c6-ddfe53cb697c\") " pod="openshift-lws-operator/lws-controller-manager-7b555bff64-fwvlp"
Apr 16 20:34:17.779528 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:17.779427 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/43212ae0-1f10-4f82-99c6-ddfe53cb697c-manager-config\") pod \"lws-controller-manager-7b555bff64-fwvlp\" (UID: \"43212ae0-1f10-4f82-99c6-ddfe53cb697c\") " pod="openshift-lws-operator/lws-controller-manager-7b555bff64-fwvlp"
Apr 16 20:34:17.779528 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:17.779482 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-482tk\" (UniqueName: \"kubernetes.io/projected/43212ae0-1f10-4f82-99c6-ddfe53cb697c-kube-api-access-482tk\") pod \"lws-controller-manager-7b555bff64-fwvlp\" (UID: \"43212ae0-1f10-4f82-99c6-ddfe53cb697c\") " pod="openshift-lws-operator/lws-controller-manager-7b555bff64-fwvlp"
Apr 16 20:34:17.779528 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:17.779527 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43212ae0-1f10-4f82-99c6-ddfe53cb697c-cert\") pod \"lws-controller-manager-7b555bff64-fwvlp\" (UID: \"43212ae0-1f10-4f82-99c6-ddfe53cb697c\") " pod="openshift-lws-operator/lws-controller-manager-7b555bff64-fwvlp"
Apr 16 20:34:17.779808 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:17.779576 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/43212ae0-1f10-4f82-99c6-ddfe53cb697c-metrics-cert\") pod \"lws-controller-manager-7b555bff64-fwvlp\" (UID: \"43212ae0-1f10-4f82-99c6-ddfe53cb697c\") " pod="openshift-lws-operator/lws-controller-manager-7b555bff64-fwvlp"
Apr 16 20:34:17.780225 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:17.780193 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/43212ae0-1f10-4f82-99c6-ddfe53cb697c-manager-config\") pod \"lws-controller-manager-7b555bff64-fwvlp\" (UID: \"43212ae0-1f10-4f82-99c6-ddfe53cb697c\") " pod="openshift-lws-operator/lws-controller-manager-7b555bff64-fwvlp"
Apr 16 20:34:17.782075 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:17.782052 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43212ae0-1f10-4f82-99c6-ddfe53cb697c-cert\") pod \"lws-controller-manager-7b555bff64-fwvlp\" (UID: \"43212ae0-1f10-4f82-99c6-ddfe53cb697c\") " pod="openshift-lws-operator/lws-controller-manager-7b555bff64-fwvlp"
Apr 16 20:34:17.783204 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:17.783178 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/43212ae0-1f10-4f82-99c6-ddfe53cb697c-metrics-cert\") pod \"lws-controller-manager-7b555bff64-fwvlp\" (UID: \"43212ae0-1f10-4f82-99c6-ddfe53cb697c\") " pod="openshift-lws-operator/lws-controller-manager-7b555bff64-fwvlp"
Apr 16 20:34:17.787925 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:17.787902 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-482tk\" (UniqueName: \"kubernetes.io/projected/43212ae0-1f10-4f82-99c6-ddfe53cb697c-kube-api-access-482tk\") pod \"lws-controller-manager-7b555bff64-fwvlp\" (UID: \"43212ae0-1f10-4f82-99c6-ddfe53cb697c\") " pod="openshift-lws-operator/lws-controller-manager-7b555bff64-fwvlp"
Apr 16 20:34:17.832285 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:17.832260 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7b555bff64-fwvlp"
Apr 16 20:34:17.961087 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:17.961064 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7b555bff64-fwvlp"]
Apr 16 20:34:17.963789 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:34:17.963759 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43212ae0_1f10_4f82_99c6_ddfe53cb697c.slice/crio-6dccd157c9c857849550114d58cf0d254543af5bcd5c76395120aed42e8ea18a WatchSource:0}: Error finding container 6dccd157c9c857849550114d58cf0d254543af5bcd5c76395120aed42e8ea18a: Status 404 returned error can't find the container with id 6dccd157c9c857849550114d58cf0d254543af5bcd5c76395120aed42e8ea18a
Apr 16 20:34:18.595401 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:18.595366 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7b555bff64-fwvlp" event={"ID":"43212ae0-1f10-4f82-99c6-ddfe53cb697c","Type":"ContainerStarted","Data":"6dccd157c9c857849550114d58cf0d254543af5bcd5c76395120aed42e8ea18a"}
Apr 16 20:34:19.600676 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:19.600641 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7b555bff64-fwvlp" event={"ID":"43212ae0-1f10-4f82-99c6-ddfe53cb697c","Type":"ContainerStarted","Data":"2b15e66dd757be629d4f083051cbfecda133fb5efb79f1dc6640cab7f71f11f5"}
Apr 16 20:34:19.601098 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:19.600734 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-7b555bff64-fwvlp"
Apr 16 20:34:19.617743 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:19.617694 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openshift-lws-operator/lws-controller-manager-7b555bff64-fwvlp" podStartSLOduration=1.082758957 podStartE2EDuration="2.617677046s" podCreationTimestamp="2026-04-16 20:34:17 +0000 UTC" firstStartedPulling="2026-04-16 20:34:17.965831286 +0000 UTC m=+401.233573717" lastFinishedPulling="2026-04-16 20:34:19.500749361 +0000 UTC m=+402.768491806" observedRunningTime="2026-04-16 20:34:19.616446749 +0000 UTC m=+402.884189194" watchObservedRunningTime="2026-04-16 20:34:19.617677046 +0000 UTC m=+402.885419499" Apr 16 20:34:26.590494 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:26.590462 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-bdjvc" Apr 16 20:34:30.606305 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:30.606257 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-7b555bff64-fwvlp" Apr 16 20:34:31.521384 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:31.521352 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-86666bf97-ddqx5"] Apr 16 20:34:31.524683 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:31.524664 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-86666bf97-ddqx5" Apr 16 20:34:31.528556 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:31.528534 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 20:34:31.528823 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:31.528805 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 20:34:31.528906 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:31.528822 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-7kbg2\"" Apr 16 20:34:31.528906 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:31.528861 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 16 20:34:31.528989 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:31.528913 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 16 20:34:31.533792 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:31.533647 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-86666bf97-ddqx5"] Apr 16 20:34:31.591644 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:31.591615 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cbe64ec3-72ae-4601-bec9-c068f6132c15-tmp\") pod \"kube-auth-proxy-86666bf97-ddqx5\" (UID: \"cbe64ec3-72ae-4601-bec9-c068f6132c15\") " pod="openshift-ingress/kube-auth-proxy-86666bf97-ddqx5" Apr 16 20:34:31.591772 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:31.591657 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c95zx\" (UniqueName: 
\"kubernetes.io/projected/cbe64ec3-72ae-4601-bec9-c068f6132c15-kube-api-access-c95zx\") pod \"kube-auth-proxy-86666bf97-ddqx5\" (UID: \"cbe64ec3-72ae-4601-bec9-c068f6132c15\") " pod="openshift-ingress/kube-auth-proxy-86666bf97-ddqx5" Apr 16 20:34:31.591772 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:31.591716 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cbe64ec3-72ae-4601-bec9-c068f6132c15-tls-certs\") pod \"kube-auth-proxy-86666bf97-ddqx5\" (UID: \"cbe64ec3-72ae-4601-bec9-c068f6132c15\") " pod="openshift-ingress/kube-auth-proxy-86666bf97-ddqx5" Apr 16 20:34:31.693138 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:31.693097 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cbe64ec3-72ae-4601-bec9-c068f6132c15-tmp\") pod \"kube-auth-proxy-86666bf97-ddqx5\" (UID: \"cbe64ec3-72ae-4601-bec9-c068f6132c15\") " pod="openshift-ingress/kube-auth-proxy-86666bf97-ddqx5" Apr 16 20:34:31.693552 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:31.693160 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c95zx\" (UniqueName: \"kubernetes.io/projected/cbe64ec3-72ae-4601-bec9-c068f6132c15-kube-api-access-c95zx\") pod \"kube-auth-proxy-86666bf97-ddqx5\" (UID: \"cbe64ec3-72ae-4601-bec9-c068f6132c15\") " pod="openshift-ingress/kube-auth-proxy-86666bf97-ddqx5" Apr 16 20:34:31.693552 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:31.693196 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cbe64ec3-72ae-4601-bec9-c068f6132c15-tls-certs\") pod \"kube-auth-proxy-86666bf97-ddqx5\" (UID: \"cbe64ec3-72ae-4601-bec9-c068f6132c15\") " pod="openshift-ingress/kube-auth-proxy-86666bf97-ddqx5" Apr 16 20:34:31.695579 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:31.695546 2565 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cbe64ec3-72ae-4601-bec9-c068f6132c15-tmp\") pod \"kube-auth-proxy-86666bf97-ddqx5\" (UID: \"cbe64ec3-72ae-4601-bec9-c068f6132c15\") " pod="openshift-ingress/kube-auth-proxy-86666bf97-ddqx5" Apr 16 20:34:31.695755 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:31.695737 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cbe64ec3-72ae-4601-bec9-c068f6132c15-tls-certs\") pod \"kube-auth-proxy-86666bf97-ddqx5\" (UID: \"cbe64ec3-72ae-4601-bec9-c068f6132c15\") " pod="openshift-ingress/kube-auth-proxy-86666bf97-ddqx5" Apr 16 20:34:31.701079 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:31.701061 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c95zx\" (UniqueName: \"kubernetes.io/projected/cbe64ec3-72ae-4601-bec9-c068f6132c15-kube-api-access-c95zx\") pod \"kube-auth-proxy-86666bf97-ddqx5\" (UID: \"cbe64ec3-72ae-4601-bec9-c068f6132c15\") " pod="openshift-ingress/kube-auth-proxy-86666bf97-ddqx5" Apr 16 20:34:31.837196 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:31.837108 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-86666bf97-ddqx5" Apr 16 20:34:31.957309 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:31.957263 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-86666bf97-ddqx5"] Apr 16 20:34:31.960571 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:34:31.960541 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbe64ec3_72ae_4601_bec9_c068f6132c15.slice/crio-e3e4fed77c5d6a599e6e063be7228514537f1bd3af37c9ea5fbbb7a88e1ff5ba WatchSource:0}: Error finding container e3e4fed77c5d6a599e6e063be7228514537f1bd3af37c9ea5fbbb7a88e1ff5ba: Status 404 returned error can't find the container with id e3e4fed77c5d6a599e6e063be7228514537f1bd3af37c9ea5fbbb7a88e1ff5ba Apr 16 20:34:32.643506 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:32.643470 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-86666bf97-ddqx5" event={"ID":"cbe64ec3-72ae-4601-bec9-c068f6132c15","Type":"ContainerStarted","Data":"e3e4fed77c5d6a599e6e063be7228514537f1bd3af37c9ea5fbbb7a88e1ff5ba"} Apr 16 20:34:35.655907 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:35.655872 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-86666bf97-ddqx5" event={"ID":"cbe64ec3-72ae-4601-bec9-c068f6132c15","Type":"ContainerStarted","Data":"d78b755e9ec04e492e899a0776ebea138f7e597db729a3877f23e498f229c33d"} Apr 16 20:34:35.673830 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:34:35.673773 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-86666bf97-ddqx5" podStartSLOduration=1.79022851 podStartE2EDuration="4.673756735s" podCreationTimestamp="2026-04-16 20:34:31 +0000 UTC" firstStartedPulling="2026-04-16 20:34:31.962327286 +0000 UTC m=+415.230069718" lastFinishedPulling="2026-04-16 20:34:34.845855507 +0000 UTC 
m=+418.113597943" observedRunningTime="2026-04-16 20:34:35.672230736 +0000 UTC m=+418.939973193" watchObservedRunningTime="2026-04-16 20:34:35.673756735 +0000 UTC m=+418.941499188" Apr 16 20:36:19.727608 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:36:19.727576 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-q64l2"] Apr 16 20:36:19.730113 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:36:19.730094 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-q64l2" Apr 16 20:36:19.732667 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:36:19.732629 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 16 20:36:19.732667 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:36:19.732631 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 16 20:36:19.732858 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:36:19.732706 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-w2krl\"" Apr 16 20:36:19.733691 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:36:19.733674 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 20:36:19.733795 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:36:19.733778 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 20:36:19.741374 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:36:19.741356 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-q64l2"] Apr 16 20:36:19.831358 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:36:19.831332 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/01e0bd69-d4a7-46ef-8d6b-5586854ac189-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-q64l2\" (UID: \"01e0bd69-d4a7-46ef-8d6b-5586854ac189\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-q64l2" Apr 16 20:36:19.831482 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:36:19.831371 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/01e0bd69-d4a7-46ef-8d6b-5586854ac189-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-q64l2\" (UID: \"01e0bd69-d4a7-46ef-8d6b-5586854ac189\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-q64l2" Apr 16 20:36:19.831482 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:36:19.831397 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqkm5\" (UniqueName: \"kubernetes.io/projected/01e0bd69-d4a7-46ef-8d6b-5586854ac189-kube-api-access-mqkm5\") pod \"kuadrant-console-plugin-6cb54b5c86-q64l2\" (UID: \"01e0bd69-d4a7-46ef-8d6b-5586854ac189\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-q64l2" Apr 16 20:36:19.932342 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:36:19.932316 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/01e0bd69-d4a7-46ef-8d6b-5586854ac189-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-q64l2\" (UID: \"01e0bd69-d4a7-46ef-8d6b-5586854ac189\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-q64l2" Apr 16 20:36:19.932481 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:36:19.932356 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/01e0bd69-d4a7-46ef-8d6b-5586854ac189-plugin-serving-cert\") pod 
\"kuadrant-console-plugin-6cb54b5c86-q64l2\" (UID: \"01e0bd69-d4a7-46ef-8d6b-5586854ac189\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-q64l2" Apr 16 20:36:19.932481 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:36:19.932384 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mqkm5\" (UniqueName: \"kubernetes.io/projected/01e0bd69-d4a7-46ef-8d6b-5586854ac189-kube-api-access-mqkm5\") pod \"kuadrant-console-plugin-6cb54b5c86-q64l2\" (UID: \"01e0bd69-d4a7-46ef-8d6b-5586854ac189\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-q64l2" Apr 16 20:36:19.933032 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:36:19.933013 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/01e0bd69-d4a7-46ef-8d6b-5586854ac189-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-q64l2\" (UID: \"01e0bd69-d4a7-46ef-8d6b-5586854ac189\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-q64l2" Apr 16 20:36:19.934616 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:36:19.934594 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/01e0bd69-d4a7-46ef-8d6b-5586854ac189-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-q64l2\" (UID: \"01e0bd69-d4a7-46ef-8d6b-5586854ac189\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-q64l2" Apr 16 20:36:19.939616 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:36:19.939598 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqkm5\" (UniqueName: \"kubernetes.io/projected/01e0bd69-d4a7-46ef-8d6b-5586854ac189-kube-api-access-mqkm5\") pod \"kuadrant-console-plugin-6cb54b5c86-q64l2\" (UID: \"01e0bd69-d4a7-46ef-8d6b-5586854ac189\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-q64l2" Apr 16 20:36:20.042036 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:36:20.041962 2565 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-q64l2" Apr 16 20:36:20.160671 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:36:20.160583 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-q64l2"] Apr 16 20:36:20.163499 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:36:20.163471 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01e0bd69_d4a7_46ef_8d6b_5586854ac189.slice/crio-3fec4199b0d04228e2f2fe5de3ff46819915b46f2f7eccd6d31597421ab9ac6f WatchSource:0}: Error finding container 3fec4199b0d04228e2f2fe5de3ff46819915b46f2f7eccd6d31597421ab9ac6f: Status 404 returned error can't find the container with id 3fec4199b0d04228e2f2fe5de3ff46819915b46f2f7eccd6d31597421ab9ac6f Apr 16 20:36:21.009199 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:36:21.009164 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-q64l2" event={"ID":"01e0bd69-d4a7-46ef-8d6b-5586854ac189","Type":"ContainerStarted","Data":"3fec4199b0d04228e2f2fe5de3ff46819915b46f2f7eccd6d31597421ab9ac6f"} Apr 16 20:36:46.101121 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:36:46.101084 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-q64l2" event={"ID":"01e0bd69-d4a7-46ef-8d6b-5586854ac189","Type":"ContainerStarted","Data":"cd386c8bd048b91424736813b69e2dd6634a7adc7bc8a5a4cc53618a2523c368"} Apr 16 20:36:46.116963 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:36:46.116910 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-q64l2" podStartSLOduration=1.95569609 podStartE2EDuration="27.116897051s" podCreationTimestamp="2026-04-16 20:36:19 +0000 UTC" firstStartedPulling="2026-04-16 20:36:20.164849838 +0000 UTC 
m=+523.432592270" lastFinishedPulling="2026-04-16 20:36:45.3260508 +0000 UTC m=+548.593793231" observedRunningTime="2026-04-16 20:36:46.115420411 +0000 UTC m=+549.383162867" watchObservedRunningTime="2026-04-16 20:36:46.116897051 +0000 UTC m=+549.384639504" Apr 16 20:37:03.239357 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:37:03.239324 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"] Apr 16 20:37:03.242828 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:37:03.242805 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-cx7lq" Apr 16 20:37:03.245328 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:37:03.245308 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 16 20:37:03.250730 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:37:03.250708 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"] Apr 16 20:37:03.261442 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:37:03.261419 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"] Apr 16 20:37:03.318514 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:37:03.318493 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7b85\" (UniqueName: \"kubernetes.io/projected/90566346-6d7a-42ea-b45e-3f1113131883-kube-api-access-c7b85\") pod \"limitador-limitador-78c99df468-cx7lq\" (UID: \"90566346-6d7a-42ea-b45e-3f1113131883\") " pod="kuadrant-system/limitador-limitador-78c99df468-cx7lq" Apr 16 20:37:03.318608 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:37:03.318538 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: 
\"kubernetes.io/configmap/90566346-6d7a-42ea-b45e-3f1113131883-config-file\") pod \"limitador-limitador-78c99df468-cx7lq\" (UID: \"90566346-6d7a-42ea-b45e-3f1113131883\") " pod="kuadrant-system/limitador-limitador-78c99df468-cx7lq" Apr 16 20:37:03.419244 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:37:03.419219 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c7b85\" (UniqueName: \"kubernetes.io/projected/90566346-6d7a-42ea-b45e-3f1113131883-kube-api-access-c7b85\") pod \"limitador-limitador-78c99df468-cx7lq\" (UID: \"90566346-6d7a-42ea-b45e-3f1113131883\") " pod="kuadrant-system/limitador-limitador-78c99df468-cx7lq" Apr 16 20:37:03.419353 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:37:03.419265 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/90566346-6d7a-42ea-b45e-3f1113131883-config-file\") pod \"limitador-limitador-78c99df468-cx7lq\" (UID: \"90566346-6d7a-42ea-b45e-3f1113131883\") " pod="kuadrant-system/limitador-limitador-78c99df468-cx7lq" Apr 16 20:37:03.419852 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:37:03.419835 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/90566346-6d7a-42ea-b45e-3f1113131883-config-file\") pod \"limitador-limitador-78c99df468-cx7lq\" (UID: \"90566346-6d7a-42ea-b45e-3f1113131883\") " pod="kuadrant-system/limitador-limitador-78c99df468-cx7lq" Apr 16 20:37:03.427332 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:37:03.427300 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7b85\" (UniqueName: \"kubernetes.io/projected/90566346-6d7a-42ea-b45e-3f1113131883-kube-api-access-c7b85\") pod \"limitador-limitador-78c99df468-cx7lq\" (UID: \"90566346-6d7a-42ea-b45e-3f1113131883\") " pod="kuadrant-system/limitador-limitador-78c99df468-cx7lq" Apr 16 20:37:03.554111 ip-10-0-139-150 
kubenswrapper[2565]: I0416 20:37:03.554053 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-cx7lq" Apr 16 20:37:03.669472 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:37:03.669446 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"] Apr 16 20:37:03.671524 ip-10-0-139-150 kubenswrapper[2565]: W0416 20:37:03.671495 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90566346_6d7a_42ea_b45e_3f1113131883.slice/crio-3989ae86779e0d8a0a808736241292ad80603191e2f953dd251ab8b88f64daff WatchSource:0}: Error finding container 3989ae86779e0d8a0a808736241292ad80603191e2f953dd251ab8b88f64daff: Status 404 returned error can't find the container with id 3989ae86779e0d8a0a808736241292ad80603191e2f953dd251ab8b88f64daff Apr 16 20:37:04.162550 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:37:04.162512 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-cx7lq" event={"ID":"90566346-6d7a-42ea-b45e-3f1113131883","Type":"ContainerStarted","Data":"3989ae86779e0d8a0a808736241292ad80603191e2f953dd251ab8b88f64daff"} Apr 16 20:37:07.174225 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:37:07.174173 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-cx7lq" event={"ID":"90566346-6d7a-42ea-b45e-3f1113131883","Type":"ContainerStarted","Data":"aea26ea9f22aa95412917a680f783f1a5726a30b53e6570a43f27161ba5c28a2"} Apr 16 20:37:07.174616 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:37:07.174305 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-cx7lq" Apr 16 20:37:07.192438 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:37:07.192388 2565 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="kuadrant-system/limitador-limitador-78c99df468-cx7lq" podStartSLOduration=1.724867622 podStartE2EDuration="4.19237708s" podCreationTimestamp="2026-04-16 20:37:03 +0000 UTC" firstStartedPulling="2026-04-16 20:37:03.6733361 +0000 UTC m=+566.941078530" lastFinishedPulling="2026-04-16 20:37:06.140845548 +0000 UTC m=+569.408587988" observedRunningTime="2026-04-16 20:37:07.190454188 +0000 UTC m=+570.458196640" watchObservedRunningTime="2026-04-16 20:37:07.19237708 +0000 UTC m=+570.460119544" Apr 16 20:37:18.178901 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:37:18.178875 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-cx7lq" Apr 16 20:37:37.212140 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:37:37.212115 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-shwkq_a53ef938-6712-4133-9657-41ecb93318cf/console-operator/2.log" Apr 16 20:37:37.215288 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:37:37.215254 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-shwkq_a53ef938-6712-4133-9657-41ecb93318cf/console-operator/2.log" Apr 16 20:37:38.488593 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:37:38.488563 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"] Apr 16 20:38:18.615830 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:38:18.615797 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"] Apr 16 20:38:31.090730 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:38:31.090696 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"] Apr 16 20:38:51.892697 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:38:51.892661 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"] Apr 16 20:38:58.187950 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:38:58.187919 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"] Apr 16 20:39:31.101408 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:39:31.101330 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"] Apr 16 20:39:36.691615 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:39:36.691582 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"] Apr 16 20:40:27.394009 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:40:27.393981 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"] Apr 16 20:40:37.699889 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:40:37.699854 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"] Apr 16 20:40:45.608321 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:40:45.608285 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"] Apr 16 20:40:56.997236 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:40:56.997158 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"] Apr 16 20:41:05.704375 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:41:05.704337 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"] Apr 16 20:41:16.487547 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:41:16.487510 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"] Apr 16 20:42:19.986334 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:42:19.986291 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:42:35.585962 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:42:35.585888 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:42:37.243065 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:42:37.243036 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-shwkq_a53ef938-6712-4133-9657-41ecb93318cf/console-operator/2.log"
Apr 16 20:42:37.244933 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:42:37.244908 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-shwkq_a53ef938-6712-4133-9657-41ecb93318cf/console-operator/2.log"
Apr 16 20:43:13.987077 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:43:13.986950 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:43:31.088237 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:43:31.088194 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:43:45.488035 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:43:45.488001 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:44:01.390648 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:44:01.390546 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:44:53.784224 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:44:53.784189 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:45:02.292456 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:45:02.292418 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:45:19.390895 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:45:19.390866 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:45:27.198335 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:45:27.198240 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:45:43.801379 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:45:43.801343 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:45:52.399214 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:45:52.399174 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:46:25.292829 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:46:25.292799 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:46:34.183490 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:46:34.183453 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:46:41.688991 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:46:41.688958 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:46:50.489042 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:46:50.489007 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:46:59.130131 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:46:59.130050 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:47:16.090092 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:47:16.090049 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:47:29.206350 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:47:29.206309 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:47:37.268389 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:47:37.268357 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-shwkq_a53ef938-6712-4133-9657-41ecb93318cf/console-operator/2.log"
Apr 16 20:47:37.270990 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:47:37.270967 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-shwkq_a53ef938-6712-4133-9657-41ecb93318cf/console-operator/2.log"
Apr 16 20:48:16.492667 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:48:16.492634 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:48:24.399069 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:48:24.399036 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:48:33.886036 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:48:33.885948 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:48:42.106582 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:48:42.106546 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:48:51.285399 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:48:51.285363 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:48:59.898670 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:48:59.898627 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:49:08.888685 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:49:08.888650 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:49:17.789708 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:49:17.789673 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:49:26.592159 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:49:26.592124 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:49:34.586069 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:49:34.586035 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:49:43.805604 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:49:43.805558 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:49:52.396470 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:49:52.396432 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:50:00.887361 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:50:00.887286 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:50:10.190801 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:50:10.190763 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:50:19.383011 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:50:19.382976 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:50:26.395215 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:50:26.395180 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:50:35.398223 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:50:35.398188 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:50:44.685331 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:50:44.685296 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:52:37.294697 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:52:37.294668 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-shwkq_a53ef938-6712-4133-9657-41ecb93318cf/console-operator/2.log"
Apr 16 20:52:37.299034 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:52:37.299007 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-shwkq_a53ef938-6712-4133-9657-41ecb93318cf/console-operator/2.log"
Apr 16 20:53:02.401687 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:53:02.401601 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:53:07.000690 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:53:07.000659 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:53:32.701663 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:53:32.701627 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:53:39.082441 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:53:39.082405 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:53:49.327718 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:53:49.327688 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:53:59.597422 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:53:59.597390 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:54:08.193644 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:54:08.193605 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:54:18.389720 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:54:18.389682 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:54:27.187694 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:54:27.187611 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:54:37.686726 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:54:37.686690 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:54:47.196433 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:54:47.196391 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:54:57.290158 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:54:57.290121 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:55:06.486063 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:55:06.486027 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:55:39.497196 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:55:39.497160 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:56:21.496520 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:56:21.496443 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:56:30.781093 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:56:30.781053 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:56:39.991856 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:56:39.991823 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:56:47.982654 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:56:47.982623 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:56:56.693864 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:56:56.693821 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:57:07.588027 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:57:07.587984 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:57:16.489075 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:57:16.489038 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:57:23.892776 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:57:23.892741 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:57:32.796738 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:57:32.796705 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:57:37.321093 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:57:37.321068 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-shwkq_a53ef938-6712-4133-9657-41ecb93318cf/console-operator/2.log"
Apr 16 20:57:37.324568 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:57:37.324542 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-shwkq_a53ef938-6712-4133-9657-41ecb93318cf/console-operator/2.log"
Apr 16 20:57:40.194907 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:57:40.194864 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:57:49.589517 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:57:49.589486 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:58:01.682329 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:58:01.682294 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:58:18.595060 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:58:18.595024 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:58:27.391752 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:58:27.391720 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:58:36.091544 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:58:36.091501 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:58:44.285826 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:58:44.285790 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:59:01.794390 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:59:01.794288 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:59:10.008598 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:59:10.008560 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:59:18.987889 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:59:18.987848 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:59:27.185008 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:59:27.184972 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:59:35.782053 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:59:35.782014 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:59:45.281411 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:59:45.281375 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 20:59:56.487553 ip-10-0-139-150 kubenswrapper[2565]: I0416 20:59:56.487521 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 21:00:07.193105 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:00:07.193072 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 21:00:15.994713 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:00:15.994678 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 21:00:29.122682 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:00:29.122600 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 21:00:39.021712 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:00:39.021680 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 21:00:46.701250 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:00:46.701214 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 21:00:54.622357 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:00:54.622320 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 21:01:02.422608 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:01:02.422576 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 21:01:18.910999 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:01:18.910964 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 21:01:28.639556 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:01:28.639525 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 21:01:36.790401 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:01:36.790366 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 21:01:44.794771 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:01:44.790384 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 21:02:09.630636 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:09.630552 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 21:02:21.396116 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:21.396084 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx7lq"]
Apr 16 21:02:27.803688 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:27.803656 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6cc777b675-bdjvc_557d6b85-3ae0-4346-8a5c-ac3d9fc92eb8/manager/0.log"
Apr 16 21:02:29.770227 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:29.770196 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-q64l2_01e0bd69-d4a7-46ef-8d6b-5586854ac189/kuadrant-console-plugin/0.log"
Apr 16 21:02:30.155426 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:30.155356 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-cx7lq_90566346-6d7a-42ea-b45e-3f1113131883/limitador/0.log"
Apr 16 21:02:31.002540 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:31.002509 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-86666bf97-ddqx5_cbe64ec3-72ae-4601-bec9-c068f6132c15/kube-auth-proxy/0.log"
Apr 16 21:02:37.346715 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:37.346686 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-shwkq_a53ef938-6712-4133-9657-41ecb93318cf/console-operator/2.log"
Apr 16 21:02:37.352215 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:37.352181 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-shwkq_a53ef938-6712-4133-9657-41ecb93318cf/console-operator/2.log"
Apr 16 21:02:39.472162 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:39.472131 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-8qxb8_1b7b087b-9564-4949-a98f-ccf2cec67801/global-pull-secret-syncer/0.log"
Apr 16 21:02:39.659251 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:39.659223 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-z6p8j_d95a170f-35f2-4732-bbb5-f8e2f6768efc/konnectivity-agent/0.log"
Apr 16 21:02:39.733912 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:39.733841 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-150.ec2.internal_babf34b6062f196401e1c6b676b7c7db/haproxy/0.log"
Apr 16 21:02:44.135979 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:44.135955 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-q64l2_01e0bd69-d4a7-46ef-8d6b-5586854ac189/kuadrant-console-plugin/0.log"
Apr 16 21:02:44.288254 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:44.288218 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-cx7lq_90566346-6d7a-42ea-b45e-3f1113131883/limitador/0.log"
Apr 16 21:02:45.821388 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:45.821362 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d5b60c80-add7-47a6-96f9-55ecb755c5d5/alertmanager/0.log"
Apr 16 21:02:45.857652 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:45.857621 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d5b60c80-add7-47a6-96f9-55ecb755c5d5/config-reloader/0.log"
Apr 16 21:02:45.893111 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:45.893055 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d5b60c80-add7-47a6-96f9-55ecb755c5d5/kube-rbac-proxy-web/0.log"
Apr 16 21:02:45.928843 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:45.928821 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d5b60c80-add7-47a6-96f9-55ecb755c5d5/kube-rbac-proxy/0.log"
Apr 16 21:02:45.952591 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:45.952568 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d5b60c80-add7-47a6-96f9-55ecb755c5d5/kube-rbac-proxy-metric/0.log"
Apr 16 21:02:45.979811 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:45.979795 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d5b60c80-add7-47a6-96f9-55ecb755c5d5/prom-label-proxy/0.log"
Apr 16 21:02:46.001246 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:46.001228 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d5b60c80-add7-47a6-96f9-55ecb755c5d5/init-config-reloader/0.log"
Apr 16 21:02:46.079557 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:46.079522 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-rffls_390e92b4-466b-4c16-8ae4-95b92628be89/kube-state-metrics/0.log"
Apr 16 21:02:46.098698 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:46.098671 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-rffls_390e92b4-466b-4c16-8ae4-95b92628be89/kube-rbac-proxy-main/0.log"
Apr 16 21:02:46.117619 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:46.117597 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-rffls_390e92b4-466b-4c16-8ae4-95b92628be89/kube-rbac-proxy-self/0.log"
Apr 16 21:02:46.141849 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:46.141824 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-5bdd65d46d-bpprw_501c40e3-b641-489e-b6a9-5314521662b0/metrics-server/0.log"
Apr 16 21:02:46.171765 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:46.171742 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-hn9tf_81f9c12b-ce8f-4485-bd78-ddb944272d6b/monitoring-plugin/0.log"
Apr 16 21:02:46.360695 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:46.360668 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vwwqh_c0695fb2-0f19-4495-91ba-ec07f29dbbc0/node-exporter/0.log"
Apr 16 21:02:46.379984 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:46.379959 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vwwqh_c0695fb2-0f19-4495-91ba-ec07f29dbbc0/kube-rbac-proxy/0.log"
Apr 16 21:02:46.402464 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:46.402444 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vwwqh_c0695fb2-0f19-4495-91ba-ec07f29dbbc0/init-textfile/0.log"
Apr 16 21:02:46.526077 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:46.526002 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6628500b-21a7-4cc1-ab81-ae7532e675f1/prometheus/0.log"
Apr 16 21:02:46.547240 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:46.547217 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6628500b-21a7-4cc1-ab81-ae7532e675f1/config-reloader/0.log"
Apr 16 21:02:46.570600 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:46.570580 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6628500b-21a7-4cc1-ab81-ae7532e675f1/thanos-sidecar/0.log"
Apr 16 21:02:46.591738 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:46.591717 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6628500b-21a7-4cc1-ab81-ae7532e675f1/kube-rbac-proxy-web/0.log"
Apr 16 21:02:46.611524 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:46.611506 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6628500b-21a7-4cc1-ab81-ae7532e675f1/kube-rbac-proxy/0.log"
Apr 16 21:02:46.634435 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:46.634410 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6628500b-21a7-4cc1-ab81-ae7532e675f1/kube-rbac-proxy-thanos/0.log"
Apr 16 21:02:46.656506 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:46.656489 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6628500b-21a7-4cc1-ab81-ae7532e675f1/init-config-reloader/0.log"
Apr 16 21:02:46.763067 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:46.763046 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-775c5544d8-4gndf_e4a3994e-037a-4c38-bdcb-1fb57bfec1e8/telemeter-client/0.log"
Apr 16 21:02:46.784735 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:46.784671 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-775c5544d8-4gndf_e4a3994e-037a-4c38-bdcb-1fb57bfec1e8/reload/0.log"
Apr 16 21:02:46.813081 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:46.813042 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-775c5544d8-4gndf_e4a3994e-037a-4c38-bdcb-1fb57bfec1e8/kube-rbac-proxy/0.log"
Apr 16 21:02:46.842602 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:46.842581 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7f76dcb489-d8kdg_34913715-f62a-4682-9a2d-a76391be2411/thanos-query/0.log"
Apr 16 21:02:46.862813 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:46.862787 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7f76dcb489-d8kdg_34913715-f62a-4682-9a2d-a76391be2411/kube-rbac-proxy-web/0.log"
Apr 16 21:02:46.882795 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:46.882775 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7f76dcb489-d8kdg_34913715-f62a-4682-9a2d-a76391be2411/kube-rbac-proxy/0.log"
Apr 16 21:02:46.903798 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:46.903769 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7f76dcb489-d8kdg_34913715-f62a-4682-9a2d-a76391be2411/prom-label-proxy/0.log"
Apr 16 21:02:46.925842 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:46.925825 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7f76dcb489-d8kdg_34913715-f62a-4682-9a2d-a76391be2411/kube-rbac-proxy-rules/0.log"
Apr 16 21:02:46.948883 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:46.948862 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7f76dcb489-d8kdg_34913715-f62a-4682-9a2d-a76391be2411/kube-rbac-proxy-metrics/0.log"
Apr 16 21:02:47.940478 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:47.940447 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hdxxb/perf-node-gather-daemonset-w7vcv"]
Apr 16 21:02:47.944039 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:47.944017 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-w7vcv"
Apr 16 21:02:47.946928 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:47.946905 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-hdxxb\"/\"kube-root-ca.crt\""
Apr 16 21:02:47.947843 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:47.947824 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-hdxxb\"/\"openshift-service-ca.crt\""
Apr 16 21:02:47.947979 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:47.947859 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-hdxxb\"/\"default-dockercfg-brd7j\""
Apr 16 21:02:47.952194 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:47.952175 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hdxxb/perf-node-gather-daemonset-w7vcv"]
Apr 16 21:02:47.970604 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:47.970582 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/41aa120f-8451-4a4f-924c-fee90390f51d-proc\") pod \"perf-node-gather-daemonset-w7vcv\" (UID: \"41aa120f-8451-4a4f-924c-fee90390f51d\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-w7vcv"
Apr 16 21:02:47.970696 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:47.970645 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/41aa120f-8451-4a4f-924c-fee90390f51d-lib-modules\") pod \"perf-node-gather-daemonset-w7vcv\" (UID: \"41aa120f-8451-4a4f-924c-fee90390f51d\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-w7vcv"
Apr 16 21:02:47.970696 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:47.970689 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/41aa120f-8451-4a4f-924c-fee90390f51d-podres\") pod \"perf-node-gather-daemonset-w7vcv\" (UID: \"41aa120f-8451-4a4f-924c-fee90390f51d\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-w7vcv"
Apr 16 21:02:47.970769 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:47.970713 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/41aa120f-8451-4a4f-924c-fee90390f51d-sys\") pod \"perf-node-gather-daemonset-w7vcv\" (UID: \"41aa120f-8451-4a4f-924c-fee90390f51d\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-w7vcv"
Apr 16 21:02:47.970769 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:47.970734 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhsjz\" (UniqueName: \"kubernetes.io/projected/41aa120f-8451-4a4f-924c-fee90390f51d-kube-api-access-dhsjz\") pod \"perf-node-gather-daemonset-w7vcv\" (UID: \"41aa120f-8451-4a4f-924c-fee90390f51d\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-w7vcv"
Apr 16 21:02:48.071871 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:48.071848 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/41aa120f-8451-4a4f-924c-fee90390f51d-sys\") pod \"perf-node-gather-daemonset-w7vcv\" (UID: \"41aa120f-8451-4a4f-924c-fee90390f51d\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-w7vcv"
Apr 16 21:02:48.071993 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:48.071881 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dhsjz\" (UniqueName: \"kubernetes.io/projected/41aa120f-8451-4a4f-924c-fee90390f51d-kube-api-access-dhsjz\") pod \"perf-node-gather-daemonset-w7vcv\" (UID: \"41aa120f-8451-4a4f-924c-fee90390f51d\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-w7vcv"
Apr 16 21:02:48.071993 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:48.071936 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/41aa120f-8451-4a4f-924c-fee90390f51d-proc\") pod \"perf-node-gather-daemonset-w7vcv\" (UID: \"41aa120f-8451-4a4f-924c-fee90390f51d\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-w7vcv"
Apr 16 21:02:48.071993 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:48.071979 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/41aa120f-8451-4a4f-924c-fee90390f51d-sys\") pod \"perf-node-gather-daemonset-w7vcv\" (UID: \"41aa120f-8451-4a4f-924c-fee90390f51d\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-w7vcv"
Apr 16 21:02:48.072102 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:48.072036 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/41aa120f-8451-4a4f-924c-fee90390f51d-proc\") pod \"perf-node-gather-daemonset-w7vcv\" (UID: \"41aa120f-8451-4a4f-924c-fee90390f51d\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-w7vcv"
Apr 16 21:02:48.072102 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:48.072037 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/41aa120f-8451-4a4f-924c-fee90390f51d-lib-modules\") pod \"perf-node-gather-daemonset-w7vcv\" (UID: \"41aa120f-8451-4a4f-924c-fee90390f51d\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-w7vcv"
Apr 16 21:02:48.072102 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:48.072081 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/41aa120f-8451-4a4f-924c-fee90390f51d-podres\") pod \"perf-node-gather-daemonset-w7vcv\" (UID: \"41aa120f-8451-4a4f-924c-fee90390f51d\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-w7vcv"
Apr 16 21:02:48.072224 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:48.072134 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/41aa120f-8451-4a4f-924c-fee90390f51d-lib-modules\") pod \"perf-node-gather-daemonset-w7vcv\" (UID: \"41aa120f-8451-4a4f-924c-fee90390f51d\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-w7vcv"
Apr 16 21:02:48.072224 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:48.072165 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/41aa120f-8451-4a4f-924c-fee90390f51d-podres\") pod \"perf-node-gather-daemonset-w7vcv\" (UID: \"41aa120f-8451-4a4f-924c-fee90390f51d\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-w7vcv"
Apr 16 21:02:48.080478 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:48.080461 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhsjz\" (UniqueName: \"kubernetes.io/projected/41aa120f-8451-4a4f-924c-fee90390f51d-kube-api-access-dhsjz\") pod \"perf-node-gather-daemonset-w7vcv\" (UID: \"41aa120f-8451-4a4f-924c-fee90390f51d\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-w7vcv"
Apr 16 21:02:48.255373 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:48.255314 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-w7vcv"
Apr 16 21:02:48.378507 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:48.378484 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hdxxb/perf-node-gather-daemonset-w7vcv"]
Apr 16 21:02:48.381042 ip-10-0-139-150 kubenswrapper[2565]: W0416 21:02:48.381013 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod41aa120f_8451_4a4f_924c_fee90390f51d.slice/crio-a963e096206a07b25901000015b84131d1917db5a9f2720acaed86999944f523 WatchSource:0}: Error finding container a963e096206a07b25901000015b84131d1917db5a9f2720acaed86999944f523: Status 404 returned error can't find the container with id a963e096206a07b25901000015b84131d1917db5a9f2720acaed86999944f523
Apr 16 21:02:48.382680 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:48.382662 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 21:02:48.404325 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:48.404303 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-w7vcv" event={"ID":"41aa120f-8451-4a4f-924c-fee90390f51d","Type":"ContainerStarted","Data":"a963e096206a07b25901000015b84131d1917db5a9f2720acaed86999944f523"}
Apr 16 21:02:48.533833 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:48.533759 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-shwkq_a53ef938-6712-4133-9657-41ecb93318cf/console-operator/2.log"
Apr 16 21:02:48.538324 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:48.538303 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-shwkq_a53ef938-6712-4133-9657-41ecb93318cf/console-operator/3.log"
Apr 16 21:02:49.409187 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:49.409155 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-w7vcv" event={"ID":"41aa120f-8451-4a4f-924c-fee90390f51d","Type":"ContainerStarted","Data":"166f16bb04a34ff818f9775310e5e9a915ec594b4f4509edb713bfa6dbf00dab"}
Apr 16 21:02:49.409559 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:49.409205 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-w7vcv"
Apr 16 21:02:49.426804 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:49.426756 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-w7vcv" podStartSLOduration=2.426741548 podStartE2EDuration="2.426741548s" podCreationTimestamp="2026-04-16 21:02:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:02:49.425334054 +0000 UTC m=+2112.693076508" watchObservedRunningTime="2026-04-16 21:02:49.426741548 +0000 UTC m=+2112.694484000"
Apr 16 21:02:49.479378 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:49.479357 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-5dss8_505458c7-696b-4e52-94fd-8c6620a8cf96/volume-data-source-validator/0.log"
Apr 16 21:02:50.178364 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:50.178339 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9gqr8_ec99b398-3371-4b5d-b2f7-ce06fed2c67c/dns/0.log"
Apr 16 21:02:50.198329 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:50.198300 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9gqr8_ec99b398-3371-4b5d-b2f7-ce06fed2c67c/kube-rbac-proxy/0.log"
Apr 16 21:02:50.305662 ip-10-0-139-150
kubenswrapper[2565]: I0416 21:02:50.305639 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7qfs5_22753460-b103-4969-8aca-1ea39040795b/dns-node-resolver/0.log" Apr 16 21:02:50.921958 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:50.921930 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-n2tj2_72204d28-677e-4d89-a353-b087ce28c38f/node-ca/0.log" Apr 16 21:02:51.941178 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:51.941147 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-86666bf97-ddqx5_cbe64ec3-72ae-4601-bec9-c068f6132c15/kube-auth-proxy/0.log" Apr 16 21:02:52.545611 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:52.545581 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-bcshd_394dcb43-4d46-4c81-bcad-73d0aadfc01c/serve-healthcheck-canary/0.log" Apr 16 21:02:53.067043 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:53.067016 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-l9zgr_60ac5c46-c967-4000-bd8b-4f0c90324ecb/insights-operator/0.log" Apr 16 21:02:53.067461 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:53.067092 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-l9zgr_60ac5c46-c967-4000-bd8b-4f0c90324ecb/insights-operator/1.log" Apr 16 21:02:53.088717 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:53.088692 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2fvld_94d85daa-e508-46da-a759-2a5e804e6a61/kube-rbac-proxy/0.log" Apr 16 21:02:53.118312 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:53.118269 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2fvld_94d85daa-e508-46da-a759-2a5e804e6a61/exporter/0.log" Apr 16 
21:02:53.138460 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:53.138444 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2fvld_94d85daa-e508-46da-a759-2a5e804e6a61/extractor/0.log" Apr 16 21:02:55.392930 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:55.392904 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6cc777b675-bdjvc_557d6b85-3ae0-4346-8a5c-ac3d9fc92eb8/manager/0.log" Apr 16 21:02:55.422764 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:55.422738 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-w7vcv" Apr 16 21:02:56.724607 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:56.724582 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-7b555bff64-fwvlp_43212ae0-1f10-4f82-99c6-ddfe53cb697c/manager/0.log" Apr 16 21:02:56.786251 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:02:56.786225 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-llx7m_0547fd10-3a65-4235-b38c-d0b80af720dc/openshift-lws-operator/0.log" Apr 16 21:03:01.151898 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:03:01.151862 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-d6hrz_99557dd7-ec96-4c42-9baf-cbe25a9d29da/migrator/0.log" Apr 16 21:03:01.171607 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:03:01.171581 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-d6hrz_99557dd7-ec96-4c42-9baf-cbe25a9d29da/graceful-termination/0.log" Apr 16 21:03:02.722967 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:03:02.722940 2565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hthg8_1aac658f-e4b3-4b53-a125-6cae725d6fcd/kube-multus-additional-cni-plugins/0.log" Apr 16 21:03:02.749960 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:03:02.749933 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hthg8_1aac658f-e4b3-4b53-a125-6cae725d6fcd/egress-router-binary-copy/0.log" Apr 16 21:03:02.770820 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:03:02.770795 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hthg8_1aac658f-e4b3-4b53-a125-6cae725d6fcd/cni-plugins/0.log" Apr 16 21:03:02.789907 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:03:02.789885 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hthg8_1aac658f-e4b3-4b53-a125-6cae725d6fcd/bond-cni-plugin/0.log" Apr 16 21:03:02.809343 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:03:02.809319 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hthg8_1aac658f-e4b3-4b53-a125-6cae725d6fcd/routeoverride-cni/0.log" Apr 16 21:03:02.830877 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:03:02.830855 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hthg8_1aac658f-e4b3-4b53-a125-6cae725d6fcd/whereabouts-cni-bincopy/0.log" Apr 16 21:03:02.857978 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:03:02.857951 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hthg8_1aac658f-e4b3-4b53-a125-6cae725d6fcd/whereabouts-cni/0.log" Apr 16 21:03:03.221308 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:03:03.221265 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cb7jp_dd8e84a9-042c-4346-8ef7-68dcca064683/kube-multus/0.log" Apr 16 21:03:03.298312 ip-10-0-139-150 
kubenswrapper[2565]: I0416 21:03:03.298266 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-jzdqd_90b993f2-207d-4894-bbdf-e2219dbf690b/network-metrics-daemon/0.log" Apr 16 21:03:03.319400 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:03:03.319380 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-jzdqd_90b993f2-207d-4894-bbdf-e2219dbf690b/kube-rbac-proxy/0.log" Apr 16 21:03:04.208783 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:03:04.208746 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-59rwg_97e2c184-5abb-438f-8f9b-2df48f93e465/ovn-controller/0.log" Apr 16 21:03:04.238697 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:03:04.238663 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-59rwg_97e2c184-5abb-438f-8f9b-2df48f93e465/ovn-acl-logging/0.log" Apr 16 21:03:04.257092 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:03:04.257055 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-59rwg_97e2c184-5abb-438f-8f9b-2df48f93e465/kube-rbac-proxy-node/0.log" Apr 16 21:03:04.277608 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:03:04.277586 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-59rwg_97e2c184-5abb-438f-8f9b-2df48f93e465/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 21:03:04.294026 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:03:04.293987 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-59rwg_97e2c184-5abb-438f-8f9b-2df48f93e465/northd/0.log" Apr 16 21:03:04.316883 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:03:04.316863 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-59rwg_97e2c184-5abb-438f-8f9b-2df48f93e465/nbdb/0.log" Apr 16 21:03:04.336660 
ip-10-0-139-150 kubenswrapper[2565]: I0416 21:03:04.336634 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-59rwg_97e2c184-5abb-438f-8f9b-2df48f93e465/sbdb/0.log" Apr 16 21:03:04.431948 ip-10-0-139-150 kubenswrapper[2565]: I0416 21:03:04.431916 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-59rwg_97e2c184-5abb-438f-8f9b-2df48f93e465/ovnkube-controller/0.log"