Apr 16 18:07:52.410354 ip-10-0-139-96 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 18:07:52.410365 ip-10-0-139-96 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 18:07:52.410372 ip-10-0-139-96 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 18:07:52.410579 ip-10-0-139-96 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 18:08:03.864270 ip-10-0-139-96 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 18:08:03.864290 ip-10-0-139-96 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 7119c587cca440da978ea4b3fdbe3535 --
Apr 16 18:10:40.695980 ip-10-0-139-96 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 18:10:41.070465 ip-10-0-139-96 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:10:41.070465 ip-10-0-139-96 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 18:10:41.070465 ip-10-0-139-96 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:10:41.070465 ip-10-0-139-96 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 18:10:41.070465 ip-10-0-139-96 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:10:41.072535 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.072372 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 18:10:41.077967 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.077947 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:10:41.077967 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.077964 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:10:41.077967 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.077968 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:10:41.077967 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.077972 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:10:41.078119 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.077976 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:10:41.078119 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.077982 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:10:41.078119 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.077987 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:10:41.078119 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.077991 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:10:41.078119 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.077994 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:10:41.078119 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.077997 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:10:41.078119 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078000 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:10:41.078119 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078003 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:10:41.078119 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078006 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:10:41.078119 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078015 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:10:41.078119 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078018 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:10:41.078119 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078021 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:10:41.078119 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078024 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:10:41.078119 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078026 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:10:41.078119 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078032 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:10:41.078119 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078035 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:10:41.078119 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078038 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:10:41.078119 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078041 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:10:41.078119 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078044 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:10:41.078119 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078047 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:10:41.078603 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078049 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:10:41.078603 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078052 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:10:41.078603 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078055 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:10:41.078603 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078057 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:10:41.078603 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078060 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:10:41.078603 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078063 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:10:41.078603 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078068 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:10:41.078603 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078071 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:10:41.078603 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078073 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:10:41.078603 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078076 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:10:41.078603 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078079 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:10:41.078603 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078082 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:10:41.078603 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078084 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:10:41.078603 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078089 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:10:41.078603 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078092 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:10:41.078603 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078095 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:10:41.078603 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078097 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:10:41.078603 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078100 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:10:41.078603 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078103 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:10:41.078603 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078108 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:10:41.079103 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078111 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:10:41.079103 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078113 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:10:41.079103 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078116 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:10:41.079103 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078119 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:10:41.079103 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078122 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:10:41.079103 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078124 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:10:41.079103 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078127 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:10:41.079103 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078130 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:10:41.079103 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078133 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:10:41.079103 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078136 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:10:41.079103 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078140 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:10:41.079103 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078146 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:10:41.079103 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078148 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:10:41.079103 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078151 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:10:41.079103 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078154 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:10:41.079103 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078157 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:10:41.079103 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078160 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:10:41.079103 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078162 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:10:41.079103 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078165 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:10:41.079103 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078168 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:10:41.079632 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078170 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:10:41.079632 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078173 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:10:41.079632 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078176 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:10:41.079632 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078178 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:10:41.079632 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078183 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:10:41.079632 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078188 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:10:41.079632 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078192 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:10:41.079632 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078194 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:10:41.079632 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078197 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:10:41.079632 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078200 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:10:41.079632 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078203 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:10:41.079632 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078214 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:10:41.079632 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078217 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:10:41.079632 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078220 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:10:41.079632 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078223 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:10:41.079632 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078228 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:10:41.079632 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078231 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:10:41.079632 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078233 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:10:41.079632 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078236 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:10:41.079632 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078239 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:10:41.080322 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078241 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:10:41.080322 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078245 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:10:41.080322 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078924 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:10:41.080322 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078931 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:10:41.080322 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078934 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:10:41.080322 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078937 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:10:41.080322 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078941 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:10:41.080322 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078944 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:10:41.080322 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078947 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:10:41.080322 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078950 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:10:41.080322 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078952 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:10:41.080322 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078958 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:10:41.080322 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078961 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:10:41.080322 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078964 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:10:41.080322 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078966 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:10:41.080322 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078969 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:10:41.080322 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078973 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:10:41.080322 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078978 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:10:41.080322 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078982 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:10:41.080817 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078992 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:10:41.080817 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078995 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:10:41.080817 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.078998 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:10:41.080817 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079002 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:10:41.080817 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079007 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:10:41.080817 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079010 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:10:41.080817 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079013 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:10:41.080817 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079016 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:10:41.080817 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079020 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:10:41.080817 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079023 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:10:41.080817 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079026 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:10:41.080817 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079028 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:10:41.080817 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079031 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:10:41.080817 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079034 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:10:41.080817 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079036 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:10:41.080817 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079039 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:10:41.080817 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079044 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:10:41.080817 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079047 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:10:41.080817 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079050 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:10:41.080817 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079054 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:10:41.081331 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079056 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:10:41.081331 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079059 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:10:41.081331 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079061 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:10:41.081331 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079064 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:10:41.081331 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079067 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:10:41.081331 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079069 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:10:41.081331 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079072 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:10:41.081331 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079075 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:10:41.081331 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079080 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:10:41.081331 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079082 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:10:41.081331 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079086 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:10:41.081331 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079089 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:10:41.081331 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079092 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:10:41.081331 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079095 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:10:41.081331 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079098 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:10:41.081331 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079101 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:10:41.081331 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079103 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:10:41.081331 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079107 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:10:41.081331 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079110 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:10:41.081331 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079113 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:10:41.081842 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079116 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:10:41.081842 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079121 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:10:41.081842 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079124 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:10:41.081842 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079126 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:10:41.081842 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079129 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:10:41.081842 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079132 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:10:41.081842 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079135 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:10:41.081842 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079138 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:10:41.081842 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079140 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:10:41.081842 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079143 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:10:41.081842 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079145 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:10:41.081842 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079148 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:10:41.081842 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079151 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:10:41.081842 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079157 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:10:41.081842 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079160 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:10:41.081842 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079162 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:10:41.081842 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079165 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:10:41.081842 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079168 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:10:41.081842 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079170 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:10:41.082298 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079173 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:10:41.082298 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079176 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:10:41.082298 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079178 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:10:41.082298 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079182 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:10:41.082298 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079185 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:10:41.082298 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079188 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:10:41.082298 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079193 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:10:41.082298 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079196 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:10:41.082298 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079198 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:10:41.082298 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.079201 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:10:41.082298 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.079873 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 18:10:41.082298 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080027 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 18:10:41.082298 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080035 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 18:10:41.082298 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080040 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 18:10:41.082298 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080045 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 18:10:41.082298 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080048 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 18:10:41.082298 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080053 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 18:10:41.082298 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080057 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 18:10:41.082298 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080060 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 18:10:41.082298 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080063 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 18:10:41.082298 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080067 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 18:10:41.082824 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080070 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 18:10:41.082824 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080073 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 18:10:41.082824 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080076 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 16 18:10:41.082824 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080079 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 18:10:41.082824 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080082 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 16 18:10:41.082824 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080085 2576 flags.go:64] FLAG: --cloud-config=""
Apr 16 18:10:41.082824 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080088 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 18:10:41.082824 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080090 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 18:10:41.082824 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080095 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 16 18:10:41.082824 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080098 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 18:10:41.082824 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080101 2576 flags.go:64] FLAG: --config-dir=""
Apr 16 18:10:41.082824 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080104 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 18:10:41.082824 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080108 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 18:10:41.082824 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080112 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 18:10:41.082824 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080115 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 18:10:41.082824 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080118 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 18:10:41.082824 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080122 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 18:10:41.082824 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080124 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 18:10:41.082824 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080128 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 18:10:41.082824 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080131 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 18:10:41.082824 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080134 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 18:10:41.082824 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080139 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 18:10:41.082824 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080144 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 18:10:41.082824 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080147 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 18:10:41.082824 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080151 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 18:10:41.083432 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080154 2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 18:10:41.083432 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080159 2576 flags.go:64] FLAG: --enable-server="true"
Apr 16 18:10:41.083432 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080162 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 18:10:41.083432 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080167 2576 flags.go:64] FLAG: --event-burst="100"
Apr 16 18:10:41.083432 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080172 2576 flags.go:64] FLAG: --event-qps="50"
Apr 16 18:10:41.083432 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080175 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 18:10:41.083432 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080179 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 18:10:41.083432 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080184 2576 flags.go:64] FLAG: --eviction-hard=""
Apr 16 18:10:41.083432 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080189 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 18:10:41.083432 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080192 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 18:10:41.083432 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080195 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 18:10:41.083432 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080198 2576 flags.go:64] FLAG: --eviction-soft=""
Apr 16 18:10:41.083432 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080203 2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 18:10:41.083432 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080205 2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 18:10:41.083432 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080208 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 18:10:41.083432 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080213 2576 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 18:10:41.083432 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080216 2576 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 18:10:41.083432 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080220 2576 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 18:10:41.083432 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080223 2576 flags.go:64] FLAG: --feature-gates=""
Apr 16 18:10:41.083432 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080229 2576 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 18:10:41.083432 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080232 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 
18:10:41.083432 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080237 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 18:10:41.083432 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080240 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 18:10:41.083432 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080243 2576 flags.go:64] FLAG: --healthz-port="10248" Apr 16 18:10:41.083432 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080248 2576 flags.go:64] FLAG: --help="false" Apr 16 18:10:41.084067 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080251 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-139-96.ec2.internal" Apr 16 18:10:41.084067 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080256 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 18:10:41.084067 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080259 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 18:10:41.084067 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080262 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 18:10:41.084067 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080267 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 18:10:41.084067 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080272 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 18:10:41.084067 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080275 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 18:10:41.084067 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080278 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 18:10:41.084067 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080281 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 18:10:41.084067 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080287 2576 flags.go:64] 
FLAG: --kube-api-burst="100" Apr 16 18:10:41.084067 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080291 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 18:10:41.084067 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080296 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 18:10:41.084067 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080298 2576 flags.go:64] FLAG: --kube-reserved="" Apr 16 18:10:41.084067 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080302 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 18:10:41.084067 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080304 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 18:10:41.084067 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080307 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 18:10:41.084067 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080310 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 18:10:41.084067 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080313 2576 flags.go:64] FLAG: --lock-file="" Apr 16 18:10:41.084067 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080316 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 18:10:41.084067 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080318 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 18:10:41.084067 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080321 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 18:10:41.084067 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080327 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 18:10:41.084067 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080330 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 18:10:41.084645 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080333 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 18:10:41.084645 ip-10-0-139-96 kubenswrapper[2576]: I0416 
18:10:41.080336 2576 flags.go:64] FLAG: --logging-format="text" Apr 16 18:10:41.084645 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080339 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 18:10:41.084645 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080342 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 18:10:41.084645 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080345 2576 flags.go:64] FLAG: --manifest-url="" Apr 16 18:10:41.084645 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080348 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 16 18:10:41.084645 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080353 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 18:10:41.084645 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080357 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 18:10:41.084645 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080361 2576 flags.go:64] FLAG: --max-pods="110" Apr 16 18:10:41.084645 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080364 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 18:10:41.084645 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080367 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 18:10:41.084645 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080370 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 18:10:41.084645 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080373 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 18:10:41.084645 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080376 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 18:10:41.084645 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080378 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 18:10:41.084645 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080382 2576 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 18:10:41.084645 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080390 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 18:10:41.084645 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080393 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 18:10:41.084645 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080396 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 18:10:41.084645 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080399 2576 flags.go:64] FLAG: --pod-cidr="" Apr 16 18:10:41.084645 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080402 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 18:10:41.084645 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080408 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 18:10:41.084645 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080411 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 18:10:41.084645 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080414 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 16 18:10:41.085239 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080417 2576 flags.go:64] FLAG: --port="10250" Apr 16 18:10:41.085239 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080420 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 18:10:41.085239 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080423 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0689c7eafa09eca3a" Apr 16 18:10:41.085239 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080426 2576 flags.go:64] FLAG: --qos-reserved="" Apr 16 18:10:41.085239 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080429 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 16 18:10:41.085239 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080432 2576 
flags.go:64] FLAG: --register-node="true" Apr 16 18:10:41.085239 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080435 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 16 18:10:41.085239 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080438 2576 flags.go:64] FLAG: --register-with-taints="" Apr 16 18:10:41.085239 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080442 2576 flags.go:64] FLAG: --registry-burst="10" Apr 16 18:10:41.085239 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080444 2576 flags.go:64] FLAG: --registry-qps="5" Apr 16 18:10:41.085239 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080447 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 16 18:10:41.085239 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080450 2576 flags.go:64] FLAG: --reserved-memory="" Apr 16 18:10:41.085239 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080454 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 18:10:41.085239 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080457 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 18:10:41.085239 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080460 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 18:10:41.085239 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080463 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 18:10:41.085239 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080466 2576 flags.go:64] FLAG: --runonce="false" Apr 16 18:10:41.085239 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080468 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 18:10:41.085239 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080471 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 18:10:41.085239 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080474 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 16 18:10:41.085239 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080477 2576 
flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 18:10:41.085239 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080480 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 18:10:41.085239 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080482 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 18:10:41.085239 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080487 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 18:10:41.085239 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080490 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 18:10:41.085239 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080493 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 18:10:41.085879 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080496 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 18:10:41.085879 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080499 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 18:10:41.085879 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080502 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 18:10:41.085879 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080505 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 18:10:41.085879 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080508 2576 flags.go:64] FLAG: --system-cgroups="" Apr 16 18:10:41.085879 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080511 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 18:10:41.085879 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080517 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 18:10:41.085879 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080520 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 16 18:10:41.085879 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080523 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 
18:10:41.085879 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080527 2576 flags.go:64] FLAG: --tls-min-version="" Apr 16 18:10:41.085879 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080530 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 18:10:41.085879 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080532 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 18:10:41.085879 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080535 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 18:10:41.085879 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080538 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 18:10:41.085879 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080541 2576 flags.go:64] FLAG: --v="2" Apr 16 18:10:41.085879 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080545 2576 flags.go:64] FLAG: --version="false" Apr 16 18:10:41.085879 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080549 2576 flags.go:64] FLAG: --vmodule="" Apr 16 18:10:41.085879 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080554 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 18:10:41.085879 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.080557 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 18:10:41.085879 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080655 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:10:41.085879 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080661 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:10:41.085879 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080664 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:10:41.085879 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080667 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:10:41.085879 ip-10-0-139-96 
kubenswrapper[2576]: W0416 18:10:41.080671 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 18:10:41.086477 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080675 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:10:41.086477 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080678 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:10:41.086477 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080681 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:10:41.086477 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080684 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:10:41.086477 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080687 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:10:41.086477 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080691 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:10:41.086477 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080694 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:10:41.086477 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080711 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:10:41.086477 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080714 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:10:41.086477 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080717 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:10:41.086477 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080720 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:10:41.086477 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080723 2576 feature_gate.go:328] unrecognized 
feature gate: AWSDedicatedHosts Apr 16 18:10:41.086477 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080726 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:10:41.086477 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080729 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:10:41.086477 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080732 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:10:41.086477 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080734 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:10:41.086477 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080737 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:10:41.086477 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080739 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:10:41.086477 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080742 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:10:41.086477 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080745 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:10:41.087016 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080747 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:10:41.087016 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080750 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:10:41.087016 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080761 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:10:41.087016 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080764 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:10:41.087016 ip-10-0-139-96 kubenswrapper[2576]: W0416 
18:10:41.080767 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:10:41.087016 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080770 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:10:41.087016 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080772 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:10:41.087016 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080775 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:10:41.087016 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080779 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:10:41.087016 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080781 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:10:41.087016 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080784 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:10:41.087016 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080786 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:10:41.087016 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080789 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:10:41.087016 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080791 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:10:41.087016 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080794 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:10:41.087016 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080796 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:10:41.087016 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080799 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 
18:10:41.087016 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080804 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:10:41.087016 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080806 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:10:41.087016 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080809 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:10:41.087503 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080811 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:10:41.087503 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080814 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:10:41.087503 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080831 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 18:10:41.087503 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080837 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:10:41.087503 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080841 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:10:41.087503 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080844 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:10:41.087503 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080847 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:10:41.087503 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080850 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:10:41.087503 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080853 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:10:41.087503 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080856 2576 
feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:10:41.087503 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080859 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:10:41.087503 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080862 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:10:41.087503 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080865 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:10:41.087503 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080867 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:10:41.087503 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080870 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:10:41.087503 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080872 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:10:41.087503 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080875 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:10:41.087503 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080877 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:10:41.087503 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080880 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:10:41.087503 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080882 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:10:41.088066 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080886 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:10:41.088066 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080889 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:10:41.088066 ip-10-0-139-96 kubenswrapper[2576]: W0416 
18:10:41.080892 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:10:41.088066 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080894 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:10:41.088066 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080897 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:10:41.088066 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080900 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:10:41.088066 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080902 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:10:41.088066 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080905 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:10:41.088066 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080908 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:10:41.088066 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080915 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:10:41.088066 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080917 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:10:41.088066 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080920 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:10:41.088066 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080922 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:10:41.088066 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080925 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:10:41.088066 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080928 2576 feature_gate.go:328] unrecognized feature 
gate: AWSClusterHostedDNS Apr 16 18:10:41.088066 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080931 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:10:41.088066 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080933 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:10:41.088066 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080936 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:10:41.088066 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080938 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:10:41.088528 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080941 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:10:41.088528 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.080943 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:10:41.088528 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.081835 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 18:10:41.088528 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.088024 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 18:10:41.088528 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.088040 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 18:10:41.088528 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088094 2576 feature_gate.go:328] unrecognized feature gate: 
NetworkLiveMigration Apr 16 18:10:41.088528 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088100 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:10:41.088528 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088103 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:10:41.088528 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088107 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:10:41.088528 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088111 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:10:41.088528 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088114 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:10:41.088528 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088117 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:10:41.088528 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088120 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:10:41.088528 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088123 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:10:41.088528 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088126 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:10:41.088528 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088128 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:10:41.088942 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088131 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:10:41.088942 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088134 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:10:41.088942 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088137 
2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:10:41.088942 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088140 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:10:41.088942 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088142 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:10:41.088942 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088145 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:10:41.088942 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088148 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:10:41.088942 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088150 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:10:41.088942 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088153 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:10:41.088942 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088155 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:10:41.088942 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088158 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:10:41.088942 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088161 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:10:41.088942 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088163 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:10:41.088942 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088166 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:10:41.088942 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088169 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:10:41.088942 
ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088172 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:10:41.088942 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088174 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:10:41.088942 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088177 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:10:41.088942 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088180 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:10:41.088942 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088182 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:10:41.089510 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088186 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:10:41.089510 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088188 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:10:41.089510 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088191 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:10:41.089510 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088194 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:10:41.089510 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088196 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:10:41.089510 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088199 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:10:41.089510 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088201 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:10:41.089510 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088204 2576 feature_gate.go:328] unrecognized feature gate: 
PreconfiguredUDNAddresses Apr 16 18:10:41.089510 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088207 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:10:41.089510 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088209 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:10:41.089510 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088213 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 18:10:41.089510 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088218 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:10:41.089510 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088221 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:10:41.089510 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088224 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:10:41.089510 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088227 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:10:41.089510 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088229 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:10:41.089510 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088232 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:10:41.089510 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088235 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:10:41.089510 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088238 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:10:41.090017 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088241 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:10:41.090017 
ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088244 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:10:41.090017 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088247 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:10:41.090017 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088250 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:10:41.090017 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088253 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:10:41.090017 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088256 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:10:41.090017 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088258 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:10:41.090017 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088261 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:10:41.090017 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088264 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:10:41.090017 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088266 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:10:41.090017 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088269 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:10:41.090017 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088272 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:10:41.090017 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088274 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:10:41.090017 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088278 2576 
feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:10:41.090017 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088280 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:10:41.090017 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088284 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 18:10:41.090017 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088288 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:10:41.090017 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088291 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:10:41.090017 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088294 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:10:41.090478 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088296 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:10:41.090478 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088299 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:10:41.090478 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088301 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:10:41.090478 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088304 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:10:41.090478 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088306 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:10:41.090478 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088309 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:10:41.090478 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088312 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:10:41.090478 
ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088314 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:10:41.090478 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088318 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:10:41.090478 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088321 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:10:41.090478 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088323 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:10:41.090478 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088326 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:10:41.090478 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088329 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:10:41.090478 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088331 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:10:41.090478 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088334 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:10:41.090478 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088337 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:10:41.090478 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088339 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:10:41.090910 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.088345 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true 
TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 18:10:41.090910 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088438 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:10:41.090910 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088442 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:10:41.090910 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088445 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:10:41.090910 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088448 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:10:41.090910 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088451 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:10:41.090910 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088454 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:10:41.090910 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088457 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:10:41.090910 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088460 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:10:41.090910 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088463 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:10:41.090910 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088466 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:10:41.090910 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088469 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:10:41.090910 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088472 2576 feature_gate.go:328] unrecognized feature gate: 
ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:10:41.090910 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088474 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:10:41.090910 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088477 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:10:41.091275 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088479 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:10:41.091275 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088482 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:10:41.091275 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088485 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:10:41.091275 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088487 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:10:41.091275 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088490 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:10:41.091275 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088492 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:10:41.091275 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088495 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:10:41.091275 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088497 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:10:41.091275 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088500 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:10:41.091275 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088503 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:10:41.091275 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088505 
2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:10:41.091275 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088508 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:10:41.091275 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088510 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:10:41.091275 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088513 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:10:41.091275 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088515 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:10:41.091275 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088518 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:10:41.091275 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088521 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:10:41.091275 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088523 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:10:41.091275 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088526 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:10:41.091275 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088529 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 18:10:41.091822 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088533 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:10:41.091822 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088536 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:10:41.091822 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088539 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:10:41.091822 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088542 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:10:41.091822 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088545 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:10:41.091822 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088547 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:10:41.091822 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088550 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:10:41.091822 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088554 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:10:41.091822 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088556 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:10:41.091822 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088559 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:10:41.091822 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088562 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:10:41.091822 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088564 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:10:41.091822 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088567 2576 feature_gate.go:328] unrecognized feature 
gate: BootcNodeManagement Apr 16 18:10:41.091822 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088569 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:10:41.091822 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088572 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:10:41.091822 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088574 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:10:41.091822 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088577 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:10:41.091822 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088581 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 18:10:41.091822 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088584 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:10:41.091822 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088587 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:10:41.092324 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088590 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:10:41.092324 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088593 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:10:41.092324 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088595 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:10:41.092324 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088598 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:10:41.092324 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088601 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:10:41.092324 ip-10-0-139-96 
kubenswrapper[2576]: W0416 18:10:41.088604 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:10:41.092324 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088607 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:10:41.092324 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088609 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:10:41.092324 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088612 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:10:41.092324 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088615 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:10:41.092324 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088617 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:10:41.092324 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088619 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:10:41.092324 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088622 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:10:41.092324 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088624 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:10:41.092324 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088627 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:10:41.092324 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088629 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:10:41.092324 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088633 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:10:41.092324 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088635 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 
16 18:10:41.092324 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088638 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:10:41.092324 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088641 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:10:41.092853 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088643 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:10:41.092853 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088646 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:10:41.092853 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088648 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:10:41.092853 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088651 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:10:41.092853 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088654 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:10:41.092853 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088657 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:10:41.092853 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088659 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:10:41.092853 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088661 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:10:41.092853 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088664 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:10:41.092853 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088667 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:10:41.092853 ip-10-0-139-96 kubenswrapper[2576]: 
W0416 18:10:41.088669 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:10:41.092853 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:41.088671 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:10:41.092853 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.088676 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 18:10:41.092853 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.090672 2576 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 18:10:41.092853 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.092662 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 18:10:41.093457 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.093445 2576 server.go:1019] "Starting client certificate rotation" Apr 16 18:10:41.093560 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.093542 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 18:10:41.093598 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.093582 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 18:10:41.115457 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.115438 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 18:10:41.117719 ip-10-0-139-96 
kubenswrapper[2576]: I0416 18:10:41.117685 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 18:10:41.137045 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.137024 2576 log.go:25] "Validated CRI v1 runtime API" Apr 16 18:10:41.141965 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.141942 2576 log.go:25] "Validated CRI v1 image API" Apr 16 18:10:41.143294 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.143279 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 18:10:41.145276 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.145260 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 18:10:41.145863 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.145838 2576 fs.go:135] Filesystem UUIDs: map[74d6dca3-95bd-45b1-a1f6-8886870ea5a0:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 fe3d90a6-7ccc-4ea1-8d8e-0a9c29bfbc04:/dev/nvme0n1p3] Apr 16 18:10:41.145943 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.145862 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 18:10:41.152337 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.152227 2576 manager.go:217] Machine: {Timestamp:2026-04-16 18:10:41.150351481 +0000 UTC m=+0.351926740 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099511 MemoryCapacity:33164492800 SwapCapacity:0 
MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec211099ae7ea1ac8907c7a888198e5d SystemUUID:ec211099-ae7e-a1ac-8907-c7a888198e5d BootID:7119c587-cca4-40da-978e-a4b3fdbe3535 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:eb:42:ab:c1:a7 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:eb:42:ab:c1:a7 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:76:1c:82:cd:85:27 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 18:10:41.152337 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.152325 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 18:10:41.152473 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.152407 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 18:10:41.155138 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.155116 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 18:10:41.155268 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.155140 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-139-96.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 18:10:41.155316 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.155276 2576 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 18:10:41.155316 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.155284 2576 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 18:10:41.155316 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.155297 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 18:10:41.156207 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.156197 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 18:10:41.157849 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.157839 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 18:10:41.157954 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.157945 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 18:10:41.160122 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.160113 2576 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 18:10:41.160159 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.160125 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 18:10:41.160159 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.160136 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 18:10:41.160159 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.160144 2576 kubelet.go:397] "Adding apiserver pod source"
Apr 16 18:10:41.160159 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.160152 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 18:10:41.161111 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.161099 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 18:10:41.161160 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.161117 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 18:10:41.164097 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.164081 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 18:10:41.165647 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.165631 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-cj876"
Apr 16 18:10:41.165784 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.165772 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 18:10:41.166992 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.166980 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 18:10:41.167029 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.166996 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 18:10:41.167029 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.167003 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 18:10:41.167029 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.167009 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 18:10:41.167029 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.167014 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 18:10:41.167029 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.167020 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 18:10:41.167029 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.167025 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 18:10:41.167029 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.167032 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 18:10:41.167205 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.167038 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 18:10:41.167205 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.167044 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 18:10:41.167205 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.167053 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 18:10:41.167205 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.167061 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 18:10:41.168468 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.168455 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 18:10:41.168501 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.168470 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 18:10:41.171043 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:41.171025 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-139-96.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 18:10:41.171187 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:41.171173 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 18:10:41.171796 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.171783 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-cj876"
Apr 16 18:10:41.171996 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.171983 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 18:10:41.172038 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.172022 2576 server.go:1295] "Started kubelet"
Apr 16 18:10:41.172132 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.172108 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 18:10:41.172236 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.172179 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 18:10:41.172277 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.172265 2576 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 18:10:41.172670 ip-10-0-139-96 systemd[1]: Started Kubernetes Kubelet.
Apr 16 18:10:41.173571 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.173554 2576 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 18:10:41.173899 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.173883 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 18:10:41.178493 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.178475 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 18:10:41.179028 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.179013 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 18:10:41.179859 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:41.179836 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-96.ec2.internal\" not found"
Apr 16 18:10:41.179938 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.179870 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 18:10:41.179938 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.179878 2576 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 18:10:41.179938 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.179913 2576 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 18:10:41.180230 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.180082 2576 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 18:10:41.180230 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.180102 2576 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 18:10:41.181183 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.181164 2576 factory.go:55] Registering systemd factory
Apr 16 18:10:41.181183 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.181185 2576 factory.go:223] Registration of the systemd container factory successfully
Apr 16 18:10:41.181431 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:41.181411 2576 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 18:10:41.181557 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.181541 2576 factory.go:153] Registering CRI-O factory
Apr 16 18:10:41.181614 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.181560 2576 factory.go:223] Registration of the crio container factory successfully
Apr 16 18:10:41.181729 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.181646 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:10:41.181837 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.181820 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 18:10:41.181889 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.181853 2576 factory.go:103] Registering Raw factory
Apr 16 18:10:41.181933 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.181902 2576 manager.go:1196] Started watching for new ooms in manager
Apr 16 18:10:41.183269 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.183252 2576 manager.go:319] Starting recovery of all containers
Apr 16 18:10:41.187139 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:41.187117 2576 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-139-96.ec2.internal\" not found" node="ip-10-0-139-96.ec2.internal"
Apr 16 18:10:41.190157 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.190137 2576 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-139-96.ec2.internal" not found
Apr 16 18:10:41.196255 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.196238 2576 manager.go:324] Recovery completed
Apr 16 18:10:41.200480 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.200462 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:10:41.202740 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.202716 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-96.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:10:41.202808 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.202754 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-96.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:10:41.202808 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.202763 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-96.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:10:41.203187 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.203173 2576 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 18:10:41.203187 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.203185 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 18:10:41.203301 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.203204 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 18:10:41.205325 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.205311 2576 policy_none.go:49] "None policy: Start"
Apr 16 18:10:41.205325 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.205326 2576 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 18:10:41.205414 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.205335 2576 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 18:10:41.206086 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.206074 2576 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-139-96.ec2.internal" not found
Apr 16 18:10:41.238348 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.238335 2576 manager.go:341] "Starting Device Plugin manager"
Apr 16 18:10:41.241989 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:41.238362 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 18:10:41.241989 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.238373 2576 server.go:85] "Starting device plugin registration server"
Apr 16 18:10:41.241989 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.238582 2576 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 18:10:41.241989 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.238593 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 18:10:41.241989 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.238682 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 18:10:41.241989 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.238772 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 18:10:41.241989 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.238783 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 18:10:41.241989 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:41.239201 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 18:10:41.241989 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:41.239227 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-96.ec2.internal\" not found"
Apr 16 18:10:41.266162 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.266142 2576 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-139-96.ec2.internal" not found
Apr 16 18:10:41.320147 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.320119 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 18:10:41.321306 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.321290 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 18:10:41.321360 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.321323 2576 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 18:10:41.321360 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.321341 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 18:10:41.321360 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.321349 2576 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 18:10:41.321460 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:41.321386 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 18:10:41.325098 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.325084 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:10:41.339112 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.339097 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:10:41.340083 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.340067 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-96.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:10:41.340147 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.340097 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-96.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:10:41.340147 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.340107 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-96.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:10:41.340147 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.340133 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-96.ec2.internal"
Apr 16 18:10:41.351014 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.350997 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-96.ec2.internal"
Apr 16 18:10:41.351057 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:41.351014 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-139-96.ec2.internal\": node \"ip-10-0-139-96.ec2.internal\" not found"
Apr 16 18:10:41.374899 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:41.374879 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-96.ec2.internal\" not found"
Apr 16 18:10:41.422428 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.422396 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-139-96.ec2.internal"]
Apr 16 18:10:41.422486 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.422476 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:10:41.423767 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.423750 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-96.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:10:41.423819 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.423779 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-96.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:10:41.423819 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.423788 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-96.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:10:41.424797 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.424785 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:10:41.424940 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.424928 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal"
Apr 16 18:10:41.424981 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.424956 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:10:41.425411 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.425395 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-96.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:10:41.425467 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.425423 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-96.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:10:41.425467 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.425396 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-96.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:10:41.425467 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.425439 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-96.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:10:41.425467 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.425449 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-96.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:10:41.425467 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.425462 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-96.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:10:41.426966 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.426947 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-96.ec2.internal"
Apr 16 18:10:41.427010 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.426982 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:10:41.427587 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.427571 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-96.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:10:41.427679 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.427594 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-96.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:10:41.427679 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.427608 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-96.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:10:41.452342 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:41.452295 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-96.ec2.internal\" not found" node="ip-10-0-139-96.ec2.internal"
Apr 16 18:10:41.456680 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:41.456664 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-96.ec2.internal\" not found" node="ip-10-0-139-96.ec2.internal"
Apr 16 18:10:41.474944 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:41.474921 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-96.ec2.internal\" not found"
Apr 16 18:10:41.481441 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.481426 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5192824af33e3f669380bf45ab3cf23a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal\" (UID: \"5192824af33e3f669380bf45ab3cf23a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal"
Apr 16 18:10:41.481490 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.481449 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5192824af33e3f669380bf45ab3cf23a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal\" (UID: \"5192824af33e3f669380bf45ab3cf23a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal"
Apr 16 18:10:41.481490 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.481465 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ed8c920fb76e0e328ed5f4aa00cb172f-config\") pod \"kube-apiserver-proxy-ip-10-0-139-96.ec2.internal\" (UID: \"ed8c920fb76e0e328ed5f4aa00cb172f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-96.ec2.internal"
Apr 16 18:10:41.575367 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:41.575348 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-96.ec2.internal\" not found"
Apr 16 18:10:41.581682 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.581667 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5192824af33e3f669380bf45ab3cf23a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal\" (UID: \"5192824af33e3f669380bf45ab3cf23a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal"
Apr 16 18:10:41.581739 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.581725 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5192824af33e3f669380bf45ab3cf23a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal\" (UID: \"5192824af33e3f669380bf45ab3cf23a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal"
Apr 16 18:10:41.581772 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.581745 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5192824af33e3f669380bf45ab3cf23a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal\" (UID: \"5192824af33e3f669380bf45ab3cf23a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal"
Apr 16 18:10:41.581772 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.581764 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ed8c920fb76e0e328ed5f4aa00cb172f-config\") pod \"kube-apiserver-proxy-ip-10-0-139-96.ec2.internal\" (UID: \"ed8c920fb76e0e328ed5f4aa00cb172f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-96.ec2.internal"
Apr 16 18:10:41.581839 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.581775 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5192824af33e3f669380bf45ab3cf23a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal\" (UID: \"5192824af33e3f669380bf45ab3cf23a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal"
Apr 16 18:10:41.581839 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.581787 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ed8c920fb76e0e328ed5f4aa00cb172f-config\") pod \"kube-apiserver-proxy-ip-10-0-139-96.ec2.internal\" (UID: \"ed8c920fb76e0e328ed5f4aa00cb172f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-96.ec2.internal"
Apr 16 18:10:41.676099 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:41.676069 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-96.ec2.internal\" not found"
Apr 16 18:10:41.756607 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.756544 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal"
Apr 16 18:10:41.759040 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:41.759023 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-96.ec2.internal"
Apr 16 18:10:41.776772 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:41.776758 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-96.ec2.internal\" not found"
Apr 16 18:10:41.877316 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:41.877290 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-96.ec2.internal\" not found"
Apr 16 18:10:41.977815 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:41.977797 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-96.ec2.internal\" not found"
Apr 16 18:10:42.065934 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:42.065878 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:10:42.078381 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:42.078364 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-96.ec2.internal\" not found"
Apr 16 18:10:42.093794 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:42.093780 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 18:10:42.093888 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:42.093874 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:10:42.093933 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:42.093916 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:10:42.093966 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:42.093918 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:10:42.174333 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:42.174306 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 18:05:41 +0000 UTC" deadline="2027-12-23 10:21:10.654234404 +0000 UTC"
Apr 16 18:10:42.174333 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:42.174331 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14776h10m28.479905729s"
Apr 16 18:10:42.179458 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:42.179442 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-96.ec2.internal\" not found"
Apr 16 18:10:42.179539 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:42.179466 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 18:10:42.200587 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:42.200568 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:10:42.222189 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:42.222166 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-t7zgw"
Apr 16 18:10:42.229842 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:42.229823 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-t7zgw"
Apr 16 18:10:42.270999 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:42.270975 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5192824af33e3f669380bf45ab3cf23a.slice/crio-6b7b47fa85b59feccc1508652a2a924bff3c861299c835259993b3dff1f228d8 WatchSource:0}: Error finding container 6b7b47fa85b59feccc1508652a2a924bff3c861299c835259993b3dff1f228d8: Status 404 returned error can't find the container with id 6b7b47fa85b59feccc1508652a2a924bff3c861299c835259993b3dff1f228d8
Apr 16 18:10:42.271820 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:42.271802 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded8c920fb76e0e328ed5f4aa00cb172f.slice/crio-5e7d4ecc60d8eec4c31aac67cada6ef1a3cd2dc2a8c1c613223d659d04d64453 WatchSource:0}: Error finding container 5e7d4ecc60d8eec4c31aac67cada6ef1a3cd2dc2a8c1c613223d659d04d64453: Status 404 returned error can't find the container with id 5e7d4ecc60d8eec4c31aac67cada6ef1a3cd2dc2a8c1c613223d659d04d64453
Apr 16 18:10:42.276621 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:42.276608 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:10:42.280309 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:42.280293 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-96.ec2.internal\" not found" Apr 16 18:10:42.323535 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:42.323464 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal" event={"ID":"5192824af33e3f669380bf45ab3cf23a","Type":"ContainerStarted","Data":"6b7b47fa85b59feccc1508652a2a924bff3c861299c835259993b3dff1f228d8"} Apr 16 18:10:42.324291 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:42.324274 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-96.ec2.internal" event={"ID":"ed8c920fb76e0e328ed5f4aa00cb172f","Type":"ContainerStarted","Data":"5e7d4ecc60d8eec4c31aac67cada6ef1a3cd2dc2a8c1c613223d659d04d64453"} Apr 16 18:10:42.380539 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:42.380519 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-96.ec2.internal\" not found" Apr 16 18:10:42.444489 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:42.444469 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:10:42.479932 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:42.479915 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-96.ec2.internal" Apr 16 18:10:42.488582 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:42.488559 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 18:10:42.489587 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:42.489575 2576 kubelet.go:3340] "Creating a mirror pod for static pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal" Apr 16 18:10:42.500198 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:42.500184 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 18:10:43.101442 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.101414 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:10:43.161642 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.161621 2576 apiserver.go:52] "Watching apiserver" Apr 16 18:10:43.169828 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.169805 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 18:10:43.172339 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.172312 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-r2j4b","kube-system/konnectivity-agent-d949z","openshift-cluster-node-tuning-operator/tuned-lsm5b","openshift-multus/network-metrics-daemon-fwqx5","openshift-network-operator/iptables-alerter-22zxr","openshift-ovn-kubernetes/ovnkube-node-b6sfk","kube-system/kube-apiserver-proxy-ip-10-0-139-96.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pqkzx","openshift-dns/node-resolver-mb8s7","openshift-image-registry/node-ca-jmzcb","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal","openshift-multus/multus-additional-cni-plugins-k5m7r","openshift-multus/multus-tkxn5"] Apr 16 18:10:43.174588 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.174536 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jmzcb" Apr 16 18:10:43.175401 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.175372 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-d949z" Apr 16 18:10:43.176582 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.176560 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-lsm5b" Apr 16 18:10:43.176798 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.176778 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwqx5" Apr 16 18:10:43.176887 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:43.176851 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwqx5" podUID="22eec1cf-b2b9-495f-9507-ee4b6c6a9204" Apr 16 18:10:43.177164 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.177143 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 18:10:43.177254 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.177150 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 18:10:43.177462 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.177442 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 18:10:43.177553 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.177471 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-rppnt\"" Apr 16 18:10:43.177917 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.177899 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-22zxr" Apr 16 18:10:43.178745 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.178617 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-gtcbv\"" Apr 16 18:10:43.178745 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.178715 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 18:10:43.178745 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.178726 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 18:10:43.181731 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.179554 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 18:10:43.181731 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.179766 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-gkmbz\"" Apr 16 18:10:43.181731 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.179557 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:10:43.181731 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.181095 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 18:10:43.181731 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.181306 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 18:10:43.181731 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.181442 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:10:43.181731 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.181506 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-7rjld\"" Apr 16 18:10:43.182337 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.182320 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" Apr 16 18:10:43.184599 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.184582 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pqkzx" Apr 16 18:10:43.185863 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.185845 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-mb8s7" Apr 16 18:10:43.186834 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.186765 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 18:10:43.186935 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.186916 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 18:10:43.187111 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.187093 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r2j4b" Apr 16 18:10:43.187194 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:43.187151 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-r2j4b" podUID="5ad84774-79a9-4253-9451-f7e900a7cb4d" Apr 16 18:10:43.187257 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.187231 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 18:10:43.188425 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.188405 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-k5m7r" Apr 16 18:10:43.188425 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.188421 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 18:10:43.188571 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.188463 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-slxgm\"" Apr 16 18:10:43.188764 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.188748 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 18:10:43.188909 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.188891 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 18:10:43.188909 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.188904 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 18:10:43.189047 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.188961 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-rdpj2\"" Apr 16 18:10:43.189047 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.188984 
2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 18:10:43.189047 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.188906 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 18:10:43.189538 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.189503 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 18:10:43.189628 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.189566 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 18:10:43.189628 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.189583 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-2phrw\"" Apr 16 18:10:43.189884 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.189862 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.189884 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.189875 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-host-cni-netd\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" Apr 16 18:10:43.190006 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.189906 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5e9d6b21-2120-44f2-8a4a-d991547263f2-env-overrides\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" Apr 16 18:10:43.190006 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.189928 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91a81544-645b-4829-be2c-a425f6f14d64-etc-kubernetes\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b" Apr 16 18:10:43.190006 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.189946 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/91a81544-645b-4829-be2c-a425f6f14d64-etc-sysctl-d\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b" Apr 16 18:10:43.190006 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.189962 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qwsc\" (UniqueName: 
\"kubernetes.io/projected/1b075c15-1f6a-48c0-af94-65cd41bdc367-kube-api-access-2qwsc\") pod \"iptables-alerter-22zxr\" (UID: \"1b075c15-1f6a-48c0-af94-65cd41bdc367\") " pod="openshift-network-operator/iptables-alerter-22zxr" Apr 16 18:10:43.190006 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.189980 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-host-slash\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" Apr 16 18:10:43.190006 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.189993 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-host-run-netns\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" Apr 16 18:10:43.190297 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.190012 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqbcs\" (UniqueName: \"kubernetes.io/projected/76ccc7ff-6855-49c3-a0b5-185487ae8516-kube-api-access-kqbcs\") pod \"node-ca-jmzcb\" (UID: \"76ccc7ff-6855-49c3-a0b5-185487ae8516\") " pod="openshift-image-registry/node-ca-jmzcb" Apr 16 18:10:43.190297 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.190029 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/91a81544-645b-4829-be2c-a425f6f14d64-lib-modules\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b" Apr 16 18:10:43.190297 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.190048 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/22eec1cf-b2b9-495f-9507-ee4b6c6a9204-metrics-certs\") pod \"network-metrics-daemon-fwqx5\" (UID: \"22eec1cf-b2b9-495f-9507-ee4b6c6a9204\") " pod="openshift-multus/network-metrics-daemon-fwqx5" Apr 16 18:10:43.190297 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.190090 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-systemd-units\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" Apr 16 18:10:43.190297 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.190108 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/91a81544-645b-4829-be2c-a425f6f14d64-tmp\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b" Apr 16 18:10:43.190297 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.190169 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2jgz\" (UniqueName: \"kubernetes.io/projected/91a81544-645b-4829-be2c-a425f6f14d64-kube-api-access-w2jgz\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b" Apr 16 18:10:43.190297 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.190211 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b6db\" (UniqueName: \"kubernetes.io/projected/22eec1cf-b2b9-495f-9507-ee4b6c6a9204-kube-api-access-8b6db\") pod \"network-metrics-daemon-fwqx5\" (UID: \"22eec1cf-b2b9-495f-9507-ee4b6c6a9204\") " 
pod="openshift-multus/network-metrics-daemon-fwqx5" Apr 16 18:10:43.190297 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.190236 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/1b075c15-1f6a-48c0-af94-65cd41bdc367-iptables-alerter-script\") pod \"iptables-alerter-22zxr\" (UID: \"1b075c15-1f6a-48c0-af94-65cd41bdc367\") " pod="openshift-network-operator/iptables-alerter-22zxr" Apr 16 18:10:43.190297 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.190263 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-etc-openvswitch\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" Apr 16 18:10:43.190297 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.190289 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5e9d6b21-2120-44f2-8a4a-d991547263f2-ovn-node-metrics-cert\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" Apr 16 18:10:43.190795 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.190311 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5e9d6b21-2120-44f2-8a4a-d991547263f2-ovnkube-script-lib\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" Apr 16 18:10:43.190795 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.190328 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/20b76c8e-45ad-41f5-b3f5-1cc2a66d4881-agent-certs\") pod \"konnectivity-agent-d949z\" (UID: \"20b76c8e-45ad-41f5-b3f5-1cc2a66d4881\") " pod="kube-system/konnectivity-agent-d949z" Apr 16 18:10:43.190795 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.190346 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/91a81544-645b-4829-be2c-a425f6f14d64-etc-systemd\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b" Apr 16 18:10:43.190795 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.190367 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/91a81544-645b-4829-be2c-a425f6f14d64-run\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b" Apr 16 18:10:43.190795 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.190387 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/91a81544-645b-4829-be2c-a425f6f14d64-sys\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b" Apr 16 18:10:43.190795 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.190408 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/91a81544-645b-4829-be2c-a425f6f14d64-etc-tuned\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b" Apr 16 18:10:43.190795 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.190425 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-host-kubelet\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" Apr 16 18:10:43.190795 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.190470 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-var-lib-openvswitch\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" Apr 16 18:10:43.190795 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.190506 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wxwh\" (UniqueName: \"kubernetes.io/projected/5e9d6b21-2120-44f2-8a4a-d991547263f2-kube-api-access-8wxwh\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" Apr 16 18:10:43.190795 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.190543 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/91a81544-645b-4829-be2c-a425f6f14d64-etc-sysctl-conf\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b" Apr 16 18:10:43.190795 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.190568 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/91a81544-645b-4829-be2c-a425f6f14d64-var-lib-kubelet\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b" Apr 16 18:10:43.190795 ip-10-0-139-96 
kubenswrapper[2576]: I0416 18:10:43.190593 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 18:10:43.190795 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.190596 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1b075c15-1f6a-48c0-af94-65cd41bdc367-host-slash\") pod \"iptables-alerter-22zxr\" (UID: \"1b075c15-1f6a-48c0-af94-65cd41bdc367\") " pod="openshift-network-operator/iptables-alerter-22zxr" Apr 16 18:10:43.190795 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.190626 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-log-socket\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" Apr 16 18:10:43.190795 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.190649 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-run-systemd\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" Apr 16 18:10:43.190795 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.190670 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-run-openvswitch\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" Apr 16 18:10:43.190795 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.190691 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-run-ovn\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk"
Apr 16 18:10:43.191562 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.190732 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-host-cni-bin\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk"
Apr 16 18:10:43.191562 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.190755 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk"
Apr 16 18:10:43.191562 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.190778 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/76ccc7ff-6855-49c3-a0b5-185487ae8516-host\") pod \"node-ca-jmzcb\" (UID: \"76ccc7ff-6855-49c3-a0b5-185487ae8516\") " pod="openshift-image-registry/node-ca-jmzcb"
Apr 16 18:10:43.191562 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.190814 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/91a81544-645b-4829-be2c-a425f6f14d64-host\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b"
Apr 16 18:10:43.191562 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.190839 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 18:10:43.191562 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.190845 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-host-run-ovn-kubernetes\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk"
Apr 16 18:10:43.191562 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.190877 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5e9d6b21-2120-44f2-8a4a-d991547263f2-ovnkube-config\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk"
Apr 16 18:10:43.191562 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.190907 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/76ccc7ff-6855-49c3-a0b5-185487ae8516-serviceca\") pod \"node-ca-jmzcb\" (UID: \"76ccc7ff-6855-49c3-a0b5-185487ae8516\") " pod="openshift-image-registry/node-ca-jmzcb"
Apr 16 18:10:43.191562 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.190935 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/20b76c8e-45ad-41f5-b3f5-1cc2a66d4881-konnectivity-ca\") pod \"konnectivity-agent-d949z\" (UID: \"20b76c8e-45ad-41f5-b3f5-1cc2a66d4881\") " pod="kube-system/konnectivity-agent-d949z"
Apr 16 18:10:43.191562 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.190958 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/91a81544-645b-4829-be2c-a425f6f14d64-etc-modprobe-d\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b"
Apr 16 18:10:43.191562 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.190981 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/91a81544-645b-4829-be2c-a425f6f14d64-etc-sysconfig\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b"
Apr 16 18:10:43.191562 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.191003 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-node-log\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk"
Apr 16 18:10:43.192069 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.191594 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 18:10:43.192069 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.191602 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-h459l\""
Apr 16 18:10:43.192069 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.191595 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 18:10:43.192069 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.191904 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 18:10:43.192877 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.192775 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 18:10:43.192877 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.192796 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-x7chk\""
Apr 16 18:10:43.231104 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.231073 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:05:42 +0000 UTC" deadline="2027-09-14 16:58:35.377972603 +0000 UTC"
Apr 16 18:10:43.231104 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.231099 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12382h47m52.146877485s"
Apr 16 18:10:43.281496 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.281478 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 18:10:43.291346 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.291323 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/91a81544-645b-4829-be2c-a425f6f14d64-etc-sysctl-conf\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b"
Apr 16 18:10:43.291446 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.291357 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/91a81544-645b-4829-be2c-a425f6f14d64-var-lib-kubelet\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b"
Apr 16 18:10:43.291446 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.291388 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8d97fc03-363d-4c04-9bad-08b08817cd63-sys-fs\") pod \"aws-ebs-csi-driver-node-pqkzx\" (UID: \"8d97fc03-363d-4c04-9bad-08b08817cd63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pqkzx"
Apr 16 18:10:43.291446 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.291415 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fb4955c9-d4b5-4e21-a2ca-4d700832a59c-hosts-file\") pod \"node-resolver-mb8s7\" (UID: \"fb4955c9-d4b5-4e21-a2ca-4d700832a59c\") " pod="openshift-dns/node-resolver-mb8s7"
Apr 16 18:10:43.291446 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.291435 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-cnibin\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5"
Apr 16 18:10:43.291622 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.291449 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-hostroot\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5"
Apr 16 18:10:43.291622 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.291464 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-log-socket\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk"
Apr 16 18:10:43.291622 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.291469 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/91a81544-645b-4829-be2c-a425f6f14d64-var-lib-kubelet\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b"
Apr 16 18:10:43.291622 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.291484 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-run-systemd\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk"
Apr 16 18:10:43.291622 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.291521 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/91a81544-645b-4829-be2c-a425f6f14d64-etc-sysctl-conf\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b"
Apr 16 18:10:43.291622 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.291530 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-host-cni-bin\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk"
Apr 16 18:10:43.291622 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.291580 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-host-cni-bin\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk"
Apr 16 18:10:43.291622 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.291593 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-run-systemd\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk"
Apr 16 18:10:43.291622 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.291606 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-log-socket\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk"
Apr 16 18:10:43.291622 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.291609 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk"
Apr 16 18:10:43.292033 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.291645 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpmct\" (UniqueName: \"kubernetes.io/projected/82232800-52de-476a-a364-558f49009263-kube-api-access-xpmct\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5"
Apr 16 18:10:43.292033 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.291681 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk"
Apr 16 18:10:43.292033 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.291691 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/91a81544-645b-4829-be2c-a425f6f14d64-host\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b"
Apr 16 18:10:43.292033 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.291732 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-host-run-ovn-kubernetes\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk"
Apr 16 18:10:43.292033 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.291758 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5e9d6b21-2120-44f2-8a4a-d991547263f2-ovnkube-config\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk"
Apr 16 18:10:43.292033 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.291788 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-host-run-ovn-kubernetes\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk"
Apr 16 18:10:43.292033 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.291782 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/20b76c8e-45ad-41f5-b3f5-1cc2a66d4881-konnectivity-ca\") pod \"konnectivity-agent-d949z\" (UID: \"20b76c8e-45ad-41f5-b3f5-1cc2a66d4881\") " pod="kube-system/konnectivity-agent-d949z"
Apr 16 18:10:43.292033 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.291758 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/91a81544-645b-4829-be2c-a425f6f14d64-host\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b"
Apr 16 18:10:43.292033 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.291826 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/91a81544-645b-4829-be2c-a425f6f14d64-etc-modprobe-d\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b"
Apr 16 18:10:43.292033 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.291851 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/439613fb-5d3f-4d29-b662-f86a49f8e289-cnibin\") pod \"multus-additional-cni-plugins-k5m7r\" (UID: \"439613fb-5d3f-4d29-b662-f86a49f8e289\") " pod="openshift-multus/multus-additional-cni-plugins-k5m7r"
Apr 16 18:10:43.292033 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.291876 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmf8w\" (UniqueName: \"kubernetes.io/projected/439613fb-5d3f-4d29-b662-f86a49f8e289-kube-api-access-lmf8w\") pod \"multus-additional-cni-plugins-k5m7r\" (UID: \"439613fb-5d3f-4d29-b662-f86a49f8e289\") " pod="openshift-multus/multus-additional-cni-plugins-k5m7r"
Apr 16 18:10:43.292033 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.291900 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-host-cni-netd\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk"
Apr 16 18:10:43.292033 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.291923 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2qwsc\" (UniqueName: \"kubernetes.io/projected/1b075c15-1f6a-48c0-af94-65cd41bdc367-kube-api-access-2qwsc\") pod \"iptables-alerter-22zxr\" (UID: \"1b075c15-1f6a-48c0-af94-65cd41bdc367\") " pod="openshift-network-operator/iptables-alerter-22zxr"
Apr 16 18:10:43.292033 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.291942 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/91a81544-645b-4829-be2c-a425f6f14d64-etc-modprobe-d\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b"
Apr 16 18:10:43.292033 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.291951 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/439613fb-5d3f-4d29-b662-f86a49f8e289-system-cni-dir\") pod \"multus-additional-cni-plugins-k5m7r\" (UID: \"439613fb-5d3f-4d29-b662-f86a49f8e289\") " pod="openshift-multus/multus-additional-cni-plugins-k5m7r"
Apr 16 18:10:43.292033 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.291977 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-host-var-lib-cni-multus\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5"
Apr 16 18:10:43.292033 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292010 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kqbcs\" (UniqueName: \"kubernetes.io/projected/76ccc7ff-6855-49c3-a0b5-185487ae8516-kube-api-access-kqbcs\") pod \"node-ca-jmzcb\" (UID: \"76ccc7ff-6855-49c3-a0b5-185487ae8516\") " pod="openshift-image-registry/node-ca-jmzcb"
Apr 16 18:10:43.292837 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292018 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-host-cni-netd\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk"
Apr 16 18:10:43.292837 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292034 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8d97fc03-363d-4c04-9bad-08b08817cd63-device-dir\") pod \"aws-ebs-csi-driver-node-pqkzx\" (UID: \"8d97fc03-363d-4c04-9bad-08b08817cd63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pqkzx"
Apr 16 18:10:43.292837 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292060 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/82232800-52de-476a-a364-558f49009263-cni-binary-copy\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5"
Apr 16 18:10:43.292837 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292085 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/82232800-52de-476a-a364-558f49009263-multus-daemon-config\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5"
Apr 16 18:10:43.292837 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292113 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-systemd-units\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk"
Apr 16 18:10:43.292837 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292158 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-systemd-units\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk"
Apr 16 18:10:43.292837 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292162 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/91a81544-645b-4829-be2c-a425f6f14d64-tmp\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b"
Apr 16 18:10:43.292837 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292204 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/1b075c15-1f6a-48c0-af94-65cd41bdc367-iptables-alerter-script\") pod \"iptables-alerter-22zxr\" (UID: \"1b075c15-1f6a-48c0-af94-65cd41bdc367\") " pod="openshift-network-operator/iptables-alerter-22zxr"
Apr 16 18:10:43.292837 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292235 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-host-run-netns\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5"
Apr 16 18:10:43.292837 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292263 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-host-run-multus-certs\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5"
Apr 16 18:10:43.292837 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292291 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8d97fc03-363d-4c04-9bad-08b08817cd63-etc-selinux\") pod \"aws-ebs-csi-driver-node-pqkzx\" (UID: \"8d97fc03-363d-4c04-9bad-08b08817cd63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pqkzx"
Apr 16 18:10:43.292837 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292317 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-etc-openvswitch\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk"
Apr 16 18:10:43.292837 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292325 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/20b76c8e-45ad-41f5-b3f5-1cc2a66d4881-konnectivity-ca\") pod \"konnectivity-agent-d949z\" (UID: \"20b76c8e-45ad-41f5-b3f5-1cc2a66d4881\") " pod="kube-system/konnectivity-agent-d949z"
Apr 16 18:10:43.292837 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292342 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5e9d6b21-2120-44f2-8a4a-d991547263f2-ovn-node-metrics-cert\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk"
Apr 16 18:10:43.292837 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292405 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 18:10:43.292837 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292396 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-etc-openvswitch\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk"
Apr 16 18:10:43.292837 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292450 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5e9d6b21-2120-44f2-8a4a-d991547263f2-ovnkube-script-lib\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk"
Apr 16 18:10:43.293563 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292458 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5e9d6b21-2120-44f2-8a4a-d991547263f2-ovnkube-config\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk"
Apr 16 18:10:43.293563 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292535 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/91a81544-645b-4829-be2c-a425f6f14d64-run\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b"
Apr 16 18:10:43.293563 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292575 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/91a81544-645b-4829-be2c-a425f6f14d64-sys\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b"
Apr 16 18:10:43.293563 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292601 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/91a81544-645b-4829-be2c-a425f6f14d64-run\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b"
Apr 16 18:10:43.293563 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292645 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-host-kubelet\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk"
Apr 16 18:10:43.293563 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292649 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/91a81544-645b-4829-be2c-a425f6f14d64-sys\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b"
Apr 16 18:10:43.293563 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292601 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-host-kubelet\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk"
Apr 16 18:10:43.293563 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292684 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1b075c15-1f6a-48c0-af94-65cd41bdc367-host-slash\") pod \"iptables-alerter-22zxr\" (UID: \"1b075c15-1f6a-48c0-af94-65cd41bdc367\") " pod="openshift-network-operator/iptables-alerter-22zxr"
Apr 16 18:10:43.293563 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292732 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/439613fb-5d3f-4d29-b662-f86a49f8e289-tuning-conf-dir\") pod \"multus-additional-cni-plugins-k5m7r\" (UID: \"439613fb-5d3f-4d29-b662-f86a49f8e289\") " pod="openshift-multus/multus-additional-cni-plugins-k5m7r"
Apr 16 18:10:43.293563 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292742 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/1b075c15-1f6a-48c0-af94-65cd41bdc367-iptables-alerter-script\") pod \"iptables-alerter-22zxr\" (UID: \"1b075c15-1f6a-48c0-af94-65cd41bdc367\") " pod="openshift-network-operator/iptables-alerter-22zxr"
Apr 16 18:10:43.293563 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292746 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1b075c15-1f6a-48c0-af94-65cd41bdc367-host-slash\") pod \"iptables-alerter-22zxr\" (UID: \"1b075c15-1f6a-48c0-af94-65cd41bdc367\") " pod="openshift-network-operator/iptables-alerter-22zxr"
Apr 16 18:10:43.293563 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292762 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-multus-socket-dir-parent\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5"
Apr 16 18:10:43.293563 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292799 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-run-openvswitch\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk"
Apr 16 18:10:43.293563 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292825 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-run-ovn\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk"
Apr 16 18:10:43.293563 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292848 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8d97fc03-363d-4c04-9bad-08b08817cd63-registration-dir\") pod \"aws-ebs-csi-driver-node-pqkzx\" (UID: \"8d97fc03-363d-4c04-9bad-08b08817cd63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pqkzx"
Apr 16 18:10:43.293563 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292889 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-run-openvswitch\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk"
Apr 16 18:10:43.293563 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292894 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-run-ovn\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk"
Apr 16 18:10:43.294328 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292909 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-host-var-lib-cni-bin\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5"
Apr 16 18:10:43.294328 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292928 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5e9d6b21-2120-44f2-8a4a-d991547263f2-ovnkube-script-lib\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk"
Apr 16 18:10:43.294328 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292935 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-host-var-lib-kubelet\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5"
Apr 16 18:10:43.294328 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292969 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/76ccc7ff-6855-49c3-a0b5-185487ae8516-host\") pod \"node-ca-jmzcb\" (UID: \"76ccc7ff-6855-49c3-a0b5-185487ae8516\") " pod="openshift-image-registry/node-ca-jmzcb"
Apr 16 18:10:43.294328 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.292997 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfwhj\" (UniqueName: \"kubernetes.io/projected/5ad84774-79a9-4253-9451-f7e900a7cb4d-kube-api-access-cfwhj\") pod \"network-check-target-r2j4b\" (UID: \"5ad84774-79a9-4253-9451-f7e900a7cb4d\") " pod="openshift-network-diagnostics/network-check-target-r2j4b"
Apr 16 18:10:43.294328 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.293013 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/76ccc7ff-6855-49c3-a0b5-185487ae8516-host\") pod \"node-ca-jmzcb\" (UID: \"76ccc7ff-6855-49c3-a0b5-185487ae8516\") " pod="openshift-image-registry/node-ca-jmzcb"
Apr 16 18:10:43.294328 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.293036 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/439613fb-5d3f-4d29-b662-f86a49f8e289-cni-binary-copy\") pod \"multus-additional-cni-plugins-k5m7r\" (UID: \"439613fb-5d3f-4d29-b662-f86a49f8e289\") " pod="openshift-multus/multus-additional-cni-plugins-k5m7r"
Apr 16 18:10:43.294328 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.293062 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-os-release\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5"
Apr 16 18:10:43.294328 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.293086 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-multus-conf-dir\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5"
Apr 16 18:10:43.294328 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.293113 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/76ccc7ff-6855-49c3-a0b5-185487ae8516-serviceca\") pod \"node-ca-jmzcb\" (UID: \"76ccc7ff-6855-49c3-a0b5-185487ae8516\") " pod="openshift-image-registry/node-ca-jmzcb"
Apr 16 18:10:43.294328 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.293140 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/91a81544-645b-4829-be2c-a425f6f14d64-etc-sysconfig\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b"
Apr 16 18:10:43.294328 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.293163 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-etc-kubernetes\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5"
Apr 16 18:10:43.294328 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.293185 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-node-log\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk"
Apr 16 18:10:43.294328 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.293207 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5e9d6b21-2120-44f2-8a4a-d991547263f2-env-overrides\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk"
Apr 16 18:10:43.294328 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.293232 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91a81544-645b-4829-be2c-a425f6f14d64-etc-kubernetes\") pod \"tuned-lsm5b\"
(UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b" Apr 16 18:10:43.294328 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.293250 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-node-log\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" Apr 16 18:10:43.294328 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.293269 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/91a81544-645b-4829-be2c-a425f6f14d64-etc-sysctl-d\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b" Apr 16 18:10:43.295004 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.293281 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/91a81544-645b-4829-be2c-a425f6f14d64-etc-sysconfig\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b" Apr 16 18:10:43.295004 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.293294 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8d97fc03-363d-4c04-9bad-08b08817cd63-socket-dir\") pod \"aws-ebs-csi-driver-node-pqkzx\" (UID: \"8d97fc03-363d-4c04-9bad-08b08817cd63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pqkzx" Apr 16 18:10:43.295004 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.293319 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/439613fb-5d3f-4d29-b662-f86a49f8e289-os-release\") pod \"multus-additional-cni-plugins-k5m7r\" (UID: \"439613fb-5d3f-4d29-b662-f86a49f8e289\") " pod="openshift-multus/multus-additional-cni-plugins-k5m7r" Apr 16 18:10:43.295004 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.293336 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91a81544-645b-4829-be2c-a425f6f14d64-etc-kubernetes\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b" Apr 16 18:10:43.295004 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.293344 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m8ss\" (UniqueName: \"kubernetes.io/projected/fb4955c9-d4b5-4e21-a2ca-4d700832a59c-kube-api-access-8m8ss\") pod \"node-resolver-mb8s7\" (UID: \"fb4955c9-d4b5-4e21-a2ca-4d700832a59c\") " pod="openshift-dns/node-resolver-mb8s7" Apr 16 18:10:43.295004 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.293369 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-host-slash\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" Apr 16 18:10:43.295004 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.293408 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-host-run-netns\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" Apr 16 18:10:43.295004 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.293453 2576 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/91a81544-645b-4829-be2c-a425f6f14d64-lib-modules\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b" Apr 16 18:10:43.295004 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.293482 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/22eec1cf-b2b9-495f-9507-ee4b6c6a9204-metrics-certs\") pod \"network-metrics-daemon-fwqx5\" (UID: \"22eec1cf-b2b9-495f-9507-ee4b6c6a9204\") " pod="openshift-multus/network-metrics-daemon-fwqx5" Apr 16 18:10:43.295004 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.293484 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/91a81544-645b-4829-be2c-a425f6f14d64-etc-sysctl-d\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b" Apr 16 18:10:43.295004 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.293513 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d97fc03-363d-4c04-9bad-08b08817cd63-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pqkzx\" (UID: \"8d97fc03-363d-4c04-9bad-08b08817cd63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pqkzx" Apr 16 18:10:43.295004 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.293553 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/76ccc7ff-6855-49c3-a0b5-185487ae8516-serviceca\") pod \"node-ca-jmzcb\" (UID: \"76ccc7ff-6855-49c3-a0b5-185487ae8516\") " pod="openshift-image-registry/node-ca-jmzcb" Apr 16 18:10:43.295004 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:43.293572 2576 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:43.295004 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.293613 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/439613fb-5d3f-4d29-b662-f86a49f8e289-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-k5m7r\" (UID: \"439613fb-5d3f-4d29-b662-f86a49f8e289\") " pod="openshift-multus/multus-additional-cni-plugins-k5m7r" Apr 16 18:10:43.295004 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.293627 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5e9d6b21-2120-44f2-8a4a-d991547263f2-env-overrides\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" Apr 16 18:10:43.295004 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.293634 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-host-slash\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" Apr 16 18:10:43.295004 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.293641 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-multus-cni-dir\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.295972 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.293669 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w2jgz\" (UniqueName: 
\"kubernetes.io/projected/91a81544-645b-4829-be2c-a425f6f14d64-kube-api-access-w2jgz\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b" Apr 16 18:10:43.295972 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:43.293767 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22eec1cf-b2b9-495f-9507-ee4b6c6a9204-metrics-certs podName:22eec1cf-b2b9-495f-9507-ee4b6c6a9204 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:43.793719256 +0000 UTC m=+2.995294549 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/22eec1cf-b2b9-495f-9507-ee4b6c6a9204-metrics-certs") pod "network-metrics-daemon-fwqx5" (UID: "22eec1cf-b2b9-495f-9507-ee4b6c6a9204") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:43.295972 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.293802 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/91a81544-645b-4829-be2c-a425f6f14d64-lib-modules\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b" Apr 16 18:10:43.295972 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.293814 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-host-run-netns\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" Apr 16 18:10:43.295972 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.293942 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8b6db\" (UniqueName: \"kubernetes.io/projected/22eec1cf-b2b9-495f-9507-ee4b6c6a9204-kube-api-access-8b6db\") pod 
\"network-metrics-daemon-fwqx5\" (UID: \"22eec1cf-b2b9-495f-9507-ee4b6c6a9204\") " pod="openshift-multus/network-metrics-daemon-fwqx5" Apr 16 18:10:43.295972 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.293974 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fb4955c9-d4b5-4e21-a2ca-4d700832a59c-tmp-dir\") pod \"node-resolver-mb8s7\" (UID: \"fb4955c9-d4b5-4e21-a2ca-4d700832a59c\") " pod="openshift-dns/node-resolver-mb8s7" Apr 16 18:10:43.295972 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.294014 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-host-run-k8s-cni-cncf-io\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.295972 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.294042 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6t2t\" (UniqueName: \"kubernetes.io/projected/8d97fc03-363d-4c04-9bad-08b08817cd63-kube-api-access-c6t2t\") pod \"aws-ebs-csi-driver-node-pqkzx\" (UID: \"8d97fc03-363d-4c04-9bad-08b08817cd63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pqkzx" Apr 16 18:10:43.295972 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.294069 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/20b76c8e-45ad-41f5-b3f5-1cc2a66d4881-agent-certs\") pod \"konnectivity-agent-d949z\" (UID: \"20b76c8e-45ad-41f5-b3f5-1cc2a66d4881\") " pod="kube-system/konnectivity-agent-d949z" Apr 16 18:10:43.295972 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.294103 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/91a81544-645b-4829-be2c-a425f6f14d64-etc-systemd\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b" Apr 16 18:10:43.295972 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.294130 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/91a81544-645b-4829-be2c-a425f6f14d64-etc-tuned\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b" Apr 16 18:10:43.295972 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.294155 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/439613fb-5d3f-4d29-b662-f86a49f8e289-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-k5m7r\" (UID: \"439613fb-5d3f-4d29-b662-f86a49f8e289\") " pod="openshift-multus/multus-additional-cni-plugins-k5m7r" Apr 16 18:10:43.295972 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.294191 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-system-cni-dir\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.295972 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.294215 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-var-lib-openvswitch\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" Apr 16 18:10:43.295972 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.294240 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8wxwh\" (UniqueName: \"kubernetes.io/projected/5e9d6b21-2120-44f2-8a4a-d991547263f2-kube-api-access-8wxwh\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" Apr 16 18:10:43.295972 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.294481 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/91a81544-645b-4829-be2c-a425f6f14d64-etc-systemd\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b" Apr 16 18:10:43.296739 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.294524 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5e9d6b21-2120-44f2-8a4a-d991547263f2-var-lib-openvswitch\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" Apr 16 18:10:43.296739 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.296011 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5e9d6b21-2120-44f2-8a4a-d991547263f2-ovn-node-metrics-cert\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" Apr 16 18:10:43.296739 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.296047 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/91a81544-645b-4829-be2c-a425f6f14d64-tmp\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b" Apr 16 18:10:43.296739 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.296546 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/91a81544-645b-4829-be2c-a425f6f14d64-etc-tuned\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b" Apr 16 18:10:43.296739 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.296590 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/20b76c8e-45ad-41f5-b3f5-1cc2a66d4881-agent-certs\") pod \"konnectivity-agent-d949z\" (UID: \"20b76c8e-45ad-41f5-b3f5-1cc2a66d4881\") " pod="kube-system/konnectivity-agent-d949z" Apr 16 18:10:43.301106 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.301085 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqbcs\" (UniqueName: \"kubernetes.io/projected/76ccc7ff-6855-49c3-a0b5-185487ae8516-kube-api-access-kqbcs\") pod \"node-ca-jmzcb\" (UID: \"76ccc7ff-6855-49c3-a0b5-185487ae8516\") " pod="openshift-image-registry/node-ca-jmzcb" Apr 16 18:10:43.301649 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.301628 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qwsc\" (UniqueName: \"kubernetes.io/projected/1b075c15-1f6a-48c0-af94-65cd41bdc367-kube-api-access-2qwsc\") pod \"iptables-alerter-22zxr\" (UID: \"1b075c15-1f6a-48c0-af94-65cd41bdc367\") " pod="openshift-network-operator/iptables-alerter-22zxr" Apr 16 18:10:43.302612 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.302587 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2jgz\" (UniqueName: \"kubernetes.io/projected/91a81544-645b-4829-be2c-a425f6f14d64-kube-api-access-w2jgz\") pod \"tuned-lsm5b\" (UID: \"91a81544-645b-4829-be2c-a425f6f14d64\") " pod="openshift-cluster-node-tuning-operator/tuned-lsm5b" Apr 16 18:10:43.303173 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.303149 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8b6db\" (UniqueName: \"kubernetes.io/projected/22eec1cf-b2b9-495f-9507-ee4b6c6a9204-kube-api-access-8b6db\") pod \"network-metrics-daemon-fwqx5\" (UID: \"22eec1cf-b2b9-495f-9507-ee4b6c6a9204\") " pod="openshift-multus/network-metrics-daemon-fwqx5" Apr 16 18:10:43.304009 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.303993 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wxwh\" (UniqueName: \"kubernetes.io/projected/5e9d6b21-2120-44f2-8a4a-d991547263f2-kube-api-access-8wxwh\") pod \"ovnkube-node-b6sfk\" (UID: \"5e9d6b21-2120-44f2-8a4a-d991547263f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" Apr 16 18:10:43.395003 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.394917 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d97fc03-363d-4c04-9bad-08b08817cd63-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pqkzx\" (UID: \"8d97fc03-363d-4c04-9bad-08b08817cd63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pqkzx" Apr 16 18:10:43.395003 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.394975 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/439613fb-5d3f-4d29-b662-f86a49f8e289-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-k5m7r\" (UID: \"439613fb-5d3f-4d29-b662-f86a49f8e289\") " pod="openshift-multus/multus-additional-cni-plugins-k5m7r" Apr 16 18:10:43.395158 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395049 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d97fc03-363d-4c04-9bad-08b08817cd63-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pqkzx\" (UID: \"8d97fc03-363d-4c04-9bad-08b08817cd63\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pqkzx" Apr 16 18:10:43.395158 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395088 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-multus-cni-dir\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.395158 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395109 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fb4955c9-d4b5-4e21-a2ca-4d700832a59c-tmp-dir\") pod \"node-resolver-mb8s7\" (UID: \"fb4955c9-d4b5-4e21-a2ca-4d700832a59c\") " pod="openshift-dns/node-resolver-mb8s7" Apr 16 18:10:43.395158 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395125 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-host-run-k8s-cni-cncf-io\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.395158 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395142 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c6t2t\" (UniqueName: \"kubernetes.io/projected/8d97fc03-363d-4c04-9bad-08b08817cd63-kube-api-access-c6t2t\") pod \"aws-ebs-csi-driver-node-pqkzx\" (UID: \"8d97fc03-363d-4c04-9bad-08b08817cd63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pqkzx" Apr 16 18:10:43.395387 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395159 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/439613fb-5d3f-4d29-b662-f86a49f8e289-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-k5m7r\" (UID: \"439613fb-5d3f-4d29-b662-f86a49f8e289\") " pod="openshift-multus/multus-additional-cni-plugins-k5m7r" Apr 16 18:10:43.395387 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395176 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-system-cni-dir\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.395387 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395196 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8d97fc03-363d-4c04-9bad-08b08817cd63-sys-fs\") pod \"aws-ebs-csi-driver-node-pqkzx\" (UID: \"8d97fc03-363d-4c04-9bad-08b08817cd63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pqkzx" Apr 16 18:10:43.395387 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395221 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fb4955c9-d4b5-4e21-a2ca-4d700832a59c-hosts-file\") pod \"node-resolver-mb8s7\" (UID: \"fb4955c9-d4b5-4e21-a2ca-4d700832a59c\") " pod="openshift-dns/node-resolver-mb8s7" Apr 16 18:10:43.395387 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395246 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-cnibin\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.395387 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395270 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-hostroot\") pod \"multus-tkxn5\" (UID: 
\"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.395387 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395271 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-system-cni-dir\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.395387 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395297 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xpmct\" (UniqueName: \"kubernetes.io/projected/82232800-52de-476a-a364-558f49009263-kube-api-access-xpmct\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.395387 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395327 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/439613fb-5d3f-4d29-b662-f86a49f8e289-cnibin\") pod \"multus-additional-cni-plugins-k5m7r\" (UID: \"439613fb-5d3f-4d29-b662-f86a49f8e289\") " pod="openshift-multus/multus-additional-cni-plugins-k5m7r" Apr 16 18:10:43.395387 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395351 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lmf8w\" (UniqueName: \"kubernetes.io/projected/439613fb-5d3f-4d29-b662-f86a49f8e289-kube-api-access-lmf8w\") pod \"multus-additional-cni-plugins-k5m7r\" (UID: \"439613fb-5d3f-4d29-b662-f86a49f8e289\") " pod="openshift-multus/multus-additional-cni-plugins-k5m7r" Apr 16 18:10:43.395387 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395377 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/439613fb-5d3f-4d29-b662-f86a49f8e289-system-cni-dir\") pod 
\"multus-additional-cni-plugins-k5m7r\" (UID: \"439613fb-5d3f-4d29-b662-f86a49f8e289\") " pod="openshift-multus/multus-additional-cni-plugins-k5m7r" Apr 16 18:10:43.395910 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395399 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-host-var-lib-cni-multus\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.395910 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395427 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8d97fc03-363d-4c04-9bad-08b08817cd63-device-dir\") pod \"aws-ebs-csi-driver-node-pqkzx\" (UID: \"8d97fc03-363d-4c04-9bad-08b08817cd63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pqkzx" Apr 16 18:10:43.395910 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395456 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/82232800-52de-476a-a364-558f49009263-cni-binary-copy\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.395910 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395465 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/439613fb-5d3f-4d29-b662-f86a49f8e289-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-k5m7r\" (UID: \"439613fb-5d3f-4d29-b662-f86a49f8e289\") " pod="openshift-multus/multus-additional-cni-plugins-k5m7r" Apr 16 18:10:43.395910 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395483 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" 
(UniqueName: \"kubernetes.io/configmap/82232800-52de-476a-a364-558f49009263-multus-daemon-config\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.395910 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395457 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-host-run-k8s-cni-cncf-io\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.395910 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395515 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-host-run-netns\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.395910 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395522 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8d97fc03-363d-4c04-9bad-08b08817cd63-sys-fs\") pod \"aws-ebs-csi-driver-node-pqkzx\" (UID: \"8d97fc03-363d-4c04-9bad-08b08817cd63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pqkzx" Apr 16 18:10:43.395910 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395537 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-host-var-lib-cni-multus\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.395910 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395542 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-host-run-multus-certs\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.395910 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395568 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8d97fc03-363d-4c04-9bad-08b08817cd63-etc-selinux\") pod \"aws-ebs-csi-driver-node-pqkzx\" (UID: \"8d97fc03-363d-4c04-9bad-08b08817cd63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pqkzx" Apr 16 18:10:43.395910 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395572 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fb4955c9-d4b5-4e21-a2ca-4d700832a59c-hosts-file\") pod \"node-resolver-mb8s7\" (UID: \"fb4955c9-d4b5-4e21-a2ca-4d700832a59c\") " pod="openshift-dns/node-resolver-mb8s7" Apr 16 18:10:43.395910 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395592 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-multus-cni-dir\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.395910 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395603 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/439613fb-5d3f-4d29-b662-f86a49f8e289-tuning-conf-dir\") pod \"multus-additional-cni-plugins-k5m7r\" (UID: \"439613fb-5d3f-4d29-b662-f86a49f8e289\") " pod="openshift-multus/multus-additional-cni-plugins-k5m7r" Apr 16 18:10:43.395910 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395463 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/fb4955c9-d4b5-4e21-a2ca-4d700832a59c-tmp-dir\") pod \"node-resolver-mb8s7\" (UID: \"fb4955c9-d4b5-4e21-a2ca-4d700832a59c\") " pod="openshift-dns/node-resolver-mb8s7" Apr 16 18:10:43.395910 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395666 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/439613fb-5d3f-4d29-b662-f86a49f8e289-cnibin\") pod \"multus-additional-cni-plugins-k5m7r\" (UID: \"439613fb-5d3f-4d29-b662-f86a49f8e289\") " pod="openshift-multus/multus-additional-cni-plugins-k5m7r" Apr 16 18:10:43.395910 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395672 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-multus-socket-dir-parent\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.396606 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395733 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8d97fc03-363d-4c04-9bad-08b08817cd63-etc-selinux\") pod \"aws-ebs-csi-driver-node-pqkzx\" (UID: \"8d97fc03-363d-4c04-9bad-08b08817cd63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pqkzx" Apr 16 18:10:43.396606 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395761 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8d97fc03-363d-4c04-9bad-08b08817cd63-device-dir\") pod \"aws-ebs-csi-driver-node-pqkzx\" (UID: \"8d97fc03-363d-4c04-9bad-08b08817cd63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pqkzx" Apr 16 18:10:43.396606 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395775 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-host-run-multus-certs\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.396606 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395786 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-cnibin\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.396606 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395668 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-host-run-netns\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.396606 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395792 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/439613fb-5d3f-4d29-b662-f86a49f8e289-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-k5m7r\" (UID: \"439613fb-5d3f-4d29-b662-f86a49f8e289\") " pod="openshift-multus/multus-additional-cni-plugins-k5m7r" Apr 16 18:10:43.396606 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395829 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/439613fb-5d3f-4d29-b662-f86a49f8e289-system-cni-dir\") pod \"multus-additional-cni-plugins-k5m7r\" (UID: \"439613fb-5d3f-4d29-b662-f86a49f8e289\") " pod="openshift-multus/multus-additional-cni-plugins-k5m7r" Apr 16 18:10:43.396606 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395796 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8d97fc03-363d-4c04-9bad-08b08817cd63-registration-dir\") pod \"aws-ebs-csi-driver-node-pqkzx\" (UID: \"8d97fc03-363d-4c04-9bad-08b08817cd63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pqkzx" Apr 16 18:10:43.396606 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395791 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-multus-socket-dir-parent\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.396606 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395843 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-hostroot\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.396606 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395852 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8d97fc03-363d-4c04-9bad-08b08817cd63-registration-dir\") pod \"aws-ebs-csi-driver-node-pqkzx\" (UID: \"8d97fc03-363d-4c04-9bad-08b08817cd63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pqkzx" Apr 16 18:10:43.396606 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395883 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-host-var-lib-cni-bin\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.396606 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395909 2576 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-host-var-lib-kubelet\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.396606 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395935 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfwhj\" (UniqueName: \"kubernetes.io/projected/5ad84774-79a9-4253-9451-f7e900a7cb4d-kube-api-access-cfwhj\") pod \"network-check-target-r2j4b\" (UID: \"5ad84774-79a9-4253-9451-f7e900a7cb4d\") " pod="openshift-network-diagnostics/network-check-target-r2j4b" Apr 16 18:10:43.396606 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395940 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-host-var-lib-cni-bin\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.396606 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395960 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/439613fb-5d3f-4d29-b662-f86a49f8e289-cni-binary-copy\") pod \"multus-additional-cni-plugins-k5m7r\" (UID: \"439613fb-5d3f-4d29-b662-f86a49f8e289\") " pod="openshift-multus/multus-additional-cni-plugins-k5m7r" Apr 16 18:10:43.396606 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395956 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-host-var-lib-kubelet\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.397104 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.395984 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-os-release\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.397104 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.396008 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-multus-conf-dir\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.397104 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.396035 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-etc-kubernetes\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.397104 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.396060 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8d97fc03-363d-4c04-9bad-08b08817cd63-socket-dir\") pod \"aws-ebs-csi-driver-node-pqkzx\" (UID: \"8d97fc03-363d-4c04-9bad-08b08817cd63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pqkzx" Apr 16 18:10:43.397104 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.396083 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/439613fb-5d3f-4d29-b662-f86a49f8e289-os-release\") pod \"multus-additional-cni-plugins-k5m7r\" (UID: \"439613fb-5d3f-4d29-b662-f86a49f8e289\") " pod="openshift-multus/multus-additional-cni-plugins-k5m7r" Apr 16 18:10:43.397104 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.396088 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-os-release\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.397104 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.396108 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8m8ss\" (UniqueName: \"kubernetes.io/projected/fb4955c9-d4b5-4e21-a2ca-4d700832a59c-kube-api-access-8m8ss\") pod \"node-resolver-mb8s7\" (UID: \"fb4955c9-d4b5-4e21-a2ca-4d700832a59c\") " pod="openshift-dns/node-resolver-mb8s7" Apr 16 18:10:43.397104 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.396164 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-multus-conf-dir\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.397104 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.396230 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8d97fc03-363d-4c04-9bad-08b08817cd63-socket-dir\") pod \"aws-ebs-csi-driver-node-pqkzx\" (UID: \"8d97fc03-363d-4c04-9bad-08b08817cd63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pqkzx" Apr 16 18:10:43.397104 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.396235 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/82232800-52de-476a-a364-558f49009263-cni-binary-copy\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.397104 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.396248 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/82232800-52de-476a-a364-558f49009263-multus-daemon-config\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.397104 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.396233 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/439613fb-5d3f-4d29-b662-f86a49f8e289-os-release\") pod \"multus-additional-cni-plugins-k5m7r\" (UID: \"439613fb-5d3f-4d29-b662-f86a49f8e289\") " pod="openshift-multus/multus-additional-cni-plugins-k5m7r" Apr 16 18:10:43.397104 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.396264 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82232800-52de-476a-a364-558f49009263-etc-kubernetes\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.397104 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.396415 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/439613fb-5d3f-4d29-b662-f86a49f8e289-tuning-conf-dir\") pod \"multus-additional-cni-plugins-k5m7r\" (UID: \"439613fb-5d3f-4d29-b662-f86a49f8e289\") " pod="openshift-multus/multus-additional-cni-plugins-k5m7r" Apr 16 18:10:43.397104 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.396485 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/439613fb-5d3f-4d29-b662-f86a49f8e289-cni-binary-copy\") pod \"multus-additional-cni-plugins-k5m7r\" (UID: \"439613fb-5d3f-4d29-b662-f86a49f8e289\") " pod="openshift-multus/multus-additional-cni-plugins-k5m7r" Apr 16 18:10:43.403808 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:43.403788 2576 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:10:43.403808 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:43.403810 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:10:43.403982 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:43.403822 2576 projected.go:194] Error preparing data for projected volume kube-api-access-cfwhj for pod openshift-network-diagnostics/network-check-target-r2j4b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:43.403982 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:43.403882 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ad84774-79a9-4253-9451-f7e900a7cb4d-kube-api-access-cfwhj podName:5ad84774-79a9-4253-9451-f7e900a7cb4d nodeName:}" failed. No retries permitted until 2026-04-16 18:10:43.903864935 +0000 UTC m=+3.105440185 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cfwhj" (UniqueName: "kubernetes.io/projected/5ad84774-79a9-4253-9451-f7e900a7cb4d-kube-api-access-cfwhj") pod "network-check-target-r2j4b" (UID: "5ad84774-79a9-4253-9451-f7e900a7cb4d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:43.404564 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.404542 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6t2t\" (UniqueName: \"kubernetes.io/projected/8d97fc03-363d-4c04-9bad-08b08817cd63-kube-api-access-c6t2t\") pod \"aws-ebs-csi-driver-node-pqkzx\" (UID: \"8d97fc03-363d-4c04-9bad-08b08817cd63\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pqkzx" Apr 16 18:10:43.404817 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.404792 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpmct\" (UniqueName: \"kubernetes.io/projected/82232800-52de-476a-a364-558f49009263-kube-api-access-xpmct\") pod \"multus-tkxn5\" (UID: \"82232800-52de-476a-a364-558f49009263\") " pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.406029 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.406008 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmf8w\" (UniqueName: \"kubernetes.io/projected/439613fb-5d3f-4d29-b662-f86a49f8e289-kube-api-access-lmf8w\") pod \"multus-additional-cni-plugins-k5m7r\" (UID: \"439613fb-5d3f-4d29-b662-f86a49f8e289\") " pod="openshift-multus/multus-additional-cni-plugins-k5m7r" Apr 16 18:10:43.406136 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.406010 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m8ss\" (UniqueName: \"kubernetes.io/projected/fb4955c9-d4b5-4e21-a2ca-4d700832a59c-kube-api-access-8m8ss\") pod \"node-resolver-mb8s7\" (UID: 
\"fb4955c9-d4b5-4e21-a2ca-4d700832a59c\") " pod="openshift-dns/node-resolver-mb8s7" Apr 16 18:10:43.488427 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.488400 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jmzcb" Apr 16 18:10:43.493499 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.493478 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:10:43.496683 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.496664 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-d949z" Apr 16 18:10:43.505138 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.505117 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-lsm5b" Apr 16 18:10:43.510621 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.510605 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-22zxr" Apr 16 18:10:43.517186 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.517165 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" Apr 16 18:10:43.522735 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.522715 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pqkzx" Apr 16 18:10:43.530219 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.530199 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-mb8s7" Apr 16 18:10:43.535734 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.535716 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-k5m7r" Apr 16 18:10:43.541224 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.541208 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-tkxn5" Apr 16 18:10:43.798791 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:43.798691 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/22eec1cf-b2b9-495f-9507-ee4b6c6a9204-metrics-certs\") pod \"network-metrics-daemon-fwqx5\" (UID: \"22eec1cf-b2b9-495f-9507-ee4b6c6a9204\") " pod="openshift-multus/network-metrics-daemon-fwqx5" Apr 16 18:10:43.798925 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:43.798838 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:43.798979 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:43.798926 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22eec1cf-b2b9-495f-9507-ee4b6c6a9204-metrics-certs podName:22eec1cf-b2b9-495f-9507-ee4b6c6a9204 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:44.798903796 +0000 UTC m=+4.000479054 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/22eec1cf-b2b9-495f-9507-ee4b6c6a9204-metrics-certs") pod "network-metrics-daemon-fwqx5" (UID: "22eec1cf-b2b9-495f-9507-ee4b6c6a9204") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:43.895554 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:43.895520 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb4955c9_d4b5_4e21_a2ca_4d700832a59c.slice/crio-faa60b836282bb422793f1333ff2311bb365792f54886650afc74fac65dde966 WatchSource:0}: Error finding container faa60b836282bb422793f1333ff2311bb365792f54886650afc74fac65dde966: Status 404 returned error can't find the container with id faa60b836282bb422793f1333ff2311bb365792f54886650afc74fac65dde966 Apr 16 18:10:43.899744 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:43.899714 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d97fc03_363d_4c04_9bad_08b08817cd63.slice/crio-767010e5d97e80ffeaf5b118e67b2d6637d19e487e8e7a4122d394d806a9d5a9 WatchSource:0}: Error finding container 767010e5d97e80ffeaf5b118e67b2d6637d19e487e8e7a4122d394d806a9d5a9: Status 404 returned error can't find the container with id 767010e5d97e80ffeaf5b118e67b2d6637d19e487e8e7a4122d394d806a9d5a9 Apr 16 18:10:43.901182 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:43.901157 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e9d6b21_2120_44f2_8a4a_d991547263f2.slice/crio-8cd3af9e629646fabc7921f4b48a7617b145427611febf7491954d3e9b06a249 WatchSource:0}: Error finding container 8cd3af9e629646fabc7921f4b48a7617b145427611febf7491954d3e9b06a249: Status 404 returned error can't find the container with id 8cd3af9e629646fabc7921f4b48a7617b145427611febf7491954d3e9b06a249 Apr 16 18:10:43.902511 
ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:43.902491 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20b76c8e_45ad_41f5_b3f5_1cc2a66d4881.slice/crio-593dd6f908ced343a1a653a1455efba94c1e4017574c94c48645b118ff986d1e WatchSource:0}: Error finding container 593dd6f908ced343a1a653a1455efba94c1e4017574c94c48645b118ff986d1e: Status 404 returned error can't find the container with id 593dd6f908ced343a1a653a1455efba94c1e4017574c94c48645b118ff986d1e Apr 16 18:10:43.903082 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:43.903062 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b075c15_1f6a_48c0_af94_65cd41bdc367.slice/crio-cabdbd075059ea0377ba847d4a2815e3c23a20f2f8ad7dd96f9ed3691c25ef02 WatchSource:0}: Error finding container cabdbd075059ea0377ba847d4a2815e3c23a20f2f8ad7dd96f9ed3691c25ef02: Status 404 returned error can't find the container with id cabdbd075059ea0377ba847d4a2815e3c23a20f2f8ad7dd96f9ed3691c25ef02 Apr 16 18:10:43.905122 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:43.905101 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod439613fb_5d3f_4d29_b662_f86a49f8e289.slice/crio-98c834c7bfa40f33dbcb2f1a5f55dcae18cf4efc9cdf8be94e20a86089f22a14 WatchSource:0}: Error finding container 98c834c7bfa40f33dbcb2f1a5f55dcae18cf4efc9cdf8be94e20a86089f22a14: Status 404 returned error can't find the container with id 98c834c7bfa40f33dbcb2f1a5f55dcae18cf4efc9cdf8be94e20a86089f22a14 Apr 16 18:10:43.906066 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:43.906044 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76ccc7ff_6855_49c3_a0b5_185487ae8516.slice/crio-6b3e56b296152889dbfea8c0fe36c05a74abcc2bb0950ccfda10bc3187e8bebe WatchSource:0}: Error 
finding container 6b3e56b296152889dbfea8c0fe36c05a74abcc2bb0950ccfda10bc3187e8bebe: Status 404 returned error can't find the container with id 6b3e56b296152889dbfea8c0fe36c05a74abcc2bb0950ccfda10bc3187e8bebe Apr 16 18:10:43.909615 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:10:43.909386 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82232800_52de_476a_a364_558f49009263.slice/crio-3b28bf14bc7408c33d3aee4898d51cb137a485e6200c503d2f8ec8bb8878abc4 WatchSource:0}: Error finding container 3b28bf14bc7408c33d3aee4898d51cb137a485e6200c503d2f8ec8bb8878abc4: Status 404 returned error can't find the container with id 3b28bf14bc7408c33d3aee4898d51cb137a485e6200c503d2f8ec8bb8878abc4 Apr 16 18:10:44.000346 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:44.000237 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfwhj\" (UniqueName: \"kubernetes.io/projected/5ad84774-79a9-4253-9451-f7e900a7cb4d-kube-api-access-cfwhj\") pod \"network-check-target-r2j4b\" (UID: \"5ad84774-79a9-4253-9451-f7e900a7cb4d\") " pod="openshift-network-diagnostics/network-check-target-r2j4b" Apr 16 18:10:44.000452 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:44.000396 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:10:44.000452 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:44.000421 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:10:44.000452 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:44.000434 2576 projected.go:194] Error preparing data for projected volume kube-api-access-cfwhj for pod openshift-network-diagnostics/network-check-target-r2j4b: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:44.000609 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:44.000487 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ad84774-79a9-4253-9451-f7e900a7cb4d-kube-api-access-cfwhj podName:5ad84774-79a9-4253-9451-f7e900a7cb4d nodeName:}" failed. No retries permitted until 2026-04-16 18:10:45.000468126 +0000 UTC m=+4.202043395 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cfwhj" (UniqueName: "kubernetes.io/projected/5ad84774-79a9-4253-9451-f7e900a7cb4d-kube-api-access-cfwhj") pod "network-check-target-r2j4b" (UID: "5ad84774-79a9-4253-9451-f7e900a7cb4d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:44.231536 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:44.231388 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:05:42 +0000 UTC" deadline="2027-09-27 05:02:09.437898539 +0000 UTC" Apr 16 18:10:44.231536 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:44.231425 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12682h51m25.206477422s" Apr 16 18:10:44.343842 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:44.343105 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-96.ec2.internal" event={"ID":"ed8c920fb76e0e328ed5f4aa00cb172f","Type":"ContainerStarted","Data":"d488598c977702f3377e9a7c78e94acfab5184db99383aee5e6941fd794be78e"} Apr 16 18:10:44.346135 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:44.346087 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-node-tuning-operator/tuned-lsm5b" event={"ID":"91a81544-645b-4829-be2c-a425f6f14d64","Type":"ContainerStarted","Data":"d597a6b009786416bd0065b8897394cc052c7fd94d35bdf14e75aeba72cbfbf8"}
Apr 16 18:10:44.350447 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:44.350382 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tkxn5" event={"ID":"82232800-52de-476a-a364-558f49009263","Type":"ContainerStarted","Data":"3b28bf14bc7408c33d3aee4898d51cb137a485e6200c503d2f8ec8bb8878abc4"}
Apr 16 18:10:44.357859 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:44.357831 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k5m7r" event={"ID":"439613fb-5d3f-4d29-b662-f86a49f8e289","Type":"ContainerStarted","Data":"98c834c7bfa40f33dbcb2f1a5f55dcae18cf4efc9cdf8be94e20a86089f22a14"}
Apr 16 18:10:44.359514 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:44.359455 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-96.ec2.internal" podStartSLOduration=2.359439564 podStartE2EDuration="2.359439564s" podCreationTimestamp="2026-04-16 18:10:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:10:44.358076118 +0000 UTC m=+3.559651387" watchObservedRunningTime="2026-04-16 18:10:44.359439564 +0000 UTC m=+3.561014832"
Apr 16 18:10:44.365094 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:44.365064 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-22zxr" event={"ID":"1b075c15-1f6a-48c0-af94-65cd41bdc367","Type":"ContainerStarted","Data":"cabdbd075059ea0377ba847d4a2815e3c23a20f2f8ad7dd96f9ed3691c25ef02"}
Apr 16 18:10:44.369754 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:44.369684 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" event={"ID":"5e9d6b21-2120-44f2-8a4a-d991547263f2","Type":"ContainerStarted","Data":"8cd3af9e629646fabc7921f4b48a7617b145427611febf7491954d3e9b06a249"}
Apr 16 18:10:44.376029 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:44.376003 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pqkzx" event={"ID":"8d97fc03-363d-4c04-9bad-08b08817cd63","Type":"ContainerStarted","Data":"767010e5d97e80ffeaf5b118e67b2d6637d19e487e8e7a4122d394d806a9d5a9"}
Apr 16 18:10:44.380221 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:44.380177 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mb8s7" event={"ID":"fb4955c9-d4b5-4e21-a2ca-4d700832a59c","Type":"ContainerStarted","Data":"faa60b836282bb422793f1333ff2311bb365792f54886650afc74fac65dde966"}
Apr 16 18:10:44.381540 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:44.381485 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jmzcb" event={"ID":"76ccc7ff-6855-49c3-a0b5-185487ae8516","Type":"ContainerStarted","Data":"6b3e56b296152889dbfea8c0fe36c05a74abcc2bb0950ccfda10bc3187e8bebe"}
Apr 16 18:10:44.383568 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:44.383510 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-d949z" event={"ID":"20b76c8e-45ad-41f5-b3f5-1cc2a66d4881","Type":"ContainerStarted","Data":"593dd6f908ced343a1a653a1455efba94c1e4017574c94c48645b118ff986d1e"}
Apr 16 18:10:44.436822 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:44.436493 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:10:44.808861 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:44.808830 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/22eec1cf-b2b9-495f-9507-ee4b6c6a9204-metrics-certs\") pod \"network-metrics-daemon-fwqx5\" (UID: \"22eec1cf-b2b9-495f-9507-ee4b6c6a9204\") " pod="openshift-multus/network-metrics-daemon-fwqx5"
Apr 16 18:10:44.808971 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:44.808956 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:10:44.809029 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:44.809009 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22eec1cf-b2b9-495f-9507-ee4b6c6a9204-metrics-certs podName:22eec1cf-b2b9-495f-9507-ee4b6c6a9204 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:46.808992393 +0000 UTC m=+6.010567646 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/22eec1cf-b2b9-495f-9507-ee4b6c6a9204-metrics-certs") pod "network-metrics-daemon-fwqx5" (UID: "22eec1cf-b2b9-495f-9507-ee4b6c6a9204") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:10:45.010121 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:45.010051 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfwhj\" (UniqueName: \"kubernetes.io/projected/5ad84774-79a9-4253-9451-f7e900a7cb4d-kube-api-access-cfwhj\") pod \"network-check-target-r2j4b\" (UID: \"5ad84774-79a9-4253-9451-f7e900a7cb4d\") " pod="openshift-network-diagnostics/network-check-target-r2j4b"
Apr 16 18:10:45.010258 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:45.010207 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:10:45.010258 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:45.010226 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:10:45.010258 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:45.010238 2576 projected.go:194] Error preparing data for projected volume kube-api-access-cfwhj for pod openshift-network-diagnostics/network-check-target-r2j4b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:10:45.010415 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:45.010294 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ad84774-79a9-4253-9451-f7e900a7cb4d-kube-api-access-cfwhj podName:5ad84774-79a9-4253-9451-f7e900a7cb4d nodeName:}" failed. No retries permitted until 2026-04-16 18:10:47.01027672 +0000 UTC m=+6.211851970 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cfwhj" (UniqueName: "kubernetes.io/projected/5ad84774-79a9-4253-9451-f7e900a7cb4d-kube-api-access-cfwhj") pod "network-check-target-r2j4b" (UID: "5ad84774-79a9-4253-9451-f7e900a7cb4d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:10:45.322607 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:45.321847 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwqx5"
Apr 16 18:10:45.322607 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:45.321974 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwqx5" podUID="22eec1cf-b2b9-495f-9507-ee4b6c6a9204"
Apr 16 18:10:45.322607 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:45.322425 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r2j4b"
Apr 16 18:10:45.322607 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:45.322518 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r2j4b" podUID="5ad84774-79a9-4253-9451-f7e900a7cb4d"
Apr 16 18:10:45.393249 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:45.392235 2576 generic.go:358] "Generic (PLEG): container finished" podID="5192824af33e3f669380bf45ab3cf23a" containerID="6799bc176fbc85a185b9cfbcde425b8b973bf40b430e29e8c590f822aa39c211" exitCode=0
Apr 16 18:10:45.393249 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:45.393067 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal" event={"ID":"5192824af33e3f669380bf45ab3cf23a","Type":"ContainerDied","Data":"6799bc176fbc85a185b9cfbcde425b8b973bf40b430e29e8c590f822aa39c211"}
Apr 16 18:10:46.402272 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:46.402236 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal" event={"ID":"5192824af33e3f669380bf45ab3cf23a","Type":"ContainerStarted","Data":"670b1e029006c643a72f425fab81bb435cc59c94d9df790cc0c21245e07015ed"}
Apr 16 18:10:46.418602 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:46.418555 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal" podStartSLOduration=4.418538155 podStartE2EDuration="4.418538155s" podCreationTimestamp="2026-04-16 18:10:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:10:46.41803578 +0000 UTC m=+5.619611061" watchObservedRunningTime="2026-04-16 18:10:46.418538155 +0000 UTC m=+5.620113424"
Apr 16 18:10:46.823883 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:46.823766 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/22eec1cf-b2b9-495f-9507-ee4b6c6a9204-metrics-certs\") pod \"network-metrics-daemon-fwqx5\" (UID: \"22eec1cf-b2b9-495f-9507-ee4b6c6a9204\") " pod="openshift-multus/network-metrics-daemon-fwqx5"
Apr 16 18:10:46.824049 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:46.823934 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:10:46.824049 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:46.824008 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22eec1cf-b2b9-495f-9507-ee4b6c6a9204-metrics-certs podName:22eec1cf-b2b9-495f-9507-ee4b6c6a9204 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:50.823988689 +0000 UTC m=+10.025563948 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/22eec1cf-b2b9-495f-9507-ee4b6c6a9204-metrics-certs") pod "network-metrics-daemon-fwqx5" (UID: "22eec1cf-b2b9-495f-9507-ee4b6c6a9204") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:10:47.024759 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:47.024722 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfwhj\" (UniqueName: \"kubernetes.io/projected/5ad84774-79a9-4253-9451-f7e900a7cb4d-kube-api-access-cfwhj\") pod \"network-check-target-r2j4b\" (UID: \"5ad84774-79a9-4253-9451-f7e900a7cb4d\") " pod="openshift-network-diagnostics/network-check-target-r2j4b"
Apr 16 18:10:47.024936 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:47.024915 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:10:47.025012 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:47.024939 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:10:47.025012 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:47.024952 2576 projected.go:194] Error preparing data for projected volume kube-api-access-cfwhj for pod openshift-network-diagnostics/network-check-target-r2j4b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:10:47.025012 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:47.025008 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ad84774-79a9-4253-9451-f7e900a7cb4d-kube-api-access-cfwhj podName:5ad84774-79a9-4253-9451-f7e900a7cb4d nodeName:}" failed. No retries permitted until 2026-04-16 18:10:51.024989506 +0000 UTC m=+10.226564754 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cfwhj" (UniqueName: "kubernetes.io/projected/5ad84774-79a9-4253-9451-f7e900a7cb4d-kube-api-access-cfwhj") pod "network-check-target-r2j4b" (UID: "5ad84774-79a9-4253-9451-f7e900a7cb4d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:10:47.322898 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:47.322815 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwqx5"
Apr 16 18:10:47.323044 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:47.322950 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwqx5" podUID="22eec1cf-b2b9-495f-9507-ee4b6c6a9204"
Apr 16 18:10:47.323375 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:47.323356 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r2j4b"
Apr 16 18:10:47.323477 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:47.323454 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r2j4b" podUID="5ad84774-79a9-4253-9451-f7e900a7cb4d"
Apr 16 18:10:49.322652 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:49.322613 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwqx5"
Apr 16 18:10:49.322652 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:49.322651 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r2j4b"
Apr 16 18:10:49.323246 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:49.322775 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwqx5" podUID="22eec1cf-b2b9-495f-9507-ee4b6c6a9204"
Apr 16 18:10:49.323246 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:49.322844 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r2j4b" podUID="5ad84774-79a9-4253-9451-f7e900a7cb4d"
Apr 16 18:10:50.854312 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:50.854266 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/22eec1cf-b2b9-495f-9507-ee4b6c6a9204-metrics-certs\") pod \"network-metrics-daemon-fwqx5\" (UID: \"22eec1cf-b2b9-495f-9507-ee4b6c6a9204\") " pod="openshift-multus/network-metrics-daemon-fwqx5"
Apr 16 18:10:50.854760 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:50.854426 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:10:50.854760 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:50.854491 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22eec1cf-b2b9-495f-9507-ee4b6c6a9204-metrics-certs podName:22eec1cf-b2b9-495f-9507-ee4b6c6a9204 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:58.854470846 +0000 UTC m=+18.056046095 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/22eec1cf-b2b9-495f-9507-ee4b6c6a9204-metrics-certs") pod "network-metrics-daemon-fwqx5" (UID: "22eec1cf-b2b9-495f-9507-ee4b6c6a9204") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:10:51.055249 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:51.055212 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfwhj\" (UniqueName: \"kubernetes.io/projected/5ad84774-79a9-4253-9451-f7e900a7cb4d-kube-api-access-cfwhj\") pod \"network-check-target-r2j4b\" (UID: \"5ad84774-79a9-4253-9451-f7e900a7cb4d\") " pod="openshift-network-diagnostics/network-check-target-r2j4b"
Apr 16 18:10:51.055478 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:51.055459 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:10:51.055560 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:51.055482 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:10:51.055560 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:51.055491 2576 projected.go:194] Error preparing data for projected volume kube-api-access-cfwhj for pod openshift-network-diagnostics/network-check-target-r2j4b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:10:51.055560 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:51.055545 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ad84774-79a9-4253-9451-f7e900a7cb4d-kube-api-access-cfwhj podName:5ad84774-79a9-4253-9451-f7e900a7cb4d nodeName:}" failed. No retries permitted until 2026-04-16 18:10:59.055526207 +0000 UTC m=+18.257101456 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cfwhj" (UniqueName: "kubernetes.io/projected/5ad84774-79a9-4253-9451-f7e900a7cb4d-kube-api-access-cfwhj") pod "network-check-target-r2j4b" (UID: "5ad84774-79a9-4253-9451-f7e900a7cb4d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:10:51.323315 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:51.323222 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwqx5"
Apr 16 18:10:51.323468 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:51.323345 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwqx5" podUID="22eec1cf-b2b9-495f-9507-ee4b6c6a9204"
Apr 16 18:10:51.323468 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:51.323352 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r2j4b"
Apr 16 18:10:51.323468 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:51.323428 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r2j4b" podUID="5ad84774-79a9-4253-9451-f7e900a7cb4d"
Apr 16 18:10:53.321762 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:53.321728 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r2j4b"
Apr 16 18:10:53.322132 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:53.321735 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwqx5"
Apr 16 18:10:53.322132 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:53.321862 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r2j4b" podUID="5ad84774-79a9-4253-9451-f7e900a7cb4d"
Apr 16 18:10:53.322132 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:53.321947 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwqx5" podUID="22eec1cf-b2b9-495f-9507-ee4b6c6a9204"
Apr 16 18:10:53.415508 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:53.415429 2576 generic.go:358] "Generic (PLEG): container finished" podID="439613fb-5d3f-4d29-b662-f86a49f8e289" containerID="57443f64198015bf3e732ac3ff4e2ece33990b1326fd0f279c0bb01f48a7b12b" exitCode=0
Apr 16 18:10:53.415508 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:53.415474 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k5m7r" event={"ID":"439613fb-5d3f-4d29-b662-f86a49f8e289","Type":"ContainerDied","Data":"57443f64198015bf3e732ac3ff4e2ece33990b1326fd0f279c0bb01f48a7b12b"}
Apr 16 18:10:53.418623 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:53.418596 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pqkzx" event={"ID":"8d97fc03-363d-4c04-9bad-08b08817cd63","Type":"ContainerStarted","Data":"184d5edb65b1b2dfd5f88bf61ed50aa1cd8bb814d081de6023fd53f69ac5db24"}
Apr 16 18:10:53.420627 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:53.420577 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mb8s7" event={"ID":"fb4955c9-d4b5-4e21-a2ca-4d700832a59c","Type":"ContainerStarted","Data":"876ebb57545b5077d9cc9bec7138ea360359e5f0cdd4c19cf47718550ad9621b"}
Apr 16 18:10:53.423240 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:53.423216 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jmzcb" event={"ID":"76ccc7ff-6855-49c3-a0b5-185487ae8516","Type":"ContainerStarted","Data":"a986bcb25b179f3279a1b2ea07758695433c92d798f380c0dc0fa1209afbd321"}
Apr 16 18:10:53.425027 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:53.425004 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-d949z" event={"ID":"20b76c8e-45ad-41f5-b3f5-1cc2a66d4881","Type":"ContainerStarted","Data":"5ab2d3979ec974859eae3943043c4d06fdc702bf8571d77e6b979cbbc8b76c30"}
Apr 16 18:10:53.426921 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:53.426899 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-lsm5b" event={"ID":"91a81544-645b-4829-be2c-a425f6f14d64","Type":"ContainerStarted","Data":"ff2551fb30cdf86a76c459850165fbe075d7e75552d8d5023b3a656eeb705bd7"}
Apr 16 18:10:53.471552 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:53.471490 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-mb8s7" podStartSLOduration=3.753158258 podStartE2EDuration="12.47147663s" podCreationTimestamp="2026-04-16 18:10:41 +0000 UTC" firstStartedPulling="2026-04-16 18:10:43.897777371 +0000 UTC m=+3.099352620" lastFinishedPulling="2026-04-16 18:10:52.616095737 +0000 UTC m=+11.817670992" observedRunningTime="2026-04-16 18:10:53.455542847 +0000 UTC m=+12.657118119" watchObservedRunningTime="2026-04-16 18:10:53.47147663 +0000 UTC m=+12.673051900"
Apr 16 18:10:53.471672 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:53.471584 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jmzcb" podStartSLOduration=3.794148087 podStartE2EDuration="12.471576746s" podCreationTimestamp="2026-04-16 18:10:41 +0000 UTC" firstStartedPulling="2026-04-16 18:10:43.910539885 +0000 UTC m=+3.112115131" lastFinishedPulling="2026-04-16 18:10:52.587968531 +0000 UTC m=+11.789543790" observedRunningTime="2026-04-16 18:10:53.470978332 +0000 UTC m=+12.672553602" watchObservedRunningTime="2026-04-16 18:10:53.471576746 +0000 UTC m=+12.673152016"
Apr 16 18:10:53.485595 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:53.485553 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-d949z" podStartSLOduration=3.807633506 podStartE2EDuration="12.4855392s" podCreationTimestamp="2026-04-16 18:10:41 +0000 UTC" firstStartedPulling="2026-04-16 18:10:43.90403418 +0000 UTC m=+3.105609425" lastFinishedPulling="2026-04-16 18:10:52.581939867 +0000 UTC m=+11.783515119" observedRunningTime="2026-04-16 18:10:53.485145768 +0000 UTC m=+12.686721036" watchObservedRunningTime="2026-04-16 18:10:53.4855392 +0000 UTC m=+12.687114469"
Apr 16 18:10:53.889405 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:53.889209 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 18:10:54.244896 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:54.244803 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T18:10:53.889403449Z","UUID":"d03e3aab-674c-4181-997e-e43bc9e1cfe5","Handler":null,"Name":"","Endpoint":""}
Apr 16 18:10:54.246786 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:54.246763 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 18:10:54.247029 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:54.246795 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 18:10:54.431417 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:54.431332 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pqkzx" event={"ID":"8d97fc03-363d-4c04-9bad-08b08817cd63","Type":"ContainerStarted","Data":"a2611ae37644354af789499184c63451fb937f409970a68c6e89e5d3ee82466f"}
Apr 16 18:10:54.433020 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:54.432987 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-22zxr" event={"ID":"1b075c15-1f6a-48c0-af94-65cd41bdc367","Type":"ContainerStarted","Data":"e8af0864d3cb4b1f5359546832d80baee853342cb8b29b4622fa9906be11e048"}
Apr 16 18:10:54.448113 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:54.448064 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-lsm5b" podStartSLOduration=4.7241538720000005 podStartE2EDuration="13.448047234s" podCreationTimestamp="2026-04-16 18:10:41 +0000 UTC" firstStartedPulling="2026-04-16 18:10:43.910458055 +0000 UTC m=+3.112033308" lastFinishedPulling="2026-04-16 18:10:52.634351407 +0000 UTC m=+11.835926670" observedRunningTime="2026-04-16 18:10:53.500851394 +0000 UTC m=+12.702426687" watchObservedRunningTime="2026-04-16 18:10:54.448047234 +0000 UTC m=+13.649622504"
Apr 16 18:10:55.321823 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:55.321795 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwqx5"
Apr 16 18:10:55.322039 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:55.321924 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwqx5" podUID="22eec1cf-b2b9-495f-9507-ee4b6c6a9204"
Apr 16 18:10:55.322039 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:55.321954 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r2j4b"
Apr 16 18:10:55.322157 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:55.322036 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r2j4b" podUID="5ad84774-79a9-4253-9451-f7e900a7cb4d"
Apr 16 18:10:57.321898 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:57.321867 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r2j4b"
Apr 16 18:10:57.322548 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:57.321872 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwqx5"
Apr 16 18:10:57.322548 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:57.321962 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r2j4b" podUID="5ad84774-79a9-4253-9451-f7e900a7cb4d"
Apr 16 18:10:57.322548 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:57.322084 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwqx5" podUID="22eec1cf-b2b9-495f-9507-ee4b6c6a9204"
Apr 16 18:10:57.851288 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:57.851244 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-d949z"
Apr 16 18:10:57.852009 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:57.851984 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-d949z"
Apr 16 18:10:57.867017 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:57.866973 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-22zxr" podStartSLOduration=8.185178596 podStartE2EDuration="16.866957226s" podCreationTimestamp="2026-04-16 18:10:41 +0000 UTC" firstStartedPulling="2026-04-16 18:10:43.905683822 +0000 UTC m=+3.107259081" lastFinishedPulling="2026-04-16 18:10:52.58746245 +0000 UTC m=+11.789037711" observedRunningTime="2026-04-16 18:10:54.447380814 +0000 UTC m=+13.648956123" watchObservedRunningTime="2026-04-16 18:10:57.866957226 +0000 UTC m=+17.068532493"
Apr 16 18:10:58.439818 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:58.439777 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-d949z"
Apr 16 18:10:58.440343 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:58.440323 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-d949z"
Apr 16 18:10:58.921395 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:58.921320 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/22eec1cf-b2b9-495f-9507-ee4b6c6a9204-metrics-certs\") pod \"network-metrics-daemon-fwqx5\" (UID: \"22eec1cf-b2b9-495f-9507-ee4b6c6a9204\") " pod="openshift-multus/network-metrics-daemon-fwqx5"
Apr 16 18:10:58.921547 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:58.921439 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:10:58.921547 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:58.921503 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22eec1cf-b2b9-495f-9507-ee4b6c6a9204-metrics-certs podName:22eec1cf-b2b9-495f-9507-ee4b6c6a9204 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:14.921488368 +0000 UTC m=+34.123063614 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/22eec1cf-b2b9-495f-9507-ee4b6c6a9204-metrics-certs") pod "network-metrics-daemon-fwqx5" (UID: "22eec1cf-b2b9-495f-9507-ee4b6c6a9204") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:10:59.122622 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:59.122589 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfwhj\" (UniqueName: \"kubernetes.io/projected/5ad84774-79a9-4253-9451-f7e900a7cb4d-kube-api-access-cfwhj\") pod \"network-check-target-r2j4b\" (UID: \"5ad84774-79a9-4253-9451-f7e900a7cb4d\") " pod="openshift-network-diagnostics/network-check-target-r2j4b"
Apr 16 18:10:59.122781 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:59.122752 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:10:59.122781 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:59.122767 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:10:59.122781 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:59.122777 2576 projected.go:194] Error preparing data for projected volume kube-api-access-cfwhj for pod openshift-network-diagnostics/network-check-target-r2j4b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:10:59.122914 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:59.122824 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ad84774-79a9-4253-9451-f7e900a7cb4d-kube-api-access-cfwhj podName:5ad84774-79a9-4253-9451-f7e900a7cb4d nodeName:}" failed. No retries permitted until 2026-04-16 18:11:15.122811806 +0000 UTC m=+34.324387051 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cfwhj" (UniqueName: "kubernetes.io/projected/5ad84774-79a9-4253-9451-f7e900a7cb4d-kube-api-access-cfwhj") pod "network-check-target-r2j4b" (UID: "5ad84774-79a9-4253-9451-f7e900a7cb4d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:10:59.321913 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:59.321839 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r2j4b"
Apr 16 18:10:59.322067 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:10:59.321850 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwqx5"
Apr 16 18:10:59.322067 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:59.321955 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r2j4b" podUID="5ad84774-79a9-4253-9451-f7e900a7cb4d"
Apr 16 18:10:59.322160 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:10:59.322064 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwqx5" podUID="22eec1cf-b2b9-495f-9507-ee4b6c6a9204"
Apr 16 18:11:01.323033 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:01.322995 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r2j4b"
Apr 16 18:11:01.323735 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:01.323092 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r2j4b" podUID="5ad84774-79a9-4253-9451-f7e900a7cb4d"
Apr 16 18:11:01.323735 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:01.323191 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwqx5"
Apr 16 18:11:01.323735 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:01.323307 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-fwqx5" podUID="22eec1cf-b2b9-495f-9507-ee4b6c6a9204" Apr 16 18:11:03.321544 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:03.321513 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwqx5" Apr 16 18:11:03.322080 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:03.321657 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwqx5" podUID="22eec1cf-b2b9-495f-9507-ee4b6c6a9204" Apr 16 18:11:03.322080 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:03.321727 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r2j4b" Apr 16 18:11:03.322080 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:03.321814 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-r2j4b" podUID="5ad84774-79a9-4253-9451-f7e900a7cb4d" Apr 16 18:11:04.449351 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:04.449039 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tkxn5" event={"ID":"82232800-52de-476a-a364-558f49009263","Type":"ContainerStarted","Data":"e95a44d05a2fb744b76ca1e332114947e1f073ea29bb3f2cd6a6eb0656b0d41b"} Apr 16 18:11:04.450765 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:04.450743 2576 generic.go:358] "Generic (PLEG): container finished" podID="439613fb-5d3f-4d29-b662-f86a49f8e289" containerID="f719a90fdc56902d40082b94ce99acd02472029e132c180f9024e03181c6bf3a" exitCode=0 Apr 16 18:11:04.450871 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:04.450810 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k5m7r" event={"ID":"439613fb-5d3f-4d29-b662-f86a49f8e289","Type":"ContainerDied","Data":"f719a90fdc56902d40082b94ce99acd02472029e132c180f9024e03181c6bf3a"} Apr 16 18:11:04.453639 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:04.453610 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" event={"ID":"5e9d6b21-2120-44f2-8a4a-d991547263f2","Type":"ContainerStarted","Data":"41b4bf2ede1ab84ad466fc73fa81abaef0f3432b69b5b2e627024cac6f7fe955"} Apr 16 18:11:04.453760 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:04.453648 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" event={"ID":"5e9d6b21-2120-44f2-8a4a-d991547263f2","Type":"ContainerStarted","Data":"e7ce696ade0018f86b84f7843259f43450e07d325d4f9b518e87900b9a30f761"} Apr 16 18:11:04.453760 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:04.453664 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" 
event={"ID":"5e9d6b21-2120-44f2-8a4a-d991547263f2","Type":"ContainerStarted","Data":"1c3e922b00c73b0bdb11df51692815597023968925348efcf2c8eaffc821cd76"} Apr 16 18:11:04.453760 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:04.453679 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" event={"ID":"5e9d6b21-2120-44f2-8a4a-d991547263f2","Type":"ContainerStarted","Data":"375daff6f1dce9f49db1e23d2427ee05667ce2f75624e800712fd15a7ec22350"} Apr 16 18:11:04.453760 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:04.453692 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" event={"ID":"5e9d6b21-2120-44f2-8a4a-d991547263f2","Type":"ContainerStarted","Data":"4f046f0dadbdb62eab682a132fdc4d93968197dc82a4be0c495650a396059ea9"} Apr 16 18:11:04.455569 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:04.455543 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pqkzx" event={"ID":"8d97fc03-363d-4c04-9bad-08b08817cd63","Type":"ContainerStarted","Data":"e4293f3b9ee98a8bbe8c972e5d38753565e31bdbc8149e32e07778d8f9677394"} Apr 16 18:11:04.472040 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:04.471977 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-tkxn5" podStartSLOduration=3.489611975 podStartE2EDuration="23.471966984s" podCreationTimestamp="2026-04-16 18:10:41 +0000 UTC" firstStartedPulling="2026-04-16 18:10:43.912338232 +0000 UTC m=+3.113913491" lastFinishedPulling="2026-04-16 18:11:03.894693254 +0000 UTC m=+23.096268500" observedRunningTime="2026-04-16 18:11:04.471870597 +0000 UTC m=+23.673445889" watchObservedRunningTime="2026-04-16 18:11:04.471966984 +0000 UTC m=+23.673542251" Apr 16 18:11:04.513837 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:04.513791 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pqkzx" podStartSLOduration=3.323738194 podStartE2EDuration="23.513776086s" podCreationTimestamp="2026-04-16 18:10:41 +0000 UTC" firstStartedPulling="2026-04-16 18:10:43.901387914 +0000 UTC m=+3.102963164" lastFinishedPulling="2026-04-16 18:11:04.091425799 +0000 UTC m=+23.293001056" observedRunningTime="2026-04-16 18:11:04.512990757 +0000 UTC m=+23.714566050" watchObservedRunningTime="2026-04-16 18:11:04.513776086 +0000 UTC m=+23.715351552" Apr 16 18:11:05.321985 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:05.321923 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwqx5" Apr 16 18:11:05.321985 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:05.321934 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r2j4b" Apr 16 18:11:05.322158 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:05.322051 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwqx5" podUID="22eec1cf-b2b9-495f-9507-ee4b6c6a9204" Apr 16 18:11:05.322219 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:05.322180 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-r2j4b" podUID="5ad84774-79a9-4253-9451-f7e900a7cb4d" Apr 16 18:11:05.458734 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:05.458708 2576 generic.go:358] "Generic (PLEG): container finished" podID="439613fb-5d3f-4d29-b662-f86a49f8e289" containerID="c77e713ed8513ac69951cdb64338c18eb7278d5a01ad8e1edbef60dd329d28c6" exitCode=0 Apr 16 18:11:05.459145 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:05.458777 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k5m7r" event={"ID":"439613fb-5d3f-4d29-b662-f86a49f8e289","Type":"ContainerDied","Data":"c77e713ed8513ac69951cdb64338c18eb7278d5a01ad8e1edbef60dd329d28c6"} Apr 16 18:11:05.461470 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:05.461399 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" event={"ID":"5e9d6b21-2120-44f2-8a4a-d991547263f2","Type":"ContainerStarted","Data":"1c919aac757164a83ca769efa0528336a2bcba133985b001de39f91fc65c24b9"} Apr 16 18:11:06.465418 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:06.465364 2576 generic.go:358] "Generic (PLEG): container finished" podID="439613fb-5d3f-4d29-b662-f86a49f8e289" containerID="201b82005e924f9f1d6814afd44a9094ca8c32eb9ccaa5e87c5f00bd723d6a56" exitCode=0 Apr 16 18:11:06.465729 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:06.465424 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k5m7r" event={"ID":"439613fb-5d3f-4d29-b662-f86a49f8e289","Type":"ContainerDied","Data":"201b82005e924f9f1d6814afd44a9094ca8c32eb9ccaa5e87c5f00bd723d6a56"} Apr 16 18:11:07.321871 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:07.321835 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r2j4b" Apr 16 18:11:07.322036 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:07.321964 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r2j4b" podUID="5ad84774-79a9-4253-9451-f7e900a7cb4d" Apr 16 18:11:07.322036 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:07.322000 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwqx5" Apr 16 18:11:07.322115 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:07.322083 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwqx5" podUID="22eec1cf-b2b9-495f-9507-ee4b6c6a9204" Apr 16 18:11:07.469769 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:07.469732 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" event={"ID":"5e9d6b21-2120-44f2-8a4a-d991547263f2","Type":"ContainerStarted","Data":"ac13dd06a63a6599668df04a4c3a6f1083502cfc1d856e74c14523189270d890"} Apr 16 18:11:09.322736 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:09.322472 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwqx5" Apr 16 18:11:09.323421 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:09.322848 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwqx5" podUID="22eec1cf-b2b9-495f-9507-ee4b6c6a9204" Apr 16 18:11:09.323421 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:09.322524 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r2j4b" Apr 16 18:11:09.323421 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:09.322966 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-r2j4b" podUID="5ad84774-79a9-4253-9451-f7e900a7cb4d" Apr 16 18:11:09.476818 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:09.476787 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" event={"ID":"5e9d6b21-2120-44f2-8a4a-d991547263f2","Type":"ContainerStarted","Data":"5cd9007098c2f731d86555b8345d710e5d0331337c22621855f48599840d7a9b"} Apr 16 18:11:09.477132 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:09.477104 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" Apr 16 18:11:09.477132 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:09.477137 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" Apr 16 18:11:09.493356 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:09.493334 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" Apr 16 18:11:09.503199 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:09.503152 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" podStartSLOduration=8.341018 podStartE2EDuration="28.503139036s" podCreationTimestamp="2026-04-16 18:10:41 +0000 UTC" firstStartedPulling="2026-04-16 18:10:43.903681729 +0000 UTC m=+3.105256984" lastFinishedPulling="2026-04-16 18:11:04.065802775 +0000 UTC m=+23.267378020" observedRunningTime="2026-04-16 18:11:09.502722766 +0000 UTC m=+28.704298031" watchObservedRunningTime="2026-04-16 18:11:09.503139036 +0000 UTC m=+28.704714301" Apr 16 18:11:10.479342 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:10.479305 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" Apr 16 18:11:10.494449 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:10.494254 
2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk" Apr 16 18:11:10.978009 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:10.977971 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fwqx5"] Apr 16 18:11:10.978189 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:10.978129 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwqx5" Apr 16 18:11:10.978278 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:10.978252 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwqx5" podUID="22eec1cf-b2b9-495f-9507-ee4b6c6a9204" Apr 16 18:11:10.979573 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:10.979550 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-r2j4b"] Apr 16 18:11:10.979709 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:10.979638 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r2j4b" Apr 16 18:11:10.979767 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:10.979729 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-r2j4b" podUID="5ad84774-79a9-4253-9451-f7e900a7cb4d" Apr 16 18:11:13.322097 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:13.322070 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwqx5" Apr 16 18:11:13.322432 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:13.322114 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r2j4b" Apr 16 18:11:13.322432 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:13.322217 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwqx5" podUID="22eec1cf-b2b9-495f-9507-ee4b6c6a9204" Apr 16 18:11:13.322432 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:13.322326 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-r2j4b" podUID="5ad84774-79a9-4253-9451-f7e900a7cb4d" Apr 16 18:11:13.486819 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:13.486790 2576 generic.go:358] "Generic (PLEG): container finished" podID="439613fb-5d3f-4d29-b662-f86a49f8e289" containerID="85d2b898e1f6e51a2df2dfbb0c61e95b32f0f1865fa02ccf853da255eddf5687" exitCode=0 Apr 16 18:11:13.486919 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:13.486841 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k5m7r" event={"ID":"439613fb-5d3f-4d29-b662-f86a49f8e289","Type":"ContainerDied","Data":"85d2b898e1f6e51a2df2dfbb0c61e95b32f0f1865fa02ccf853da255eddf5687"} Apr 16 18:11:14.491018 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:14.490985 2576 generic.go:358] "Generic (PLEG): container finished" podID="439613fb-5d3f-4d29-b662-f86a49f8e289" containerID="a6b4d2b51c1a286aae03e3330ff3758879e76b230e0436584b73a196197c6a59" exitCode=0 Apr 16 18:11:14.491349 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:14.491036 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k5m7r" event={"ID":"439613fb-5d3f-4d29-b662-f86a49f8e289","Type":"ContainerDied","Data":"a6b4d2b51c1a286aae03e3330ff3758879e76b230e0436584b73a196197c6a59"} Apr 16 18:11:14.950121 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:14.950046 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/22eec1cf-b2b9-495f-9507-ee4b6c6a9204-metrics-certs\") pod \"network-metrics-daemon-fwqx5\" (UID: \"22eec1cf-b2b9-495f-9507-ee4b6c6a9204\") " pod="openshift-multus/network-metrics-daemon-fwqx5" Apr 16 18:11:14.950269 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:14.950179 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 
18:11:14.950269 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:14.950231 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22eec1cf-b2b9-495f-9507-ee4b6c6a9204-metrics-certs podName:22eec1cf-b2b9-495f-9507-ee4b6c6a9204 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:46.95021905 +0000 UTC m=+66.151794301 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/22eec1cf-b2b9-495f-9507-ee4b6c6a9204-metrics-certs") pod "network-metrics-daemon-fwqx5" (UID: "22eec1cf-b2b9-495f-9507-ee4b6c6a9204") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:11:15.151079 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:15.151045 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfwhj\" (UniqueName: \"kubernetes.io/projected/5ad84774-79a9-4253-9451-f7e900a7cb4d-kube-api-access-cfwhj\") pod \"network-check-target-r2j4b\" (UID: \"5ad84774-79a9-4253-9451-f7e900a7cb4d\") " pod="openshift-network-diagnostics/network-check-target-r2j4b" Apr 16 18:11:15.151242 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:15.151222 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:11:15.151283 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:15.151249 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:11:15.151283 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:15.151261 2576 projected.go:194] Error preparing data for projected volume kube-api-access-cfwhj for pod openshift-network-diagnostics/network-check-target-r2j4b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:11:15.151358 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:15.151320 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ad84774-79a9-4253-9451-f7e900a7cb4d-kube-api-access-cfwhj podName:5ad84774-79a9-4253-9451-f7e900a7cb4d nodeName:}" failed. No retries permitted until 2026-04-16 18:11:47.151302132 +0000 UTC m=+66.352877381 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cfwhj" (UniqueName: "kubernetes.io/projected/5ad84774-79a9-4253-9451-f7e900a7cb4d-kube-api-access-cfwhj") pod "network-check-target-r2j4b" (UID: "5ad84774-79a9-4253-9451-f7e900a7cb4d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:11:15.322407 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:15.322336 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwqx5" Apr 16 18:11:15.322564 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:15.322336 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r2j4b" Apr 16 18:11:15.322564 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:15.322471 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fwqx5" podUID="22eec1cf-b2b9-495f-9507-ee4b6c6a9204" Apr 16 18:11:15.322564 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:15.322514 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r2j4b" podUID="5ad84774-79a9-4253-9451-f7e900a7cb4d" Apr 16 18:11:15.495936 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:15.495908 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k5m7r" event={"ID":"439613fb-5d3f-4d29-b662-f86a49f8e289","Type":"ContainerStarted","Data":"59f87786e1b4ac512e3ffc723b9e7ea973954d50191e8326ed2869228b780eb3"} Apr 16 18:11:15.522742 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:15.522680 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-k5m7r" podStartSLOduration=5.994937982 podStartE2EDuration="34.52266658s" podCreationTimestamp="2026-04-16 18:10:41 +0000 UTC" firstStartedPulling="2026-04-16 18:10:43.90956474 +0000 UTC m=+3.111139986" lastFinishedPulling="2026-04-16 18:11:12.437293324 +0000 UTC m=+31.638868584" observedRunningTime="2026-04-16 18:11:15.52099842 +0000 UTC m=+34.722573701" watchObservedRunningTime="2026-04-16 18:11:15.52266658 +0000 UTC m=+34.724241848" Apr 16 18:11:16.603795 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:16.603766 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-96.ec2.internal" event="NodeReady" Apr 16 18:11:16.604279 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:16.603862 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 18:11:16.650422 ip-10-0-139-96 
kubenswrapper[2576]: I0416 18:11:16.650389 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-5mn52"] Apr 16 18:11:16.673985 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:16.673953 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-clw7f"] Apr 16 18:11:16.674103 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:16.674007 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5mn52" Apr 16 18:11:16.676820 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:16.676796 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 18:11:16.676820 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:16.676811 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-n88vn\"" Apr 16 18:11:16.676820 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:16.676814 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 18:11:16.692853 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:16.692827 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-clw7f"] Apr 16 18:11:16.692948 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:16.692862 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5mn52"] Apr 16 18:11:16.692948 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:16.692835 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-clw7f" Apr 16 18:11:16.695436 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:16.695417 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 18:11:16.695534 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:16.695470 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 18:11:16.695534 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:16.695512 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 18:11:16.695643 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:16.695530 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-kd8rq\"" Apr 16 18:11:16.761449 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:16.761426 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a54e43de-5a67-45f3-b403-4317caee2eca-config-volume\") pod \"dns-default-5mn52\" (UID: \"a54e43de-5a67-45f3-b403-4317caee2eca\") " pod="openshift-dns/dns-default-5mn52" Apr 16 18:11:16.761547 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:16.761471 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a54e43de-5a67-45f3-b403-4317caee2eca-metrics-tls\") pod \"dns-default-5mn52\" (UID: \"a54e43de-5a67-45f3-b403-4317caee2eca\") " pod="openshift-dns/dns-default-5mn52" Apr 16 18:11:16.761547 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:16.761513 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/a54e43de-5a67-45f3-b403-4317caee2eca-tmp-dir\") pod \"dns-default-5mn52\" (UID: \"a54e43de-5a67-45f3-b403-4317caee2eca\") " pod="openshift-dns/dns-default-5mn52" Apr 16 18:11:16.761644 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:16.761576 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdrc9\" (UniqueName: \"kubernetes.io/projected/a54e43de-5a67-45f3-b403-4317caee2eca-kube-api-access-wdrc9\") pod \"dns-default-5mn52\" (UID: \"a54e43de-5a67-45f3-b403-4317caee2eca\") " pod="openshift-dns/dns-default-5mn52" Apr 16 18:11:16.862782 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:16.862762 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a54e43de-5a67-45f3-b403-4317caee2eca-metrics-tls\") pod \"dns-default-5mn52\" (UID: \"a54e43de-5a67-45f3-b403-4317caee2eca\") " pod="openshift-dns/dns-default-5mn52" Apr 16 18:11:16.862886 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:16.862796 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31b8e533-ba32-44f3-b6db-5c9e368510c6-cert\") pod \"ingress-canary-clw7f\" (UID: \"31b8e533-ba32-44f3-b6db-5c9e368510c6\") " pod="openshift-ingress-canary/ingress-canary-clw7f" Apr 16 18:11:16.862886 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:16.862827 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a54e43de-5a67-45f3-b403-4317caee2eca-tmp-dir\") pod \"dns-default-5mn52\" (UID: \"a54e43de-5a67-45f3-b403-4317caee2eca\") " pod="openshift-dns/dns-default-5mn52" Apr 16 18:11:16.862993 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:16.862919 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:11:16.862993 
ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:16.862987 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a54e43de-5a67-45f3-b403-4317caee2eca-metrics-tls podName:a54e43de-5a67-45f3-b403-4317caee2eca nodeName:}" failed. No retries permitted until 2026-04-16 18:11:17.362967407 +0000 UTC m=+36.564542666 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a54e43de-5a67-45f3-b403-4317caee2eca-metrics-tls") pod "dns-default-5mn52" (UID: "a54e43de-5a67-45f3-b403-4317caee2eca") : secret "dns-default-metrics-tls" not found Apr 16 18:11:16.863096 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:16.862915 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wdrc9\" (UniqueName: \"kubernetes.io/projected/a54e43de-5a67-45f3-b403-4317caee2eca-kube-api-access-wdrc9\") pod \"dns-default-5mn52\" (UID: \"a54e43de-5a67-45f3-b403-4317caee2eca\") " pod="openshift-dns/dns-default-5mn52" Apr 16 18:11:16.863096 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:16.863027 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khcf9\" (UniqueName: \"kubernetes.io/projected/31b8e533-ba32-44f3-b6db-5c9e368510c6-kube-api-access-khcf9\") pod \"ingress-canary-clw7f\" (UID: \"31b8e533-ba32-44f3-b6db-5c9e368510c6\") " pod="openshift-ingress-canary/ingress-canary-clw7f" Apr 16 18:11:16.863096 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:16.863059 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a54e43de-5a67-45f3-b403-4317caee2eca-config-volume\") pod \"dns-default-5mn52\" (UID: \"a54e43de-5a67-45f3-b403-4317caee2eca\") " pod="openshift-dns/dns-default-5mn52" Apr 16 18:11:16.863205 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:16.863146 2576 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a54e43de-5a67-45f3-b403-4317caee2eca-tmp-dir\") pod \"dns-default-5mn52\" (UID: \"a54e43de-5a67-45f3-b403-4317caee2eca\") " pod="openshift-dns/dns-default-5mn52" Apr 16 18:11:16.863491 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:16.863474 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a54e43de-5a67-45f3-b403-4317caee2eca-config-volume\") pod \"dns-default-5mn52\" (UID: \"a54e43de-5a67-45f3-b403-4317caee2eca\") " pod="openshift-dns/dns-default-5mn52" Apr 16 18:11:16.873362 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:16.873342 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdrc9\" (UniqueName: \"kubernetes.io/projected/a54e43de-5a67-45f3-b403-4317caee2eca-kube-api-access-wdrc9\") pod \"dns-default-5mn52\" (UID: \"a54e43de-5a67-45f3-b403-4317caee2eca\") " pod="openshift-dns/dns-default-5mn52" Apr 16 18:11:16.963453 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:16.963404 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khcf9\" (UniqueName: \"kubernetes.io/projected/31b8e533-ba32-44f3-b6db-5c9e368510c6-kube-api-access-khcf9\") pod \"ingress-canary-clw7f\" (UID: \"31b8e533-ba32-44f3-b6db-5c9e368510c6\") " pod="openshift-ingress-canary/ingress-canary-clw7f" Apr 16 18:11:16.963525 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:16.963451 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31b8e533-ba32-44f3-b6db-5c9e368510c6-cert\") pod \"ingress-canary-clw7f\" (UID: \"31b8e533-ba32-44f3-b6db-5c9e368510c6\") " pod="openshift-ingress-canary/ingress-canary-clw7f" Apr 16 18:11:16.963562 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:16.963553 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" 
not found Apr 16 18:11:16.963624 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:16.963613 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31b8e533-ba32-44f3-b6db-5c9e368510c6-cert podName:31b8e533-ba32-44f3-b6db-5c9e368510c6 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:17.463596983 +0000 UTC m=+36.665172244 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/31b8e533-ba32-44f3-b6db-5c9e368510c6-cert") pod "ingress-canary-clw7f" (UID: "31b8e533-ba32-44f3-b6db-5c9e368510c6") : secret "canary-serving-cert" not found Apr 16 18:11:16.971984 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:16.971960 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-khcf9\" (UniqueName: \"kubernetes.io/projected/31b8e533-ba32-44f3-b6db-5c9e368510c6-kube-api-access-khcf9\") pod \"ingress-canary-clw7f\" (UID: \"31b8e533-ba32-44f3-b6db-5c9e368510c6\") " pod="openshift-ingress-canary/ingress-canary-clw7f" Apr 16 18:11:17.321793 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:17.321720 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwqx5" Apr 16 18:11:17.321920 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:17.321880 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r2j4b" Apr 16 18:11:17.324628 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:17.324607 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 18:11:17.325787 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:17.325772 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-rbrbw\"" Apr 16 18:11:17.325787 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:17.325781 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xdm8g\"" Apr 16 18:11:17.325939 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:17.325773 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 18:11:17.325939 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:17.325832 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 18:11:17.366537 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:17.366517 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a54e43de-5a67-45f3-b403-4317caee2eca-metrics-tls\") pod \"dns-default-5mn52\" (UID: \"a54e43de-5a67-45f3-b403-4317caee2eca\") " pod="openshift-dns/dns-default-5mn52" Apr 16 18:11:17.366658 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:17.366641 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:11:17.366738 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:17.366727 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a54e43de-5a67-45f3-b403-4317caee2eca-metrics-tls 
podName:a54e43de-5a67-45f3-b403-4317caee2eca nodeName:}" failed. No retries permitted until 2026-04-16 18:11:18.366691103 +0000 UTC m=+37.568266365 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a54e43de-5a67-45f3-b403-4317caee2eca-metrics-tls") pod "dns-default-5mn52" (UID: "a54e43de-5a67-45f3-b403-4317caee2eca") : secret "dns-default-metrics-tls" not found Apr 16 18:11:17.467531 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:17.467510 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31b8e533-ba32-44f3-b6db-5c9e368510c6-cert\") pod \"ingress-canary-clw7f\" (UID: \"31b8e533-ba32-44f3-b6db-5c9e368510c6\") " pod="openshift-ingress-canary/ingress-canary-clw7f" Apr 16 18:11:17.467653 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:17.467639 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:11:17.467755 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:17.467745 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31b8e533-ba32-44f3-b6db-5c9e368510c6-cert podName:31b8e533-ba32-44f3-b6db-5c9e368510c6 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:18.467731923 +0000 UTC m=+37.669307169 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/31b8e533-ba32-44f3-b6db-5c9e368510c6-cert") pod "ingress-canary-clw7f" (UID: "31b8e533-ba32-44f3-b6db-5c9e368510c6") : secret "canary-serving-cert" not found Apr 16 18:11:18.374978 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:18.374947 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a54e43de-5a67-45f3-b403-4317caee2eca-metrics-tls\") pod \"dns-default-5mn52\" (UID: \"a54e43de-5a67-45f3-b403-4317caee2eca\") " pod="openshift-dns/dns-default-5mn52" Apr 16 18:11:18.375524 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:18.375056 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:11:18.375524 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:18.375109 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a54e43de-5a67-45f3-b403-4317caee2eca-metrics-tls podName:a54e43de-5a67-45f3-b403-4317caee2eca nodeName:}" failed. No retries permitted until 2026-04-16 18:11:20.375094581 +0000 UTC m=+39.576669827 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a54e43de-5a67-45f3-b403-4317caee2eca-metrics-tls") pod "dns-default-5mn52" (UID: "a54e43de-5a67-45f3-b403-4317caee2eca") : secret "dns-default-metrics-tls" not found Apr 16 18:11:18.477637 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:18.475419 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31b8e533-ba32-44f3-b6db-5c9e368510c6-cert\") pod \"ingress-canary-clw7f\" (UID: \"31b8e533-ba32-44f3-b6db-5c9e368510c6\") " pod="openshift-ingress-canary/ingress-canary-clw7f" Apr 16 18:11:18.477637 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:18.475570 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:11:18.477637 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:18.475629 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31b8e533-ba32-44f3-b6db-5c9e368510c6-cert podName:31b8e533-ba32-44f3-b6db-5c9e368510c6 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:20.475612775 +0000 UTC m=+39.677188042 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/31b8e533-ba32-44f3-b6db-5c9e368510c6-cert") pod "ingress-canary-clw7f" (UID: "31b8e533-ba32-44f3-b6db-5c9e368510c6") : secret "canary-serving-cert" not found Apr 16 18:11:20.388528 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:20.388493 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a54e43de-5a67-45f3-b403-4317caee2eca-metrics-tls\") pod \"dns-default-5mn52\" (UID: \"a54e43de-5a67-45f3-b403-4317caee2eca\") " pod="openshift-dns/dns-default-5mn52" Apr 16 18:11:20.388852 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:20.388639 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:11:20.388852 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:20.388723 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a54e43de-5a67-45f3-b403-4317caee2eca-metrics-tls podName:a54e43de-5a67-45f3-b403-4317caee2eca nodeName:}" failed. No retries permitted until 2026-04-16 18:11:24.388689117 +0000 UTC m=+43.590264363 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a54e43de-5a67-45f3-b403-4317caee2eca-metrics-tls") pod "dns-default-5mn52" (UID: "a54e43de-5a67-45f3-b403-4317caee2eca") : secret "dns-default-metrics-tls" not found Apr 16 18:11:20.489459 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:20.489431 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31b8e533-ba32-44f3-b6db-5c9e368510c6-cert\") pod \"ingress-canary-clw7f\" (UID: \"31b8e533-ba32-44f3-b6db-5c9e368510c6\") " pod="openshift-ingress-canary/ingress-canary-clw7f" Apr 16 18:11:20.489602 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:20.489561 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:11:20.489664 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:20.489626 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31b8e533-ba32-44f3-b6db-5c9e368510c6-cert podName:31b8e533-ba32-44f3-b6db-5c9e368510c6 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:24.489609092 +0000 UTC m=+43.691184340 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/31b8e533-ba32-44f3-b6db-5c9e368510c6-cert") pod "ingress-canary-clw7f" (UID: "31b8e533-ba32-44f3-b6db-5c9e368510c6") : secret "canary-serving-cert" not found Apr 16 18:11:24.411817 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:24.411766 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a54e43de-5a67-45f3-b403-4317caee2eca-metrics-tls\") pod \"dns-default-5mn52\" (UID: \"a54e43de-5a67-45f3-b403-4317caee2eca\") " pod="openshift-dns/dns-default-5mn52" Apr 16 18:11:24.412381 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:24.411912 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:11:24.412381 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:24.411982 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a54e43de-5a67-45f3-b403-4317caee2eca-metrics-tls podName:a54e43de-5a67-45f3-b403-4317caee2eca nodeName:}" failed. No retries permitted until 2026-04-16 18:11:32.411965759 +0000 UTC m=+51.613541005 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a54e43de-5a67-45f3-b403-4317caee2eca-metrics-tls") pod "dns-default-5mn52" (UID: "a54e43de-5a67-45f3-b403-4317caee2eca") : secret "dns-default-metrics-tls" not found Apr 16 18:11:24.512121 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:24.512097 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31b8e533-ba32-44f3-b6db-5c9e368510c6-cert\") pod \"ingress-canary-clw7f\" (UID: \"31b8e533-ba32-44f3-b6db-5c9e368510c6\") " pod="openshift-ingress-canary/ingress-canary-clw7f" Apr 16 18:11:24.512249 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:24.512224 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:11:24.512290 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:24.512277 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31b8e533-ba32-44f3-b6db-5c9e368510c6-cert podName:31b8e533-ba32-44f3-b6db-5c9e368510c6 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:32.51226348 +0000 UTC m=+51.713838726 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/31b8e533-ba32-44f3-b6db-5c9e368510c6-cert") pod "ingress-canary-clw7f" (UID: "31b8e533-ba32-44f3-b6db-5c9e368510c6") : secret "canary-serving-cert" not found Apr 16 18:11:32.457524 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:32.457486 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a54e43de-5a67-45f3-b403-4317caee2eca-metrics-tls\") pod \"dns-default-5mn52\" (UID: \"a54e43de-5a67-45f3-b403-4317caee2eca\") " pod="openshift-dns/dns-default-5mn52" Apr 16 18:11:32.458073 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:32.457629 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:11:32.458073 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:32.457689 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a54e43de-5a67-45f3-b403-4317caee2eca-metrics-tls podName:a54e43de-5a67-45f3-b403-4317caee2eca nodeName:}" failed. No retries permitted until 2026-04-16 18:11:48.457674859 +0000 UTC m=+67.659250109 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a54e43de-5a67-45f3-b403-4317caee2eca-metrics-tls") pod "dns-default-5mn52" (UID: "a54e43de-5a67-45f3-b403-4317caee2eca") : secret "dns-default-metrics-tls" not found Apr 16 18:11:32.558058 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:32.558031 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31b8e533-ba32-44f3-b6db-5c9e368510c6-cert\") pod \"ingress-canary-clw7f\" (UID: \"31b8e533-ba32-44f3-b6db-5c9e368510c6\") " pod="openshift-ingress-canary/ingress-canary-clw7f" Apr 16 18:11:32.558217 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:32.558141 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:11:32.558217 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:32.558196 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31b8e533-ba32-44f3-b6db-5c9e368510c6-cert podName:31b8e533-ba32-44f3-b6db-5c9e368510c6 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:48.558181791 +0000 UTC m=+67.759757040 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/31b8e533-ba32-44f3-b6db-5c9e368510c6-cert") pod "ingress-canary-clw7f" (UID: "31b8e533-ba32-44f3-b6db-5c9e368510c6") : secret "canary-serving-cert" not found Apr 16 18:11:41.126312 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.126275 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-j6fk5"] Apr 16 18:11:41.132397 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.132372 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-w28dq"] Apr 16 18:11:41.132561 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.132542 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6fk5" Apr 16 18:11:41.135404 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.135388 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-w28dq" Apr 16 18:11:41.135656 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.135634 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-47hvk\"" Apr 16 18:11:41.137022 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.136998 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 18:11:41.137570 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.137041 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 18:11:41.137570 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.137212 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 18:11:41.138208 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.138188 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-j6fk5"] Apr 16 18:11:41.139332 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.139309 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 16 18:11:41.139442 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.139339 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 18:11:41.139591 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.139568 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 16 18:11:41.139718 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.139594 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-qbqzg\"" Apr 16 18:11:41.140080 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.140017 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:11:41.140300 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.140264 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 16 18:11:41.145551 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.145527 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 18:11:41.145838 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.145816 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-w28dq"] Apr 16 18:11:41.230259 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.230228 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7k4g5"] Apr 16 18:11:41.233521 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.233506 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-f6kdh"] Apr 16 18:11:41.233671 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.233656 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7k4g5" Apr 16 18:11:41.236434 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.236267 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 16 18:11:41.236617 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.236597 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 16 18:11:41.237265 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.237242 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:11:41.237407 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.237382 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-kmk8g\"" Apr 16 18:11:41.238422 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.238321 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-75859bb697-t64g6"] Apr 16 18:11:41.238602 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.238586 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-f6kdh" Apr 16 18:11:41.241824 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.241806 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-75859bb697-t64g6" Apr 16 18:11:41.242520 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.242499 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 18:11:41.243077 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.243061 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 18:11:41.243163 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.243093 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-qgvv8\"" Apr 16 18:11:41.243163 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.243117 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 18:11:41.243466 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.243447 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 18:11:41.245684 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.245666 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 18:11:41.245971 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.245954 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-9692l\"" Apr 16 18:11:41.246494 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.246478 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 18:11:41.246667 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.246646 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 18:11:41.248495 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.248192 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-f6kdh"] Apr 16 18:11:41.249628 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.249506 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7k4g5"] Apr 16 18:11:41.251728 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.251690 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 18:11:41.253223 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.253190 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 18:11:41.255311 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.255291 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-75859bb697-t64g6"] Apr 16 18:11:41.318898 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.318876 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11e9383c-4bf6-4c5c-9dec-a8f2b642aff1-serving-cert\") pod \"console-operator-d87b8d5fc-w28dq\" (UID: \"11e9383c-4bf6-4c5c-9dec-a8f2b642aff1\") " pod="openshift-console-operator/console-operator-d87b8d5fc-w28dq" Apr 16 18:11:41.318998 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.318902 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/273ddeef-93ac-489e-893f-a85a3c28bdb6-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-j6fk5\" (UID: \"273ddeef-93ac-489e-893f-a85a3c28bdb6\") " 
pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6fk5" Apr 16 18:11:41.318998 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.318931 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g5bs\" (UniqueName: \"kubernetes.io/projected/11e9383c-4bf6-4c5c-9dec-a8f2b642aff1-kube-api-access-4g5bs\") pod \"console-operator-d87b8d5fc-w28dq\" (UID: \"11e9383c-4bf6-4c5c-9dec-a8f2b642aff1\") " pod="openshift-console-operator/console-operator-d87b8d5fc-w28dq" Apr 16 18:11:41.318998 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.318976 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/273ddeef-93ac-489e-893f-a85a3c28bdb6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-j6fk5\" (UID: \"273ddeef-93ac-489e-893f-a85a3c28bdb6\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6fk5" Apr 16 18:11:41.319103 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.319004 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw7hl\" (UniqueName: \"kubernetes.io/projected/273ddeef-93ac-489e-893f-a85a3c28bdb6-kube-api-access-sw7hl\") pod \"cluster-monitoring-operator-6667474d89-j6fk5\" (UID: \"273ddeef-93ac-489e-893f-a85a3c28bdb6\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6fk5" Apr 16 18:11:41.319103 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.319048 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/11e9383c-4bf6-4c5c-9dec-a8f2b642aff1-trusted-ca\") pod \"console-operator-d87b8d5fc-w28dq\" (UID: \"11e9383c-4bf6-4c5c-9dec-a8f2b642aff1\") " pod="openshift-console-operator/console-operator-d87b8d5fc-w28dq" Apr 16 18:11:41.319103 
ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.319095 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11e9383c-4bf6-4c5c-9dec-a8f2b642aff1-config\") pod \"console-operator-d87b8d5fc-w28dq\" (UID: \"11e9383c-4bf6-4c5c-9dec-a8f2b642aff1\") " pod="openshift-console-operator/console-operator-d87b8d5fc-w28dq" Apr 16 18:11:41.419822 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.419802 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2eddf300-6693-4839-94d4-0403e9b6e8c2-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-7k4g5\" (UID: \"2eddf300-6693-4839-94d4-0403e9b6e8c2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7k4g5" Apr 16 18:11:41.419922 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.419828 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/de4c4df9-8036-4492-9274-9f47dc6c8180-registry-certificates\") pod \"image-registry-75859bb697-t64g6\" (UID: \"de4c4df9-8036-4492-9274-9f47dc6c8180\") " pod="openshift-image-registry/image-registry-75859bb697-t64g6" Apr 16 18:11:41.419922 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.419853 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/11e9383c-4bf6-4c5c-9dec-a8f2b642aff1-trusted-ca\") pod \"console-operator-d87b8d5fc-w28dq\" (UID: \"11e9383c-4bf6-4c5c-9dec-a8f2b642aff1\") " pod="openshift-console-operator/console-operator-d87b8d5fc-w28dq" Apr 16 18:11:41.419922 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.419871 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/de4c4df9-8036-4492-9274-9f47dc6c8180-registry-tls\") pod \"image-registry-75859bb697-t64g6\" (UID: \"de4c4df9-8036-4492-9274-9f47dc6c8180\") " pod="openshift-image-registry/image-registry-75859bb697-t64g6" Apr 16 18:11:41.419922 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.419910 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5f5d\" (UniqueName: \"kubernetes.io/projected/de4c4df9-8036-4492-9274-9f47dc6c8180-kube-api-access-s5f5d\") pod \"image-registry-75859bb697-t64g6\" (UID: \"de4c4df9-8036-4492-9274-9f47dc6c8180\") " pod="openshift-image-registry/image-registry-75859bb697-t64g6" Apr 16 18:11:41.420120 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.419990 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8d34f403-81ae-4142-98c8-5c0168280de0-tmp\") pod \"insights-operator-5785d4fcdd-f6kdh\" (UID: \"8d34f403-81ae-4142-98c8-5c0168280de0\") " pod="openshift-insights/insights-operator-5785d4fcdd-f6kdh" Apr 16 18:11:41.420120 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.420033 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11e9383c-4bf6-4c5c-9dec-a8f2b642aff1-config\") pod \"console-operator-d87b8d5fc-w28dq\" (UID: \"11e9383c-4bf6-4c5c-9dec-a8f2b642aff1\") " pod="openshift-console-operator/console-operator-d87b8d5fc-w28dq" Apr 16 18:11:41.420120 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.420049 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bptwk\" (UniqueName: \"kubernetes.io/projected/2eddf300-6693-4839-94d4-0403e9b6e8c2-kube-api-access-bptwk\") pod \"cluster-samples-operator-667775844f-7k4g5\" (UID: \"2eddf300-6693-4839-94d4-0403e9b6e8c2\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7k4g5" Apr 16 18:11:41.420120 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.420064 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de4c4df9-8036-4492-9274-9f47dc6c8180-trusted-ca\") pod \"image-registry-75859bb697-t64g6\" (UID: \"de4c4df9-8036-4492-9274-9f47dc6c8180\") " pod="openshift-image-registry/image-registry-75859bb697-t64g6" Apr 16 18:11:41.420120 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.420081 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d34f403-81ae-4142-98c8-5c0168280de0-serving-cert\") pod \"insights-operator-5785d4fcdd-f6kdh\" (UID: \"8d34f403-81ae-4142-98c8-5c0168280de0\") " pod="openshift-insights/insights-operator-5785d4fcdd-f6kdh" Apr 16 18:11:41.420120 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.420102 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/de4c4df9-8036-4492-9274-9f47dc6c8180-ca-trust-extracted\") pod \"image-registry-75859bb697-t64g6\" (UID: \"de4c4df9-8036-4492-9274-9f47dc6c8180\") " pod="openshift-image-registry/image-registry-75859bb697-t64g6" Apr 16 18:11:41.420377 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.420171 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8d34f403-81ae-4142-98c8-5c0168280de0-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-f6kdh\" (UID: \"8d34f403-81ae-4142-98c8-5c0168280de0\") " pod="openshift-insights/insights-operator-5785d4fcdd-f6kdh" Apr 16 18:11:41.420377 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.420204 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11e9383c-4bf6-4c5c-9dec-a8f2b642aff1-serving-cert\") pod \"console-operator-d87b8d5fc-w28dq\" (UID: \"11e9383c-4bf6-4c5c-9dec-a8f2b642aff1\") " pod="openshift-console-operator/console-operator-d87b8d5fc-w28dq" Apr 16 18:11:41.420377 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.420221 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/273ddeef-93ac-489e-893f-a85a3c28bdb6-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-j6fk5\" (UID: \"273ddeef-93ac-489e-893f-a85a3c28bdb6\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6fk5" Apr 16 18:11:41.420377 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.420244 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/8d34f403-81ae-4142-98c8-5c0168280de0-snapshots\") pod \"insights-operator-5785d4fcdd-f6kdh\" (UID: \"8d34f403-81ae-4142-98c8-5c0168280de0\") " pod="openshift-insights/insights-operator-5785d4fcdd-f6kdh" Apr 16 18:11:41.420377 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.420270 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8d34f403-81ae-4142-98c8-5c0168280de0-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-f6kdh\" (UID: \"8d34f403-81ae-4142-98c8-5c0168280de0\") " pod="openshift-insights/insights-operator-5785d4fcdd-f6kdh" Apr 16 18:11:41.420377 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.420299 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lctz5\" (UniqueName: \"kubernetes.io/projected/8d34f403-81ae-4142-98c8-5c0168280de0-kube-api-access-lctz5\") pod 
\"insights-operator-5785d4fcdd-f6kdh\" (UID: \"8d34f403-81ae-4142-98c8-5c0168280de0\") " pod="openshift-insights/insights-operator-5785d4fcdd-f6kdh" Apr 16 18:11:41.420377 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.420328 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/de4c4df9-8036-4492-9274-9f47dc6c8180-image-registry-private-configuration\") pod \"image-registry-75859bb697-t64g6\" (UID: \"de4c4df9-8036-4492-9274-9f47dc6c8180\") " pod="openshift-image-registry/image-registry-75859bb697-t64g6" Apr 16 18:11:41.420377 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.420377 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4g5bs\" (UniqueName: \"kubernetes.io/projected/11e9383c-4bf6-4c5c-9dec-a8f2b642aff1-kube-api-access-4g5bs\") pod \"console-operator-d87b8d5fc-w28dq\" (UID: \"11e9383c-4bf6-4c5c-9dec-a8f2b642aff1\") " pod="openshift-console-operator/console-operator-d87b8d5fc-w28dq" Apr 16 18:11:41.420685 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.420405 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/de4c4df9-8036-4492-9274-9f47dc6c8180-installation-pull-secrets\") pod \"image-registry-75859bb697-t64g6\" (UID: \"de4c4df9-8036-4492-9274-9f47dc6c8180\") " pod="openshift-image-registry/image-registry-75859bb697-t64g6" Apr 16 18:11:41.420685 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.420431 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de4c4df9-8036-4492-9274-9f47dc6c8180-bound-sa-token\") pod \"image-registry-75859bb697-t64g6\" (UID: \"de4c4df9-8036-4492-9274-9f47dc6c8180\") " pod="openshift-image-registry/image-registry-75859bb697-t64g6" Apr 
16 18:11:41.420685 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.420461 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/273ddeef-93ac-489e-893f-a85a3c28bdb6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-j6fk5\" (UID: \"273ddeef-93ac-489e-893f-a85a3c28bdb6\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6fk5" Apr 16 18:11:41.420685 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.420485 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sw7hl\" (UniqueName: \"kubernetes.io/projected/273ddeef-93ac-489e-893f-a85a3c28bdb6-kube-api-access-sw7hl\") pod \"cluster-monitoring-operator-6667474d89-j6fk5\" (UID: \"273ddeef-93ac-489e-893f-a85a3c28bdb6\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6fk5" Apr 16 18:11:41.423163 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.423144 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 16 18:11:41.423246 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.423172 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 16 18:11:41.424366 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.424353 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 18:11:41.424421 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.424406 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 18:11:41.429225 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.429203 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 18:11:41.429566 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.429550 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 18:11:41.429652 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.429565 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 16 18:11:41.430663 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:41.430649 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:11:41.430756 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:41.430745 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/273ddeef-93ac-489e-893f-a85a3c28bdb6-cluster-monitoring-operator-tls podName:273ddeef-93ac-489e-893f-a85a3c28bdb6 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:41.930723452 +0000 UTC m=+61.132298713 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/273ddeef-93ac-489e-893f-a85a3c28bdb6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-j6fk5" (UID: "273ddeef-93ac-489e-893f-a85a3c28bdb6") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:11:41.431120 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.431102 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11e9383c-4bf6-4c5c-9dec-a8f2b642aff1-config\") pod \"console-operator-d87b8d5fc-w28dq\" (UID: \"11e9383c-4bf6-4c5c-9dec-a8f2b642aff1\") " pod="openshift-console-operator/console-operator-d87b8d5fc-w28dq" Apr 16 18:11:41.431247 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.431230 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/273ddeef-93ac-489e-893f-a85a3c28bdb6-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-j6fk5\" (UID: \"273ddeef-93ac-489e-893f-a85a3c28bdb6\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6fk5" Apr 16 18:11:41.431306 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.431274 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/11e9383c-4bf6-4c5c-9dec-a8f2b642aff1-trusted-ca\") pod \"console-operator-d87b8d5fc-w28dq\" (UID: \"11e9383c-4bf6-4c5c-9dec-a8f2b642aff1\") " pod="openshift-console-operator/console-operator-d87b8d5fc-w28dq" Apr 16 18:11:41.433223 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.433206 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11e9383c-4bf6-4c5c-9dec-a8f2b642aff1-serving-cert\") pod \"console-operator-d87b8d5fc-w28dq\" (UID: \"11e9383c-4bf6-4c5c-9dec-a8f2b642aff1\") " 
pod="openshift-console-operator/console-operator-d87b8d5fc-w28dq" Apr 16 18:11:41.440493 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.440472 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:11:41.440562 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.440472 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 18:11:41.450802 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.450777 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw7hl\" (UniqueName: \"kubernetes.io/projected/273ddeef-93ac-489e-893f-a85a3c28bdb6-kube-api-access-sw7hl\") pod \"cluster-monitoring-operator-6667474d89-j6fk5\" (UID: \"273ddeef-93ac-489e-893f-a85a3c28bdb6\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6fk5" Apr 16 18:11:41.450870 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.450825 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g5bs\" (UniqueName: \"kubernetes.io/projected/11e9383c-4bf6-4c5c-9dec-a8f2b642aff1-kube-api-access-4g5bs\") pod \"console-operator-d87b8d5fc-w28dq\" (UID: \"11e9383c-4bf6-4c5c-9dec-a8f2b642aff1\") " pod="openshift-console-operator/console-operator-d87b8d5fc-w28dq" Apr 16 18:11:41.470453 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.470436 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-qbqzg\"" Apr 16 18:11:41.477967 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.477955 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-w28dq" Apr 16 18:11:41.521126 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.521099 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/de4c4df9-8036-4492-9274-9f47dc6c8180-installation-pull-secrets\") pod \"image-registry-75859bb697-t64g6\" (UID: \"de4c4df9-8036-4492-9274-9f47dc6c8180\") " pod="openshift-image-registry/image-registry-75859bb697-t64g6" Apr 16 18:11:41.521126 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.521127 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de4c4df9-8036-4492-9274-9f47dc6c8180-bound-sa-token\") pod \"image-registry-75859bb697-t64g6\" (UID: \"de4c4df9-8036-4492-9274-9f47dc6c8180\") " pod="openshift-image-registry/image-registry-75859bb697-t64g6" Apr 16 18:11:41.521305 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.521160 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2eddf300-6693-4839-94d4-0403e9b6e8c2-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-7k4g5\" (UID: \"2eddf300-6693-4839-94d4-0403e9b6e8c2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7k4g5" Apr 16 18:11:41.521305 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.521177 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/de4c4df9-8036-4492-9274-9f47dc6c8180-registry-certificates\") pod \"image-registry-75859bb697-t64g6\" (UID: \"de4c4df9-8036-4492-9274-9f47dc6c8180\") " pod="openshift-image-registry/image-registry-75859bb697-t64g6" Apr 16 18:11:41.521305 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:41.521263 2576 secret.go:189] Couldn't get secret 
openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 18:11:41.521447 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:41.521346 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2eddf300-6693-4839-94d4-0403e9b6e8c2-samples-operator-tls podName:2eddf300-6693-4839-94d4-0403e9b6e8c2 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:42.021324119 +0000 UTC m=+61.222899365 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2eddf300-6693-4839-94d4-0403e9b6e8c2-samples-operator-tls") pod "cluster-samples-operator-667775844f-7k4g5" (UID: "2eddf300-6693-4839-94d4-0403e9b6e8c2") : secret "samples-operator-tls" not found Apr 16 18:11:41.521447 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.521386 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de4c4df9-8036-4492-9274-9f47dc6c8180-registry-tls\") pod \"image-registry-75859bb697-t64g6\" (UID: \"de4c4df9-8036-4492-9274-9f47dc6c8180\") " pod="openshift-image-registry/image-registry-75859bb697-t64g6" Apr 16 18:11:41.521447 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.521425 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5f5d\" (UniqueName: \"kubernetes.io/projected/de4c4df9-8036-4492-9274-9f47dc6c8180-kube-api-access-s5f5d\") pod \"image-registry-75859bb697-t64g6\" (UID: \"de4c4df9-8036-4492-9274-9f47dc6c8180\") " pod="openshift-image-registry/image-registry-75859bb697-t64g6" Apr 16 18:11:41.521605 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.521466 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8d34f403-81ae-4142-98c8-5c0168280de0-tmp\") pod \"insights-operator-5785d4fcdd-f6kdh\" (UID: 
\"8d34f403-81ae-4142-98c8-5c0168280de0\") " pod="openshift-insights/insights-operator-5785d4fcdd-f6kdh" Apr 16 18:11:41.521605 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:41.521497 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:11:41.521605 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:41.521511 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-75859bb697-t64g6: secret "image-registry-tls" not found Apr 16 18:11:41.521605 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.521543 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bptwk\" (UniqueName: \"kubernetes.io/projected/2eddf300-6693-4839-94d4-0403e9b6e8c2-kube-api-access-bptwk\") pod \"cluster-samples-operator-667775844f-7k4g5\" (UID: \"2eddf300-6693-4839-94d4-0403e9b6e8c2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7k4g5" Apr 16 18:11:41.521605 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:41.521552 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/de4c4df9-8036-4492-9274-9f47dc6c8180-registry-tls podName:de4c4df9-8036-4492-9274-9f47dc6c8180 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:42.02153853 +0000 UTC m=+61.223113778 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/de4c4df9-8036-4492-9274-9f47dc6c8180-registry-tls") pod "image-registry-75859bb697-t64g6" (UID: "de4c4df9-8036-4492-9274-9f47dc6c8180") : secret "image-registry-tls" not found Apr 16 18:11:41.521605 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.521587 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de4c4df9-8036-4492-9274-9f47dc6c8180-trusted-ca\") pod \"image-registry-75859bb697-t64g6\" (UID: \"de4c4df9-8036-4492-9274-9f47dc6c8180\") " pod="openshift-image-registry/image-registry-75859bb697-t64g6" Apr 16 18:11:41.521937 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.521619 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d34f403-81ae-4142-98c8-5c0168280de0-serving-cert\") pod \"insights-operator-5785d4fcdd-f6kdh\" (UID: \"8d34f403-81ae-4142-98c8-5c0168280de0\") " pod="openshift-insights/insights-operator-5785d4fcdd-f6kdh" Apr 16 18:11:41.521937 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.521655 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/de4c4df9-8036-4492-9274-9f47dc6c8180-ca-trust-extracted\") pod \"image-registry-75859bb697-t64g6\" (UID: \"de4c4df9-8036-4492-9274-9f47dc6c8180\") " pod="openshift-image-registry/image-registry-75859bb697-t64g6" Apr 16 18:11:41.521937 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.521730 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8d34f403-81ae-4142-98c8-5c0168280de0-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-f6kdh\" (UID: \"8d34f403-81ae-4142-98c8-5c0168280de0\") " pod="openshift-insights/insights-operator-5785d4fcdd-f6kdh" Apr 16 
18:11:41.521937 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.521766 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/8d34f403-81ae-4142-98c8-5c0168280de0-snapshots\") pod \"insights-operator-5785d4fcdd-f6kdh\" (UID: \"8d34f403-81ae-4142-98c8-5c0168280de0\") " pod="openshift-insights/insights-operator-5785d4fcdd-f6kdh" Apr 16 18:11:41.521937 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.521786 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/de4c4df9-8036-4492-9274-9f47dc6c8180-registry-certificates\") pod \"image-registry-75859bb697-t64g6\" (UID: \"de4c4df9-8036-4492-9274-9f47dc6c8180\") " pod="openshift-image-registry/image-registry-75859bb697-t64g6" Apr 16 18:11:41.521937 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.521791 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8d34f403-81ae-4142-98c8-5c0168280de0-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-f6kdh\" (UID: \"8d34f403-81ae-4142-98c8-5c0168280de0\") " pod="openshift-insights/insights-operator-5785d4fcdd-f6kdh" Apr 16 18:11:41.521937 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.521845 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lctz5\" (UniqueName: \"kubernetes.io/projected/8d34f403-81ae-4142-98c8-5c0168280de0-kube-api-access-lctz5\") pod \"insights-operator-5785d4fcdd-f6kdh\" (UID: \"8d34f403-81ae-4142-98c8-5c0168280de0\") " pod="openshift-insights/insights-operator-5785d4fcdd-f6kdh" Apr 16 18:11:41.521937 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.521874 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/de4c4df9-8036-4492-9274-9f47dc6c8180-image-registry-private-configuration\") pod \"image-registry-75859bb697-t64g6\" (UID: \"de4c4df9-8036-4492-9274-9f47dc6c8180\") " pod="openshift-image-registry/image-registry-75859bb697-t64g6" Apr 16 18:11:41.522337 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.521951 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8d34f403-81ae-4142-98c8-5c0168280de0-tmp\") pod \"insights-operator-5785d4fcdd-f6kdh\" (UID: \"8d34f403-81ae-4142-98c8-5c0168280de0\") " pod="openshift-insights/insights-operator-5785d4fcdd-f6kdh" Apr 16 18:11:41.522337 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.522050 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/de4c4df9-8036-4492-9274-9f47dc6c8180-ca-trust-extracted\") pod \"image-registry-75859bb697-t64g6\" (UID: \"de4c4df9-8036-4492-9274-9f47dc6c8180\") " pod="openshift-image-registry/image-registry-75859bb697-t64g6" Apr 16 18:11:41.522566 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.522543 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8d34f403-81ae-4142-98c8-5c0168280de0-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-f6kdh\" (UID: \"8d34f403-81ae-4142-98c8-5c0168280de0\") " pod="openshift-insights/insights-operator-5785d4fcdd-f6kdh" Apr 16 18:11:41.522566 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.522558 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8d34f403-81ae-4142-98c8-5c0168280de0-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-f6kdh\" (UID: \"8d34f403-81ae-4142-98c8-5c0168280de0\") " pod="openshift-insights/insights-operator-5785d4fcdd-f6kdh" Apr 16 18:11:41.522726 ip-10-0-139-96 kubenswrapper[2576]: I0416 
18:11:41.522683 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/8d34f403-81ae-4142-98c8-5c0168280de0-snapshots\") pod \"insights-operator-5785d4fcdd-f6kdh\" (UID: \"8d34f403-81ae-4142-98c8-5c0168280de0\") " pod="openshift-insights/insights-operator-5785d4fcdd-f6kdh" Apr 16 18:11:41.522795 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.522776 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de4c4df9-8036-4492-9274-9f47dc6c8180-trusted-ca\") pod \"image-registry-75859bb697-t64g6\" (UID: \"de4c4df9-8036-4492-9274-9f47dc6c8180\") " pod="openshift-image-registry/image-registry-75859bb697-t64g6" Apr 16 18:11:41.523645 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.523627 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/de4c4df9-8036-4492-9274-9f47dc6c8180-installation-pull-secrets\") pod \"image-registry-75859bb697-t64g6\" (UID: \"de4c4df9-8036-4492-9274-9f47dc6c8180\") " pod="openshift-image-registry/image-registry-75859bb697-t64g6" Apr 16 18:11:41.523931 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.523916 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d34f403-81ae-4142-98c8-5c0168280de0-serving-cert\") pod \"insights-operator-5785d4fcdd-f6kdh\" (UID: \"8d34f403-81ae-4142-98c8-5c0168280de0\") " pod="openshift-insights/insights-operator-5785d4fcdd-f6kdh" Apr 16 18:11:41.525148 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.525126 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/de4c4df9-8036-4492-9274-9f47dc6c8180-image-registry-private-configuration\") pod \"image-registry-75859bb697-t64g6\" (UID: 
\"de4c4df9-8036-4492-9274-9f47dc6c8180\") " pod="openshift-image-registry/image-registry-75859bb697-t64g6" Apr 16 18:11:41.531048 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.530535 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bptwk\" (UniqueName: \"kubernetes.io/projected/2eddf300-6693-4839-94d4-0403e9b6e8c2-kube-api-access-bptwk\") pod \"cluster-samples-operator-667775844f-7k4g5\" (UID: \"2eddf300-6693-4839-94d4-0403e9b6e8c2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7k4g5" Apr 16 18:11:41.532060 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.531811 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lctz5\" (UniqueName: \"kubernetes.io/projected/8d34f403-81ae-4142-98c8-5c0168280de0-kube-api-access-lctz5\") pod \"insights-operator-5785d4fcdd-f6kdh\" (UID: \"8d34f403-81ae-4142-98c8-5c0168280de0\") " pod="openshift-insights/insights-operator-5785d4fcdd-f6kdh" Apr 16 18:11:41.532185 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.532164 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de4c4df9-8036-4492-9274-9f47dc6c8180-bound-sa-token\") pod \"image-registry-75859bb697-t64g6\" (UID: \"de4c4df9-8036-4492-9274-9f47dc6c8180\") " pod="openshift-image-registry/image-registry-75859bb697-t64g6" Apr 16 18:11:41.532404 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.532383 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5f5d\" (UniqueName: \"kubernetes.io/projected/de4c4df9-8036-4492-9274-9f47dc6c8180-kube-api-access-s5f5d\") pod \"image-registry-75859bb697-t64g6\" (UID: \"de4c4df9-8036-4492-9274-9f47dc6c8180\") " pod="openshift-image-registry/image-registry-75859bb697-t64g6" Apr 16 18:11:41.553590 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.553567 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-f6kdh"
Apr 16 18:11:41.599284 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.599123 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-w28dq"]
Apr 16 18:11:41.604806 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:11:41.604747 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11e9383c_4bf6_4c5c_9dec_a8f2b642aff1.slice/crio-5a4815b2ec93a6270bcefb739fc15675bf352eb5a9d115830fe98e1d347ab028 WatchSource:0}: Error finding container 5a4815b2ec93a6270bcefb739fc15675bf352eb5a9d115830fe98e1d347ab028: Status 404 returned error can't find the container with id 5a4815b2ec93a6270bcefb739fc15675bf352eb5a9d115830fe98e1d347ab028
Apr 16 18:11:41.662246 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:41.662219 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-f6kdh"]
Apr 16 18:11:41.665736 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:11:41.665712 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d34f403_81ae_4142_98c8_5c0168280de0.slice/crio-a02b61975e564d5d3cc9a0729a0d1424e242298c54812c59235ba2dc837234fd WatchSource:0}: Error finding container a02b61975e564d5d3cc9a0729a0d1424e242298c54812c59235ba2dc837234fd: Status 404 returned error can't find the container with id a02b61975e564d5d3cc9a0729a0d1424e242298c54812c59235ba2dc837234fd
Apr 16 18:11:42.025137 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:42.025056 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/273ddeef-93ac-489e-893f-a85a3c28bdb6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-j6fk5\" (UID: \"273ddeef-93ac-489e-893f-a85a3c28bdb6\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6fk5"
Apr 16 18:11:42.025137 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:42.025091 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2eddf300-6693-4839-94d4-0403e9b6e8c2-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-7k4g5\" (UID: \"2eddf300-6693-4839-94d4-0403e9b6e8c2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7k4g5"
Apr 16 18:11:42.025137 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:42.025110 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de4c4df9-8036-4492-9274-9f47dc6c8180-registry-tls\") pod \"image-registry-75859bb697-t64g6\" (UID: \"de4c4df9-8036-4492-9274-9f47dc6c8180\") " pod="openshift-image-registry/image-registry-75859bb697-t64g6"
Apr 16 18:11:42.025378 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:42.025228 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 18:11:42.025378 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:42.025252 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 18:11:42.025378 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:42.025232 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:11:42.025378 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:42.025295 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-75859bb697-t64g6: secret "image-registry-tls" not found
Apr 16 18:11:42.025378 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:42.025303 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2eddf300-6693-4839-94d4-0403e9b6e8c2-samples-operator-tls podName:2eddf300-6693-4839-94d4-0403e9b6e8c2 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:43.025282163 +0000 UTC m=+62.226857441 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2eddf300-6693-4839-94d4-0403e9b6e8c2-samples-operator-tls") pod "cluster-samples-operator-667775844f-7k4g5" (UID: "2eddf300-6693-4839-94d4-0403e9b6e8c2") : secret "samples-operator-tls" not found
Apr 16 18:11:42.025378 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:42.025322 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/de4c4df9-8036-4492-9274-9f47dc6c8180-registry-tls podName:de4c4df9-8036-4492-9274-9f47dc6c8180 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:43.02531095 +0000 UTC m=+62.226886196 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/de4c4df9-8036-4492-9274-9f47dc6c8180-registry-tls") pod "image-registry-75859bb697-t64g6" (UID: "de4c4df9-8036-4492-9274-9f47dc6c8180") : secret "image-registry-tls" not found
Apr 16 18:11:42.025378 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:42.025335 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/273ddeef-93ac-489e-893f-a85a3c28bdb6-cluster-monitoring-operator-tls podName:273ddeef-93ac-489e-893f-a85a3c28bdb6 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:43.025329389 +0000 UTC m=+62.226904635 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/273ddeef-93ac-489e-893f-a85a3c28bdb6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-j6fk5" (UID: "273ddeef-93ac-489e-893f-a85a3c28bdb6") : secret "cluster-monitoring-operator-tls" not found
Apr 16 18:11:42.495075 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:42.495048 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b6sfk"
Apr 16 18:11:42.543238 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:42.543180 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-f6kdh" event={"ID":"8d34f403-81ae-4142-98c8-5c0168280de0","Type":"ContainerStarted","Data":"a02b61975e564d5d3cc9a0729a0d1424e242298c54812c59235ba2dc837234fd"}
Apr 16 18:11:42.544296 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:42.544271 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-w28dq" event={"ID":"11e9383c-4bf6-4c5c-9dec-a8f2b642aff1","Type":"ContainerStarted","Data":"5a4815b2ec93a6270bcefb739fc15675bf352eb5a9d115830fe98e1d347ab028"}
Apr 16 18:11:43.035057 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:43.035008 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/273ddeef-93ac-489e-893f-a85a3c28bdb6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-j6fk5\" (UID: \"273ddeef-93ac-489e-893f-a85a3c28bdb6\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6fk5"
Apr 16 18:11:43.035224 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:43.035070 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2eddf300-6693-4839-94d4-0403e9b6e8c2-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-7k4g5\" (UID: \"2eddf300-6693-4839-94d4-0403e9b6e8c2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7k4g5"
Apr 16 18:11:43.035224 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:43.035175 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 18:11:43.035224 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:43.035202 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de4c4df9-8036-4492-9274-9f47dc6c8180-registry-tls\") pod \"image-registry-75859bb697-t64g6\" (UID: \"de4c4df9-8036-4492-9274-9f47dc6c8180\") " pod="openshift-image-registry/image-registry-75859bb697-t64g6"
Apr 16 18:11:43.035381 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:43.035234 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2eddf300-6693-4839-94d4-0403e9b6e8c2-samples-operator-tls podName:2eddf300-6693-4839-94d4-0403e9b6e8c2 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:45.035216033 +0000 UTC m=+64.236791301 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2eddf300-6693-4839-94d4-0403e9b6e8c2-samples-operator-tls") pod "cluster-samples-operator-667775844f-7k4g5" (UID: "2eddf300-6693-4839-94d4-0403e9b6e8c2") : secret "samples-operator-tls" not found
Apr 16 18:11:43.035381 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:43.035274 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:11:43.035381 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:43.035177 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 18:11:43.035381 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:43.035289 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-75859bb697-t64g6: secret "image-registry-tls" not found
Apr 16 18:11:43.035381 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:43.035331 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/de4c4df9-8036-4492-9274-9f47dc6c8180-registry-tls podName:de4c4df9-8036-4492-9274-9f47dc6c8180 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:45.035317422 +0000 UTC m=+64.236892668 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/de4c4df9-8036-4492-9274-9f47dc6c8180-registry-tls") pod "image-registry-75859bb697-t64g6" (UID: "de4c4df9-8036-4492-9274-9f47dc6c8180") : secret "image-registry-tls" not found
Apr 16 18:11:43.035381 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:43.035346 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/273ddeef-93ac-489e-893f-a85a3c28bdb6-cluster-monitoring-operator-tls podName:273ddeef-93ac-489e-893f-a85a3c28bdb6 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:45.03533816 +0000 UTC m=+64.236913406 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/273ddeef-93ac-489e-893f-a85a3c28bdb6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-j6fk5" (UID: "273ddeef-93ac-489e-893f-a85a3c28bdb6") : secret "cluster-monitoring-operator-tls" not found
Apr 16 18:11:44.550302 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:44.550265 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-f6kdh" event={"ID":"8d34f403-81ae-4142-98c8-5c0168280de0","Type":"ContainerStarted","Data":"538b5d437f09469af00f3443076e669e0947fce7befe48a2875de13038cd90d2"}
Apr 16 18:11:44.551796 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:44.551779 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-w28dq_11e9383c-4bf6-4c5c-9dec-a8f2b642aff1/console-operator/0.log"
Apr 16 18:11:44.551886 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:44.551812 2576 generic.go:358] "Generic (PLEG): container finished" podID="11e9383c-4bf6-4c5c-9dec-a8f2b642aff1" containerID="b1bae6c0cd8a1c8bc96dcedf576b269e744731234ebee9299cdf39eb8489be36" exitCode=255
Apr 16 18:11:44.551886 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:44.551845 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-w28dq" event={"ID":"11e9383c-4bf6-4c5c-9dec-a8f2b642aff1","Type":"ContainerDied","Data":"b1bae6c0cd8a1c8bc96dcedf576b269e744731234ebee9299cdf39eb8489be36"}
Apr 16 18:11:44.552052 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:44.552037 2576 scope.go:117] "RemoveContainer" containerID="b1bae6c0cd8a1c8bc96dcedf576b269e744731234ebee9299cdf39eb8489be36"
Apr 16 18:11:44.570114 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:44.570059 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-f6kdh" podStartSLOduration=1.2757208979999999 podStartE2EDuration="3.570039454s" podCreationTimestamp="2026-04-16 18:11:41 +0000 UTC" firstStartedPulling="2026-04-16 18:11:41.66747193 +0000 UTC m=+60.869047176" lastFinishedPulling="2026-04-16 18:11:43.961790478 +0000 UTC m=+63.163365732" observedRunningTime="2026-04-16 18:11:44.56952222 +0000 UTC m=+63.771097488" watchObservedRunningTime="2026-04-16 18:11:44.570039454 +0000 UTC m=+63.771614722"
Apr 16 18:11:45.051393 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:45.051364 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de4c4df9-8036-4492-9274-9f47dc6c8180-registry-tls\") pod \"image-registry-75859bb697-t64g6\" (UID: \"de4c4df9-8036-4492-9274-9f47dc6c8180\") " pod="openshift-image-registry/image-registry-75859bb697-t64g6"
Apr 16 18:11:45.051513 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:45.051469 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/273ddeef-93ac-489e-893f-a85a3c28bdb6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-j6fk5\" (UID: \"273ddeef-93ac-489e-893f-a85a3c28bdb6\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6fk5"
Apr 16 18:11:45.051513 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:45.051492 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2eddf300-6693-4839-94d4-0403e9b6e8c2-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-7k4g5\" (UID: \"2eddf300-6693-4839-94d4-0403e9b6e8c2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7k4g5"
Apr 16 18:11:45.051513 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:45.051505 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:11:45.051624 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:45.051522 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-75859bb697-t64g6: secret "image-registry-tls" not found
Apr 16 18:11:45.051624 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:45.051565 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/de4c4df9-8036-4492-9274-9f47dc6c8180-registry-tls podName:de4c4df9-8036-4492-9274-9f47dc6c8180 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:49.051551884 +0000 UTC m=+68.253127134 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/de4c4df9-8036-4492-9274-9f47dc6c8180-registry-tls") pod "image-registry-75859bb697-t64g6" (UID: "de4c4df9-8036-4492-9274-9f47dc6c8180") : secret "image-registry-tls" not found
Apr 16 18:11:45.051624 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:45.051582 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 18:11:45.051624 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:45.051584 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 18:11:45.051794 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:45.051645 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2eddf300-6693-4839-94d4-0403e9b6e8c2-samples-operator-tls podName:2eddf300-6693-4839-94d4-0403e9b6e8c2 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:49.051627939 +0000 UTC m=+68.253203202 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2eddf300-6693-4839-94d4-0403e9b6e8c2-samples-operator-tls") pod "cluster-samples-operator-667775844f-7k4g5" (UID: "2eddf300-6693-4839-94d4-0403e9b6e8c2") : secret "samples-operator-tls" not found
Apr 16 18:11:45.051794 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:45.051662 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/273ddeef-93ac-489e-893f-a85a3c28bdb6-cluster-monitoring-operator-tls podName:273ddeef-93ac-489e-893f-a85a3c28bdb6 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:49.051652916 +0000 UTC m=+68.253228166 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/273ddeef-93ac-489e-893f-a85a3c28bdb6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-j6fk5" (UID: "273ddeef-93ac-489e-893f-a85a3c28bdb6") : secret "cluster-monitoring-operator-tls" not found
Apr 16 18:11:45.555213 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:45.555188 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-w28dq_11e9383c-4bf6-4c5c-9dec-a8f2b642aff1/console-operator/1.log"
Apr 16 18:11:45.555567 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:45.555552 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-w28dq_11e9383c-4bf6-4c5c-9dec-a8f2b642aff1/console-operator/0.log"
Apr 16 18:11:45.555619 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:45.555587 2576 generic.go:358] "Generic (PLEG): container finished" podID="11e9383c-4bf6-4c5c-9dec-a8f2b642aff1" containerID="33d1e64732bcb42f1bae74a9319b875268f2676bf6543b29272e82c3e2934752" exitCode=255
Apr 16 18:11:45.555690 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:45.555670 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-w28dq" event={"ID":"11e9383c-4bf6-4c5c-9dec-a8f2b642aff1","Type":"ContainerDied","Data":"33d1e64732bcb42f1bae74a9319b875268f2676bf6543b29272e82c3e2934752"}
Apr 16 18:11:45.555750 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:45.555728 2576 scope.go:117] "RemoveContainer" containerID="b1bae6c0cd8a1c8bc96dcedf576b269e744731234ebee9299cdf39eb8489be36"
Apr 16 18:11:45.556044 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:45.556024 2576 scope.go:117] "RemoveContainer" containerID="33d1e64732bcb42f1bae74a9319b875268f2676bf6543b29272e82c3e2934752"
Apr 16 18:11:45.556191 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:45.556171 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-w28dq_openshift-console-operator(11e9383c-4bf6-4c5c-9dec-a8f2b642aff1)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-w28dq" podUID="11e9383c-4bf6-4c5c-9dec-a8f2b642aff1"
Apr 16 18:11:46.080193 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:46.080165 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-6ctrp"]
Apr 16 18:11:46.084147 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:46.084129 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-6ctrp"
Apr 16 18:11:46.086778 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:46.086758 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 16 18:11:46.087762 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:46.087742 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-2tr6p\""
Apr 16 18:11:46.087871 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:46.087746 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 16 18:11:46.094116 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:46.094095 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-6ctrp"]
Apr 16 18:11:46.159100 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:46.159076 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlhmc\" (UniqueName: \"kubernetes.io/projected/23b3ce03-bb68-4ac2-a1fe-04780069ad4d-kube-api-access-qlhmc\") pod \"migrator-64d4d94569-6ctrp\" (UID: \"23b3ce03-bb68-4ac2-a1fe-04780069ad4d\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-6ctrp"
Apr 16 18:11:46.259684 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:46.259660 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qlhmc\" (UniqueName: \"kubernetes.io/projected/23b3ce03-bb68-4ac2-a1fe-04780069ad4d-kube-api-access-qlhmc\") pod \"migrator-64d4d94569-6ctrp\" (UID: \"23b3ce03-bb68-4ac2-a1fe-04780069ad4d\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-6ctrp"
Apr 16 18:11:46.267429 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:46.267408 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlhmc\" (UniqueName: \"kubernetes.io/projected/23b3ce03-bb68-4ac2-a1fe-04780069ad4d-kube-api-access-qlhmc\") pod \"migrator-64d4d94569-6ctrp\" (UID: \"23b3ce03-bb68-4ac2-a1fe-04780069ad4d\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-6ctrp"
Apr 16 18:11:46.392969 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:46.392949 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-6ctrp"
Apr 16 18:11:46.502486 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:46.502458 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-6ctrp"]
Apr 16 18:11:46.505596 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:11:46.505570 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23b3ce03_bb68_4ac2_a1fe_04780069ad4d.slice/crio-0fe95a4268bf37f754ee75deeacdb2656c7add78ed3e3180a2dce616119aea0a WatchSource:0}: Error finding container 0fe95a4268bf37f754ee75deeacdb2656c7add78ed3e3180a2dce616119aea0a: Status 404 returned error can't find the container with id 0fe95a4268bf37f754ee75deeacdb2656c7add78ed3e3180a2dce616119aea0a
Apr 16 18:11:46.559091 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:46.559071 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-w28dq_11e9383c-4bf6-4c5c-9dec-a8f2b642aff1/console-operator/1.log"
Apr 16 18:11:46.559450 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:46.559436 2576 scope.go:117] "RemoveContainer" containerID="33d1e64732bcb42f1bae74a9319b875268f2676bf6543b29272e82c3e2934752"
Apr 16 18:11:46.562661 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:46.560020 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-w28dq_openshift-console-operator(11e9383c-4bf6-4c5c-9dec-a8f2b642aff1)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-w28dq" podUID="11e9383c-4bf6-4c5c-9dec-a8f2b642aff1"
Apr 16 18:11:46.563289 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:46.563258 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-6ctrp" event={"ID":"23b3ce03-bb68-4ac2-a1fe-04780069ad4d","Type":"ContainerStarted","Data":"0fe95a4268bf37f754ee75deeacdb2656c7add78ed3e3180a2dce616119aea0a"}
Apr 16 18:11:46.965801 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:46.965765 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/22eec1cf-b2b9-495f-9507-ee4b6c6a9204-metrics-certs\") pod \"network-metrics-daemon-fwqx5\" (UID: \"22eec1cf-b2b9-495f-9507-ee4b6c6a9204\") " pod="openshift-multus/network-metrics-daemon-fwqx5"
Apr 16 18:11:46.968462 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:46.968442 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 18:11:46.976484 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:46.976469 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 18:11:46.976556 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:46.976547 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22eec1cf-b2b9-495f-9507-ee4b6c6a9204-metrics-certs podName:22eec1cf-b2b9-495f-9507-ee4b6c6a9204 nodeName:}" failed. No retries permitted until 2026-04-16 18:12:50.976506727 +0000 UTC m=+130.178081972 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/22eec1cf-b2b9-495f-9507-ee4b6c6a9204-metrics-certs") pod "network-metrics-daemon-fwqx5" (UID: "22eec1cf-b2b9-495f-9507-ee4b6c6a9204") : secret "metrics-daemon-secret" not found
Apr 16 18:11:47.167460 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:47.167403 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfwhj\" (UniqueName: \"kubernetes.io/projected/5ad84774-79a9-4253-9451-f7e900a7cb4d-kube-api-access-cfwhj\") pod \"network-check-target-r2j4b\" (UID: \"5ad84774-79a9-4253-9451-f7e900a7cb4d\") " pod="openshift-network-diagnostics/network-check-target-r2j4b"
Apr 16 18:11:47.170733 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:47.170712 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 18:11:47.180552 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:47.180530 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 18:11:47.190457 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:47.190434 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfwhj\" (UniqueName: \"kubernetes.io/projected/5ad84774-79a9-4253-9451-f7e900a7cb4d-kube-api-access-cfwhj\") pod \"network-check-target-r2j4b\" (UID: \"5ad84774-79a9-4253-9451-f7e900a7cb4d\") " pod="openshift-network-diagnostics/network-check-target-r2j4b"
Apr 16 18:11:47.338407 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:47.338341 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-rbrbw\""
Apr 16 18:11:47.346527 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:47.346509 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r2j4b"
Apr 16 18:11:47.382494 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:47.382470 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mb8s7_fb4955c9-d4b5-4e21-a2ca-4d700832a59c/dns-node-resolver/0.log"
Apr 16 18:11:47.462965 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:47.462931 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-r2j4b"]
Apr 16 18:11:47.647320 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:11:47.647283 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ad84774_79a9_4253_9451_f7e900a7cb4d.slice/crio-ddb84fe5a6aa6157fe5aa5b041ba1d1f933d9e0a8db6f7c17fc0a29fc0b9e1d3 WatchSource:0}: Error finding container ddb84fe5a6aa6157fe5aa5b041ba1d1f933d9e0a8db6f7c17fc0a29fc0b9e1d3: Status 404 returned error can't find the container with id ddb84fe5a6aa6157fe5aa5b041ba1d1f933d9e0a8db6f7c17fc0a29fc0b9e1d3
Apr 16 18:11:48.476147 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:48.476096 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a54e43de-5a67-45f3-b403-4317caee2eca-metrics-tls\") pod \"dns-default-5mn52\" (UID: \"a54e43de-5a67-45f3-b403-4317caee2eca\") " pod="openshift-dns/dns-default-5mn52"
Apr 16 18:11:48.476322 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:48.476216 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:11:48.476322 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:48.476278 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a54e43de-5a67-45f3-b403-4317caee2eca-metrics-tls podName:a54e43de-5a67-45f3-b403-4317caee2eca nodeName:}" failed. No retries permitted until 2026-04-16 18:12:20.476259146 +0000 UTC m=+99.677834417 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a54e43de-5a67-45f3-b403-4317caee2eca-metrics-tls") pod "dns-default-5mn52" (UID: "a54e43de-5a67-45f3-b403-4317caee2eca") : secret "dns-default-metrics-tls" not found
Apr 16 18:11:48.568107 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:48.568065 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-r2j4b" event={"ID":"5ad84774-79a9-4253-9451-f7e900a7cb4d","Type":"ContainerStarted","Data":"ddb84fe5a6aa6157fe5aa5b041ba1d1f933d9e0a8db6f7c17fc0a29fc0b9e1d3"}
Apr 16 18:11:48.569817 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:48.569792 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-6ctrp" event={"ID":"23b3ce03-bb68-4ac2-a1fe-04780069ad4d","Type":"ContainerStarted","Data":"81222156d6eb4ae58c202e3f5f2cff5e7fba58533b77c3cf92faeccf7259a5e2"}
Apr 16 18:11:48.569955 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:48.569822 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-6ctrp" event={"ID":"23b3ce03-bb68-4ac2-a1fe-04780069ad4d","Type":"ContainerStarted","Data":"413d2d08b0840990931d285c9da098c80d087c0f89fa52a4c83850208d57fdfa"}
Apr 16 18:11:48.577314 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:48.577291 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31b8e533-ba32-44f3-b6db-5c9e368510c6-cert\") pod \"ingress-canary-clw7f\" (UID: \"31b8e533-ba32-44f3-b6db-5c9e368510c6\") " pod="openshift-ingress-canary/ingress-canary-clw7f"
Apr 16 18:11:48.577472 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:48.577449 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:11:48.577576 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:48.577518 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31b8e533-ba32-44f3-b6db-5c9e368510c6-cert podName:31b8e533-ba32-44f3-b6db-5c9e368510c6 nodeName:}" failed. No retries permitted until 2026-04-16 18:12:20.577499896 +0000 UTC m=+99.779075142 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/31b8e533-ba32-44f3-b6db-5c9e368510c6-cert") pod "ingress-canary-clw7f" (UID: "31b8e533-ba32-44f3-b6db-5c9e368510c6") : secret "canary-serving-cert" not found
Apr 16 18:11:48.586212 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:48.586189 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jmzcb_76ccc7ff-6855-49c3-a0b5-185487ae8516/node-ca/0.log"
Apr 16 18:11:48.597117 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:48.597081 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-6ctrp" podStartSLOduration=1.414021147 podStartE2EDuration="2.597068627s" podCreationTimestamp="2026-04-16 18:11:46 +0000 UTC" firstStartedPulling="2026-04-16 18:11:46.507880312 +0000 UTC m=+65.709455558" lastFinishedPulling="2026-04-16 18:11:47.690927778 +0000 UTC m=+66.892503038" observedRunningTime="2026-04-16 18:11:48.596387808 +0000 UTC m=+67.797963076" watchObservedRunningTime="2026-04-16 18:11:48.597068627 +0000 UTC m=+67.798643894"
Apr 16 18:11:49.081433 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:49.081383 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2eddf300-6693-4839-94d4-0403e9b6e8c2-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-7k4g5\" (UID: \"2eddf300-6693-4839-94d4-0403e9b6e8c2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7k4g5"
Apr 16 18:11:49.081865 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:49.081475 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/273ddeef-93ac-489e-893f-a85a3c28bdb6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-j6fk5\" (UID: \"273ddeef-93ac-489e-893f-a85a3c28bdb6\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6fk5"
Apr 16 18:11:49.081865 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:49.081552 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 18:11:49.081865 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:49.081576 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 18:11:49.081865 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:49.081625 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2eddf300-6693-4839-94d4-0403e9b6e8c2-samples-operator-tls podName:2eddf300-6693-4839-94d4-0403e9b6e8c2 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:57.08160691 +0000 UTC m=+76.283182159 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2eddf300-6693-4839-94d4-0403e9b6e8c2-samples-operator-tls") pod "cluster-samples-operator-667775844f-7k4g5" (UID: "2eddf300-6693-4839-94d4-0403e9b6e8c2") : secret "samples-operator-tls" not found
Apr 16 18:11:49.081865 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:49.081645 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/273ddeef-93ac-489e-893f-a85a3c28bdb6-cluster-monitoring-operator-tls podName:273ddeef-93ac-489e-893f-a85a3c28bdb6 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:57.081634637 +0000 UTC m=+76.283209883 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/273ddeef-93ac-489e-893f-a85a3c28bdb6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-j6fk5" (UID: "273ddeef-93ac-489e-893f-a85a3c28bdb6") : secret "cluster-monitoring-operator-tls" not found
Apr 16 18:11:49.081865 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:49.081667 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de4c4df9-8036-4492-9274-9f47dc6c8180-registry-tls\") pod \"image-registry-75859bb697-t64g6\" (UID: \"de4c4df9-8036-4492-9274-9f47dc6c8180\") " pod="openshift-image-registry/image-registry-75859bb697-t64g6"
Apr 16 18:11:49.081865 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:49.081783 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:11:49.081865 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:49.081797 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-75859bb697-t64g6: secret "image-registry-tls" not found
Apr 16 18:11:49.081865 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:49.081839 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/de4c4df9-8036-4492-9274-9f47dc6c8180-registry-tls podName:de4c4df9-8036-4492-9274-9f47dc6c8180 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:57.081824237 +0000 UTC m=+76.283399497 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/de4c4df9-8036-4492-9274-9f47dc6c8180-registry-tls") pod "image-registry-75859bb697-t64g6" (UID: "de4c4df9-8036-4492-9274-9f47dc6c8180") : secret "image-registry-tls" not found
Apr 16 18:11:50.575375 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:50.575291 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-r2j4b" event={"ID":"5ad84774-79a9-4253-9451-f7e900a7cb4d","Type":"ContainerStarted","Data":"49a5c47e28a9bf84119c777cc50fdc63259043941e495f10d43a8e5a4c329337"}
Apr 16 18:11:50.575745 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:50.575406 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-r2j4b"
Apr 16 18:11:50.593049 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:50.592999 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-r2j4b" podStartSLOduration=67.011288998 podStartE2EDuration="1m9.592983323s" podCreationTimestamp="2026-04-16 18:10:41 +0000 UTC" firstStartedPulling="2026-04-16 18:11:47.649199067 +0000 UTC m=+66.850774316" lastFinishedPulling="2026-04-16 18:11:50.230893395 +0000 UTC m=+69.432468641" observedRunningTime="2026-04-16 18:11:50.592806471 +0000 UTC m=+69.794381768" watchObservedRunningTime="2026-04-16 18:11:50.592983323 +0000 UTC m=+69.794558593"
Apr 16 18:11:51.478777 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:51.478747 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-w28dq"
Apr 16 18:11:51.478914 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:51.478787 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-w28dq"
Apr 16 18:11:51.479086 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:51.479074 2576 scope.go:117] "RemoveContainer" containerID="33d1e64732bcb42f1bae74a9319b875268f2676bf6543b29272e82c3e2934752"
Apr 16 18:11:51.479233 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:11:51.479217 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-w28dq_openshift-console-operator(11e9383c-4bf6-4c5c-9dec-a8f2b642aff1)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-w28dq" podUID="11e9383c-4bf6-4c5c-9dec-a8f2b642aff1"
Apr 16 18:11:57.142907 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:57.142869 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2eddf300-6693-4839-94d4-0403e9b6e8c2-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-7k4g5\" (UID: \"2eddf300-6693-4839-94d4-0403e9b6e8c2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7k4g5"
Apr 16 18:11:57.143276 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:57.142946 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/273ddeef-93ac-489e-893f-a85a3c28bdb6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-j6fk5\" (UID: \"273ddeef-93ac-489e-893f-a85a3c28bdb6\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6fk5"
Apr 16 18:11:57.143276 ip-10-0-139-96 kubenswrapper[2576]: I0416 
18:11:57.142970 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de4c4df9-8036-4492-9274-9f47dc6c8180-registry-tls\") pod \"image-registry-75859bb697-t64g6\" (UID: \"de4c4df9-8036-4492-9274-9f47dc6c8180\") " pod="openshift-image-registry/image-registry-75859bb697-t64g6" Apr 16 18:11:57.145394 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:57.145364 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de4c4df9-8036-4492-9274-9f47dc6c8180-registry-tls\") pod \"image-registry-75859bb697-t64g6\" (UID: \"de4c4df9-8036-4492-9274-9f47dc6c8180\") " pod="openshift-image-registry/image-registry-75859bb697-t64g6" Apr 16 18:11:57.145527 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:57.145507 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/273ddeef-93ac-489e-893f-a85a3c28bdb6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-j6fk5\" (UID: \"273ddeef-93ac-489e-893f-a85a3c28bdb6\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6fk5" Apr 16 18:11:57.145776 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:57.145757 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2eddf300-6693-4839-94d4-0403e9b6e8c2-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-7k4g5\" (UID: \"2eddf300-6693-4839-94d4-0403e9b6e8c2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7k4g5" Apr 16 18:11:57.160190 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:57.160169 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-75859bb697-t64g6" Apr 16 18:11:57.289218 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:57.289192 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-75859bb697-t64g6"] Apr 16 18:11:57.292071 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:11:57.292041 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde4c4df9_8036_4492_9274_9f47dc6c8180.slice/crio-ade4050f2d1b6c049daaa281d0cf31fecb6f7fbabe792bf3ed9780d6be8f0c70 WatchSource:0}: Error finding container ade4050f2d1b6c049daaa281d0cf31fecb6f7fbabe792bf3ed9780d6be8f0c70: Status 404 returned error can't find the container with id ade4050f2d1b6c049daaa281d0cf31fecb6f7fbabe792bf3ed9780d6be8f0c70 Apr 16 18:11:57.347331 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:57.347312 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-47hvk\"" Apr 16 18:11:57.355177 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:57.355156 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6fk5" Apr 16 18:11:57.445280 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:57.445253 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7k4g5" Apr 16 18:11:57.467092 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:57.467039 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-j6fk5"] Apr 16 18:11:57.470589 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:11:57.470550 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod273ddeef_93ac_489e_893f_a85a3c28bdb6.slice/crio-706763f1e2097ac2a6479ea48f44edccfbaf6e25a2f3aa63893c7e18b81edf31 WatchSource:0}: Error finding container 706763f1e2097ac2a6479ea48f44edccfbaf6e25a2f3aa63893c7e18b81edf31: Status 404 returned error can't find the container with id 706763f1e2097ac2a6479ea48f44edccfbaf6e25a2f3aa63893c7e18b81edf31 Apr 16 18:11:57.557594 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:57.557560 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7k4g5"] Apr 16 18:11:57.592810 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:57.592779 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6fk5" event={"ID":"273ddeef-93ac-489e-893f-a85a3c28bdb6","Type":"ContainerStarted","Data":"706763f1e2097ac2a6479ea48f44edccfbaf6e25a2f3aa63893c7e18b81edf31"} Apr 16 18:11:57.594065 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:57.594044 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-75859bb697-t64g6" event={"ID":"de4c4df9-8036-4492-9274-9f47dc6c8180","Type":"ContainerStarted","Data":"d690e145291a29d60726f30e42ef2de324f2271ede4fb552a4ef3d77446b6ad0"} Apr 16 18:11:57.594166 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:57.594070 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-75859bb697-t64g6" event={"ID":"de4c4df9-8036-4492-9274-9f47dc6c8180","Type":"ContainerStarted","Data":"ade4050f2d1b6c049daaa281d0cf31fecb6f7fbabe792bf3ed9780d6be8f0c70"} Apr 16 18:11:57.594211 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:57.594188 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-75859bb697-t64g6" Apr 16 18:11:57.614513 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:57.614464 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-75859bb697-t64g6" podStartSLOduration=16.61444827 podStartE2EDuration="16.61444827s" podCreationTimestamp="2026-04-16 18:11:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:11:57.613797529 +0000 UTC m=+76.815372798" watchObservedRunningTime="2026-04-16 18:11:57.61444827 +0000 UTC m=+76.816023539" Apr 16 18:11:58.597976 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:58.597934 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7k4g5" event={"ID":"2eddf300-6693-4839-94d4-0403e9b6e8c2","Type":"ContainerStarted","Data":"57036b2cb3c94e016333636bb4b1a7db40ce7c92630d9c19d865f159a5edcf06"} Apr 16 18:11:59.602095 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:59.602051 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6fk5" event={"ID":"273ddeef-93ac-489e-893f-a85a3c28bdb6","Type":"ContainerStarted","Data":"f085ed56c564e4e52ec9384671194a0e031e4455e43e2a5d2edd7df013a515ad"} Apr 16 18:11:59.603680 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:59.603657 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7k4g5" 
event={"ID":"2eddf300-6693-4839-94d4-0403e9b6e8c2","Type":"ContainerStarted","Data":"b6e5a84eaf436cf4a3ad8160180a1d63005e3f0cc07989ff7275e7167e068dd7"} Apr 16 18:11:59.603680 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:59.603682 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7k4g5" event={"ID":"2eddf300-6693-4839-94d4-0403e9b6e8c2","Type":"ContainerStarted","Data":"923b52d8b7fa82a10e92484a5d830a0568e89c2ff51245caaf9780f5b7c08e9a"} Apr 16 18:11:59.621219 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:59.621178 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6fk5" podStartSLOduration=16.691183698 podStartE2EDuration="18.621167968s" podCreationTimestamp="2026-04-16 18:11:41 +0000 UTC" firstStartedPulling="2026-04-16 18:11:57.472443423 +0000 UTC m=+76.674018669" lastFinishedPulling="2026-04-16 18:11:59.402427684 +0000 UTC m=+78.604002939" observedRunningTime="2026-04-16 18:11:59.619991363 +0000 UTC m=+78.821566630" watchObservedRunningTime="2026-04-16 18:11:59.621167968 +0000 UTC m=+78.822743236" Apr 16 18:11:59.635953 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:11:59.635914 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7k4g5" podStartSLOduration=16.828581802 podStartE2EDuration="18.635903595s" podCreationTimestamp="2026-04-16 18:11:41 +0000 UTC" firstStartedPulling="2026-04-16 18:11:57.598270595 +0000 UTC m=+76.799845853" lastFinishedPulling="2026-04-16 18:11:59.405592397 +0000 UTC m=+78.607167646" observedRunningTime="2026-04-16 18:11:59.635060677 +0000 UTC m=+78.836635966" watchObservedRunningTime="2026-04-16 18:11:59.635903595 +0000 UTC m=+78.837478863" Apr 16 18:12:03.321987 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:03.321961 2576 scope.go:117] "RemoveContainer" 
containerID="33d1e64732bcb42f1bae74a9319b875268f2676bf6543b29272e82c3e2934752" Apr 16 18:12:03.614157 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:03.614087 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-w28dq_11e9383c-4bf6-4c5c-9dec-a8f2b642aff1/console-operator/1.log" Apr 16 18:12:03.614293 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:03.614175 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-w28dq" event={"ID":"11e9383c-4bf6-4c5c-9dec-a8f2b642aff1","Type":"ContainerStarted","Data":"e6a346d5da1cb02be6edb3892a4a6645e1bd60406c0ce1986906a41f88d96689"} Apr 16 18:12:03.614439 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:03.614422 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-w28dq" Apr 16 18:12:03.632561 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:03.632521 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-d87b8d5fc-w28dq" podStartSLOduration=20.275615235 podStartE2EDuration="22.63250817s" podCreationTimestamp="2026-04-16 18:11:41 +0000 UTC" firstStartedPulling="2026-04-16 18:11:41.607280458 +0000 UTC m=+60.808855717" lastFinishedPulling="2026-04-16 18:11:43.964173392 +0000 UTC m=+63.165748652" observedRunningTime="2026-04-16 18:12:03.631749654 +0000 UTC m=+82.833324922" watchObservedRunningTime="2026-04-16 18:12:03.63250817 +0000 UTC m=+82.834083491" Apr 16 18:12:03.931209 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:03.931184 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-d87b8d5fc-w28dq" Apr 16 18:12:06.152800 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:06.152768 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-78gqq"] Apr 
16 18:12:06.157437 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:06.157416 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-hs56f"] Apr 16 18:12:06.157584 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:06.157566 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-78gqq" Apr 16 18:12:06.160205 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:06.160190 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-hs56f" Apr 16 18:12:06.171929 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:06.171914 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 18:12:06.172609 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:06.172593 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 18:12:06.173647 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:06.173633 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-7j8vd\"" Apr 16 18:12:06.173857 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:06.173841 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 16 18:12:06.174381 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:06.174369 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-8nnpz\"" Apr 16 18:12:06.209805 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:06.209785 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-lwh26\" (UniqueName: \"kubernetes.io/projected/c3a8c67b-0001-42d8-bb3f-f86472b945ff-kube-api-access-lwh26\") pod \"insights-runtime-extractor-78gqq\" (UID: \"c3a8c67b-0001-42d8-bb3f-f86472b945ff\") " pod="openshift-insights/insights-runtime-extractor-78gqq" Apr 16 18:12:06.209870 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:06.209813 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c3a8c67b-0001-42d8-bb3f-f86472b945ff-data-volume\") pod \"insights-runtime-extractor-78gqq\" (UID: \"c3a8c67b-0001-42d8-bb3f-f86472b945ff\") " pod="openshift-insights/insights-runtime-extractor-78gqq" Apr 16 18:12:06.209870 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:06.209835 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c3a8c67b-0001-42d8-bb3f-f86472b945ff-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-78gqq\" (UID: \"c3a8c67b-0001-42d8-bb3f-f86472b945ff\") " pod="openshift-insights/insights-runtime-extractor-78gqq" Apr 16 18:12:06.209870 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:06.209862 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/78aaf263-7020-40a9-b797-efba25fa39de-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-hs56f\" (UID: \"78aaf263-7020-40a9-b797-efba25fa39de\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-hs56f" Apr 16 18:12:06.209977 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:06.209895 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c3a8c67b-0001-42d8-bb3f-f86472b945ff-kube-rbac-proxy-cm\") pod 
\"insights-runtime-extractor-78gqq\" (UID: \"c3a8c67b-0001-42d8-bb3f-f86472b945ff\") " pod="openshift-insights/insights-runtime-extractor-78gqq" Apr 16 18:12:06.209977 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:06.209941 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c3a8c67b-0001-42d8-bb3f-f86472b945ff-crio-socket\") pod \"insights-runtime-extractor-78gqq\" (UID: \"c3a8c67b-0001-42d8-bb3f-f86472b945ff\") " pod="openshift-insights/insights-runtime-extractor-78gqq" Apr 16 18:12:06.220649 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:06.220617 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-75859bb697-t64g6"] Apr 16 18:12:06.221692 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:06.221666 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-78gqq"] Apr 16 18:12:06.268328 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:06.268308 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-hs56f"] Apr 16 18:12:06.310242 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:06.310222 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lwh26\" (UniqueName: \"kubernetes.io/projected/c3a8c67b-0001-42d8-bb3f-f86472b945ff-kube-api-access-lwh26\") pod \"insights-runtime-extractor-78gqq\" (UID: \"c3a8c67b-0001-42d8-bb3f-f86472b945ff\") " pod="openshift-insights/insights-runtime-extractor-78gqq" Apr 16 18:12:06.310349 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:06.310261 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c3a8c67b-0001-42d8-bb3f-f86472b945ff-data-volume\") pod \"insights-runtime-extractor-78gqq\" (UID: \"c3a8c67b-0001-42d8-bb3f-f86472b945ff\") " 
pod="openshift-insights/insights-runtime-extractor-78gqq" Apr 16 18:12:06.310349 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:06.310294 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c3a8c67b-0001-42d8-bb3f-f86472b945ff-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-78gqq\" (UID: \"c3a8c67b-0001-42d8-bb3f-f86472b945ff\") " pod="openshift-insights/insights-runtime-extractor-78gqq" Apr 16 18:12:06.310349 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:06.310327 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/78aaf263-7020-40a9-b797-efba25fa39de-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-hs56f\" (UID: \"78aaf263-7020-40a9-b797-efba25fa39de\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-hs56f" Apr 16 18:12:06.310495 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:06.310355 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c3a8c67b-0001-42d8-bb3f-f86472b945ff-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-78gqq\" (UID: \"c3a8c67b-0001-42d8-bb3f-f86472b945ff\") " pod="openshift-insights/insights-runtime-extractor-78gqq" Apr 16 18:12:06.310495 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:06.310427 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c3a8c67b-0001-42d8-bb3f-f86472b945ff-crio-socket\") pod \"insights-runtime-extractor-78gqq\" (UID: \"c3a8c67b-0001-42d8-bb3f-f86472b945ff\") " pod="openshift-insights/insights-runtime-extractor-78gqq" Apr 16 18:12:06.310592 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:06.310538 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" 
(UniqueName: \"kubernetes.io/host-path/c3a8c67b-0001-42d8-bb3f-f86472b945ff-crio-socket\") pod \"insights-runtime-extractor-78gqq\" (UID: \"c3a8c67b-0001-42d8-bb3f-f86472b945ff\") " pod="openshift-insights/insights-runtime-extractor-78gqq" Apr 16 18:12:06.310666 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:06.310647 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c3a8c67b-0001-42d8-bb3f-f86472b945ff-data-volume\") pod \"insights-runtime-extractor-78gqq\" (UID: \"c3a8c67b-0001-42d8-bb3f-f86472b945ff\") " pod="openshift-insights/insights-runtime-extractor-78gqq" Apr 16 18:12:06.310960 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:06.310939 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c3a8c67b-0001-42d8-bb3f-f86472b945ff-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-78gqq\" (UID: \"c3a8c67b-0001-42d8-bb3f-f86472b945ff\") " pod="openshift-insights/insights-runtime-extractor-78gqq" Apr 16 18:12:06.312597 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:06.312579 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/78aaf263-7020-40a9-b797-efba25fa39de-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-hs56f\" (UID: \"78aaf263-7020-40a9-b797-efba25fa39de\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-hs56f" Apr 16 18:12:06.312672 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:06.312581 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c3a8c67b-0001-42d8-bb3f-f86472b945ff-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-78gqq\" (UID: \"c3a8c67b-0001-42d8-bb3f-f86472b945ff\") " pod="openshift-insights/insights-runtime-extractor-78gqq" Apr 16 
18:12:06.326766 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:06.326743 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwh26\" (UniqueName: \"kubernetes.io/projected/c3a8c67b-0001-42d8-bb3f-f86472b945ff-kube-api-access-lwh26\") pod \"insights-runtime-extractor-78gqq\" (UID: \"c3a8c67b-0001-42d8-bb3f-f86472b945ff\") " pod="openshift-insights/insights-runtime-extractor-78gqq" Apr 16 18:12:06.467147 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:06.467097 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-78gqq" Apr 16 18:12:06.471152 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:06.471133 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-hs56f" Apr 16 18:12:06.598199 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:06.598169 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-78gqq"] Apr 16 18:12:06.601097 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:12:06.601069 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3a8c67b_0001_42d8_bb3f_f86472b945ff.slice/crio-764eb01a2acd9c992b0b6452ede1462475fd8bb19ef10bf3147086b9d88e3d5f WatchSource:0}: Error finding container 764eb01a2acd9c992b0b6452ede1462475fd8bb19ef10bf3147086b9d88e3d5f: Status 404 returned error can't find the container with id 764eb01a2acd9c992b0b6452ede1462475fd8bb19ef10bf3147086b9d88e3d5f Apr 16 18:12:06.616142 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:06.616121 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-hs56f"] Apr 16 18:12:06.620040 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:12:06.619955 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78aaf263_7020_40a9_b797_efba25fa39de.slice/crio-e04b51304cdfcfa58771086fd99452393acc528d861743b19ff71944c3a00430 WatchSource:0}: Error finding container e04b51304cdfcfa58771086fd99452393acc528d861743b19ff71944c3a00430: Status 404 returned error can't find the container with id e04b51304cdfcfa58771086fd99452393acc528d861743b19ff71944c3a00430 Apr 16 18:12:06.622459 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:06.622363 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-78gqq" event={"ID":"c3a8c67b-0001-42d8-bb3f-f86472b945ff","Type":"ContainerStarted","Data":"764eb01a2acd9c992b0b6452ede1462475fd8bb19ef10bf3147086b9d88e3d5f"} Apr 16 18:12:07.625434 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:07.625404 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-hs56f" event={"ID":"78aaf263-7020-40a9-b797-efba25fa39de","Type":"ContainerStarted","Data":"e04b51304cdfcfa58771086fd99452393acc528d861743b19ff71944c3a00430"} Apr 16 18:12:07.626576 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:07.626551 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-78gqq" event={"ID":"c3a8c67b-0001-42d8-bb3f-f86472b945ff","Type":"ContainerStarted","Data":"3dd6582919ad06a66bc0f189787323c399e339a8e0f33dfae63b445b1b792ee9"} Apr 16 18:12:08.631151 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:08.631113 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-78gqq" event={"ID":"c3a8c67b-0001-42d8-bb3f-f86472b945ff","Type":"ContainerStarted","Data":"25232810120d7ffca1e52846c101fa15e8bd6894562c34cd15696be2dafd8901"} Apr 16 18:12:08.632555 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:08.632526 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-hs56f" event={"ID":"78aaf263-7020-40a9-b797-efba25fa39de","Type":"ContainerStarted","Data":"7739d704476f01751817014e564e404c015f5f2cc5fd78fbf50d219de6045e3e"}
Apr 16 18:12:08.632764 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:08.632735 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-hs56f"
Apr 16 18:12:08.637722 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:08.637687 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-hs56f"
Apr 16 18:12:08.648240 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:08.648200 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-hs56f" podStartSLOduration=1.598550999 podStartE2EDuration="2.648185537s" podCreationTimestamp="2026-04-16 18:12:06 +0000 UTC" firstStartedPulling="2026-04-16 18:12:06.622038084 +0000 UTC m=+85.823613331" lastFinishedPulling="2026-04-16 18:12:07.671672622 +0000 UTC m=+86.873247869" observedRunningTime="2026-04-16 18:12:08.648016378 +0000 UTC m=+87.849591670" watchObservedRunningTime="2026-04-16 18:12:08.648185537 +0000 UTC m=+87.849760803"
Apr 16 18:12:09.016434 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:09.016412 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-mgj2v"]
Apr 16 18:12:09.019413 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:09.019397 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-mgj2v"
Apr 16 18:12:09.021898 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:09.021876 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 16 18:12:09.021997 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:09.021959 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-vk6lh\""
Apr 16 18:12:09.023281 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:09.023263 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 16 18:12:09.023373 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:09.023302 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 18:12:09.028191 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:09.028172 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-mgj2v"]
Apr 16 18:12:09.128985 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:09.128958 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/bd8e1e3f-f92a-4c3d-a28b-78213622057d-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-mgj2v\" (UID: \"bd8e1e3f-f92a-4c3d-a28b-78213622057d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-mgj2v"
Apr 16 18:12:09.128985 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:09.128991 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bd8e1e3f-f92a-4c3d-a28b-78213622057d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-mgj2v\" (UID: \"bd8e1e3f-f92a-4c3d-a28b-78213622057d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-mgj2v"
Apr 16 18:12:09.129145 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:09.129043 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmssx\" (UniqueName: \"kubernetes.io/projected/bd8e1e3f-f92a-4c3d-a28b-78213622057d-kube-api-access-bmssx\") pod \"prometheus-operator-78f957474d-mgj2v\" (UID: \"bd8e1e3f-f92a-4c3d-a28b-78213622057d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-mgj2v"
Apr 16 18:12:09.129145 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:09.129079 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bd8e1e3f-f92a-4c3d-a28b-78213622057d-metrics-client-ca\") pod \"prometheus-operator-78f957474d-mgj2v\" (UID: \"bd8e1e3f-f92a-4c3d-a28b-78213622057d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-mgj2v"
Apr 16 18:12:09.230371 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:09.230291 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/bd8e1e3f-f92a-4c3d-a28b-78213622057d-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-mgj2v\" (UID: \"bd8e1e3f-f92a-4c3d-a28b-78213622057d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-mgj2v"
Apr 16 18:12:09.230371 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:09.230330 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bd8e1e3f-f92a-4c3d-a28b-78213622057d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-mgj2v\" (UID: \"bd8e1e3f-f92a-4c3d-a28b-78213622057d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-mgj2v"
Apr 16 18:12:09.230529 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:09.230389 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bmssx\" (UniqueName: \"kubernetes.io/projected/bd8e1e3f-f92a-4c3d-a28b-78213622057d-kube-api-access-bmssx\") pod \"prometheus-operator-78f957474d-mgj2v\" (UID: \"bd8e1e3f-f92a-4c3d-a28b-78213622057d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-mgj2v"
Apr 16 18:12:09.230529 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:12:09.230435 2576 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 16 18:12:09.230529 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:09.230444 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bd8e1e3f-f92a-4c3d-a28b-78213622057d-metrics-client-ca\") pod \"prometheus-operator-78f957474d-mgj2v\" (UID: \"bd8e1e3f-f92a-4c3d-a28b-78213622057d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-mgj2v"
Apr 16 18:12:09.230529 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:12:09.230499 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd8e1e3f-f92a-4c3d-a28b-78213622057d-prometheus-operator-tls podName:bd8e1e3f-f92a-4c3d-a28b-78213622057d nodeName:}" failed. No retries permitted until 2026-04-16 18:12:09.730479699 +0000 UTC m=+88.932054959 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/bd8e1e3f-f92a-4c3d-a28b-78213622057d-prometheus-operator-tls") pod "prometheus-operator-78f957474d-mgj2v" (UID: "bd8e1e3f-f92a-4c3d-a28b-78213622057d") : secret "prometheus-operator-tls" not found
Apr 16 18:12:09.231188 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:09.231168 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bd8e1e3f-f92a-4c3d-a28b-78213622057d-metrics-client-ca\") pod \"prometheus-operator-78f957474d-mgj2v\" (UID: \"bd8e1e3f-f92a-4c3d-a28b-78213622057d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-mgj2v"
Apr 16 18:12:09.232660 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:09.232641 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bd8e1e3f-f92a-4c3d-a28b-78213622057d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-mgj2v\" (UID: \"bd8e1e3f-f92a-4c3d-a28b-78213622057d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-mgj2v"
Apr 16 18:12:09.239084 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:09.239061 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmssx\" (UniqueName: \"kubernetes.io/projected/bd8e1e3f-f92a-4c3d-a28b-78213622057d-kube-api-access-bmssx\") pod \"prometheus-operator-78f957474d-mgj2v\" (UID: \"bd8e1e3f-f92a-4c3d-a28b-78213622057d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-mgj2v"
Apr 16 18:12:09.637289 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:09.637256 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-78gqq" event={"ID":"c3a8c67b-0001-42d8-bb3f-f86472b945ff","Type":"ContainerStarted","Data":"e12e1185ad72e401a6cf0016e747b97915e1e348bd581e1bc3fcebfa63192ef0"}
Apr 16 18:12:09.658845 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:09.658797 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-78gqq" podStartSLOduration=1.415352087 podStartE2EDuration="3.658778017s" podCreationTimestamp="2026-04-16 18:12:06 +0000 UTC" firstStartedPulling="2026-04-16 18:12:06.685869258 +0000 UTC m=+85.887444504" lastFinishedPulling="2026-04-16 18:12:08.929295185 +0000 UTC m=+88.130870434" observedRunningTime="2026-04-16 18:12:09.65856821 +0000 UTC m=+88.860143480" watchObservedRunningTime="2026-04-16 18:12:09.658778017 +0000 UTC m=+88.860353286"
Apr 16 18:12:09.734175 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:09.734146 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/bd8e1e3f-f92a-4c3d-a28b-78213622057d-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-mgj2v\" (UID: \"bd8e1e3f-f92a-4c3d-a28b-78213622057d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-mgj2v"
Apr 16 18:12:09.736362 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:09.736340 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/bd8e1e3f-f92a-4c3d-a28b-78213622057d-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-mgj2v\" (UID: \"bd8e1e3f-f92a-4c3d-a28b-78213622057d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-mgj2v"
Apr 16 18:12:09.927989 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:09.927927 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-mgj2v"
Apr 16 18:12:10.041842 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:10.041810 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-mgj2v"]
Apr 16 18:12:10.045678 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:12:10.045654 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd8e1e3f_f92a_4c3d_a28b_78213622057d.slice/crio-858a423126cd2259914d0ff38ad795dfa45c90a62845f1159b99b3421f32f759 WatchSource:0}: Error finding container 858a423126cd2259914d0ff38ad795dfa45c90a62845f1159b99b3421f32f759: Status 404 returned error can't find the container with id 858a423126cd2259914d0ff38ad795dfa45c90a62845f1159b99b3421f32f759
Apr 16 18:12:10.642072 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:10.642033 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-mgj2v" event={"ID":"bd8e1e3f-f92a-4c3d-a28b-78213622057d","Type":"ContainerStarted","Data":"858a423126cd2259914d0ff38ad795dfa45c90a62845f1159b99b3421f32f759"}
Apr 16 18:12:11.646405 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:11.646362 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-mgj2v" event={"ID":"bd8e1e3f-f92a-4c3d-a28b-78213622057d","Type":"ContainerStarted","Data":"258c1285fda8d6ca264cbb0b0a5c40e715ca4cf29f32e1f04d88fe5b7806128d"}
Apr 16 18:12:11.646405 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:11.646403 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-mgj2v" event={"ID":"bd8e1e3f-f92a-4c3d-a28b-78213622057d","Type":"ContainerStarted","Data":"c8c88f14bf8a239eea7b7fbe8fea9a5221bdd06ea80926ed7d452464635726d9"}
Apr 16 18:12:11.665592 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:11.665529 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-78f957474d-mgj2v" podStartSLOduration=2.254539233 podStartE2EDuration="3.665513482s" podCreationTimestamp="2026-04-16 18:12:08 +0000 UTC" firstStartedPulling="2026-04-16 18:12:10.047453156 +0000 UTC m=+89.249028403" lastFinishedPulling="2026-04-16 18:12:11.458427396 +0000 UTC m=+90.660002652" observedRunningTime="2026-04-16 18:12:11.664231456 +0000 UTC m=+90.865806725" watchObservedRunningTime="2026-04-16 18:12:11.665513482 +0000 UTC m=+90.867088764"
Apr 16 18:12:13.395908 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:13.395880 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-59zm6"]
Apr 16 18:12:13.399247 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:13.399228 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-59zm6"
Apr 16 18:12:13.401985 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:13.401954 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 18:12:13.402082 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:13.402003 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 18:12:13.402224 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:13.402208 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 18:12:13.402280 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:13.402233 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-gwlxm\""
Apr 16 18:12:13.462433 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:13.462409 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/50df0bc0-8825-4d04-bc78-cf3f1ee8887e-root\") pod \"node-exporter-59zm6\" (UID: \"50df0bc0-8825-4d04-bc78-cf3f1ee8887e\") " pod="openshift-monitoring/node-exporter-59zm6"
Apr 16 18:12:13.462433 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:13.462435 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/50df0bc0-8825-4d04-bc78-cf3f1ee8887e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-59zm6\" (UID: \"50df0bc0-8825-4d04-bc78-cf3f1ee8887e\") " pod="openshift-monitoring/node-exporter-59zm6"
Apr 16 18:12:13.462598 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:13.462452 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk984\" (UniqueName: \"kubernetes.io/projected/50df0bc0-8825-4d04-bc78-cf3f1ee8887e-kube-api-access-zk984\") pod \"node-exporter-59zm6\" (UID: \"50df0bc0-8825-4d04-bc78-cf3f1ee8887e\") " pod="openshift-monitoring/node-exporter-59zm6"
Apr 16 18:12:13.462598 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:13.462470 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/50df0bc0-8825-4d04-bc78-cf3f1ee8887e-node-exporter-tls\") pod \"node-exporter-59zm6\" (UID: \"50df0bc0-8825-4d04-bc78-cf3f1ee8887e\") " pod="openshift-monitoring/node-exporter-59zm6"
Apr 16 18:12:13.462598 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:13.462488 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/50df0bc0-8825-4d04-bc78-cf3f1ee8887e-sys\") pod \"node-exporter-59zm6\" (UID: \"50df0bc0-8825-4d04-bc78-cf3f1ee8887e\") " pod="openshift-monitoring/node-exporter-59zm6"
Apr 16 18:12:13.462794 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:13.462608 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/50df0bc0-8825-4d04-bc78-cf3f1ee8887e-node-exporter-textfile\") pod \"node-exporter-59zm6\" (UID: \"50df0bc0-8825-4d04-bc78-cf3f1ee8887e\") " pod="openshift-monitoring/node-exporter-59zm6"
Apr 16 18:12:13.462794 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:13.462657 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/50df0bc0-8825-4d04-bc78-cf3f1ee8887e-node-exporter-wtmp\") pod \"node-exporter-59zm6\" (UID: \"50df0bc0-8825-4d04-bc78-cf3f1ee8887e\") " pod="openshift-monitoring/node-exporter-59zm6"
Apr 16 18:12:13.462794 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:13.462692 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/50df0bc0-8825-4d04-bc78-cf3f1ee8887e-metrics-client-ca\") pod \"node-exporter-59zm6\" (UID: \"50df0bc0-8825-4d04-bc78-cf3f1ee8887e\") " pod="openshift-monitoring/node-exporter-59zm6"
Apr 16 18:12:13.462794 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:13.462745 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/50df0bc0-8825-4d04-bc78-cf3f1ee8887e-node-exporter-accelerators-collector-config\") pod \"node-exporter-59zm6\" (UID: \"50df0bc0-8825-4d04-bc78-cf3f1ee8887e\") " pod="openshift-monitoring/node-exporter-59zm6"
Apr 16 18:12:13.563758 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:13.563727 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/50df0bc0-8825-4d04-bc78-cf3f1ee8887e-root\") pod \"node-exporter-59zm6\" (UID: \"50df0bc0-8825-4d04-bc78-cf3f1ee8887e\") " pod="openshift-monitoring/node-exporter-59zm6"
Apr 16 18:12:13.563758 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:13.563758 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/50df0bc0-8825-4d04-bc78-cf3f1ee8887e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-59zm6\" (UID: \"50df0bc0-8825-4d04-bc78-cf3f1ee8887e\") " pod="openshift-monitoring/node-exporter-59zm6"
Apr 16 18:12:13.563958 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:13.563782 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zk984\" (UniqueName: \"kubernetes.io/projected/50df0bc0-8825-4d04-bc78-cf3f1ee8887e-kube-api-access-zk984\") pod \"node-exporter-59zm6\" (UID: \"50df0bc0-8825-4d04-bc78-cf3f1ee8887e\") " pod="openshift-monitoring/node-exporter-59zm6"
Apr 16 18:12:13.563958 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:13.563800 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/50df0bc0-8825-4d04-bc78-cf3f1ee8887e-node-exporter-tls\") pod \"node-exporter-59zm6\" (UID: \"50df0bc0-8825-4d04-bc78-cf3f1ee8887e\") " pod="openshift-monitoring/node-exporter-59zm6"
Apr 16 18:12:13.563958 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:13.563828 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/50df0bc0-8825-4d04-bc78-cf3f1ee8887e-sys\") pod \"node-exporter-59zm6\" (UID: \"50df0bc0-8825-4d04-bc78-cf3f1ee8887e\") " pod="openshift-monitoring/node-exporter-59zm6"
Apr 16 18:12:13.563958 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:13.563832 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/50df0bc0-8825-4d04-bc78-cf3f1ee8887e-root\") pod \"node-exporter-59zm6\" (UID: \"50df0bc0-8825-4d04-bc78-cf3f1ee8887e\") " pod="openshift-monitoring/node-exporter-59zm6"
Apr 16 18:12:13.563958 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:13.563858 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/50df0bc0-8825-4d04-bc78-cf3f1ee8887e-node-exporter-textfile\") pod \"node-exporter-59zm6\" (UID: \"50df0bc0-8825-4d04-bc78-cf3f1ee8887e\") " pod="openshift-monitoring/node-exporter-59zm6"
Apr 16 18:12:13.563958 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:13.563904 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/50df0bc0-8825-4d04-bc78-cf3f1ee8887e-node-exporter-wtmp\") pod \"node-exporter-59zm6\" (UID: \"50df0bc0-8825-4d04-bc78-cf3f1ee8887e\") " pod="openshift-monitoring/node-exporter-59zm6"
Apr 16 18:12:13.563958 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:13.563943 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/50df0bc0-8825-4d04-bc78-cf3f1ee8887e-metrics-client-ca\") pod \"node-exporter-59zm6\" (UID: \"50df0bc0-8825-4d04-bc78-cf3f1ee8887e\") " pod="openshift-monitoring/node-exporter-59zm6"
Apr 16 18:12:13.564232 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:12:13.563968 2576 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 16 18:12:13.564232 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:13.563974 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/50df0bc0-8825-4d04-bc78-cf3f1ee8887e-node-exporter-accelerators-collector-config\") pod \"node-exporter-59zm6\" (UID: \"50df0bc0-8825-4d04-bc78-cf3f1ee8887e\") " pod="openshift-monitoring/node-exporter-59zm6"
Apr 16 18:12:13.564232 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:13.563944 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/50df0bc0-8825-4d04-bc78-cf3f1ee8887e-sys\") pod \"node-exporter-59zm6\" (UID: \"50df0bc0-8825-4d04-bc78-cf3f1ee8887e\") " pod="openshift-monitoring/node-exporter-59zm6"
Apr 16 18:12:13.564232 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:12:13.564067 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50df0bc0-8825-4d04-bc78-cf3f1ee8887e-node-exporter-tls podName:50df0bc0-8825-4d04-bc78-cf3f1ee8887e nodeName:}" failed. No retries permitted until 2026-04-16 18:12:14.06404906 +0000 UTC m=+93.265624311 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/50df0bc0-8825-4d04-bc78-cf3f1ee8887e-node-exporter-tls") pod "node-exporter-59zm6" (UID: "50df0bc0-8825-4d04-bc78-cf3f1ee8887e") : secret "node-exporter-tls" not found
Apr 16 18:12:13.564232 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:13.564069 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/50df0bc0-8825-4d04-bc78-cf3f1ee8887e-node-exporter-wtmp\") pod \"node-exporter-59zm6\" (UID: \"50df0bc0-8825-4d04-bc78-cf3f1ee8887e\") " pod="openshift-monitoring/node-exporter-59zm6"
Apr 16 18:12:13.564393 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:13.564252 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/50df0bc0-8825-4d04-bc78-cf3f1ee8887e-node-exporter-textfile\") pod \"node-exporter-59zm6\" (UID: \"50df0bc0-8825-4d04-bc78-cf3f1ee8887e\") " pod="openshift-monitoring/node-exporter-59zm6"
Apr 16 18:12:13.564590 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:13.564568 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/50df0bc0-8825-4d04-bc78-cf3f1ee8887e-node-exporter-accelerators-collector-config\") pod \"node-exporter-59zm6\" (UID: \"50df0bc0-8825-4d04-bc78-cf3f1ee8887e\") " pod="openshift-monitoring/node-exporter-59zm6"
Apr 16 18:12:13.564623 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:13.564568 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/50df0bc0-8825-4d04-bc78-cf3f1ee8887e-metrics-client-ca\") pod \"node-exporter-59zm6\" (UID: \"50df0bc0-8825-4d04-bc78-cf3f1ee8887e\") " pod="openshift-monitoring/node-exporter-59zm6"
Apr 16 18:12:13.565952 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:13.565931 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/50df0bc0-8825-4d04-bc78-cf3f1ee8887e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-59zm6\" (UID: \"50df0bc0-8825-4d04-bc78-cf3f1ee8887e\") " pod="openshift-monitoring/node-exporter-59zm6"
Apr 16 18:12:13.572116 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:13.572096 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk984\" (UniqueName: \"kubernetes.io/projected/50df0bc0-8825-4d04-bc78-cf3f1ee8887e-kube-api-access-zk984\") pod \"node-exporter-59zm6\" (UID: \"50df0bc0-8825-4d04-bc78-cf3f1ee8887e\") " pod="openshift-monitoring/node-exporter-59zm6"
Apr 16 18:12:14.068780 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.068745 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/50df0bc0-8825-4d04-bc78-cf3f1ee8887e-node-exporter-tls\") pod \"node-exporter-59zm6\" (UID: \"50df0bc0-8825-4d04-bc78-cf3f1ee8887e\") " pod="openshift-monitoring/node-exporter-59zm6"
Apr 16 18:12:14.071010 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.070984 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/50df0bc0-8825-4d04-bc78-cf3f1ee8887e-node-exporter-tls\") pod \"node-exporter-59zm6\" (UID: \"50df0bc0-8825-4d04-bc78-cf3f1ee8887e\") " pod="openshift-monitoring/node-exporter-59zm6"
Apr 16 18:12:14.307833 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.307797 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-59zm6"
Apr 16 18:12:14.316421 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:12:14.316397 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50df0bc0_8825_4d04_bc78_cf3f1ee8887e.slice/crio-3b7d1253614319136a1fb7258cfca9f70bbd7f18cdc70927fd261a4ecc632d86 WatchSource:0}: Error finding container 3b7d1253614319136a1fb7258cfca9f70bbd7f18cdc70927fd261a4ecc632d86: Status 404 returned error can't find the container with id 3b7d1253614319136a1fb7258cfca9f70bbd7f18cdc70927fd261a4ecc632d86
Apr 16 18:12:14.445043 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.445011 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 18:12:14.449810 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.449789 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:14.453067 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.452379 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 16 18:12:14.453067 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.452393 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 16 18:12:14.453067 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.452636 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-w2b9x\""
Apr 16 18:12:14.453067 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.452712 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 16 18:12:14.453067 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.452637 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 16 18:12:14.453067 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.453025 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 16 18:12:14.453797 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.453628 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 16 18:12:14.453797 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.453679 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 16 18:12:14.453970 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.453797 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 16 18:12:14.453970 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.453797 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 16 18:12:14.462064 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.462043 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 18:12:14.572446 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.572363 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:14.572446 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.572427 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:14.572610 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.572457 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/36564dfe-56d0-4531-9269-dd5a9bc94fa5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:14.572610 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.572484 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-web-config\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:14.572610 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.572509 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/36564dfe-56d0-4531-9269-dd5a9bc94fa5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:14.572610 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.572539 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:14.572610 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.572566 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-config-volume\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:14.572610 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.572590 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36564dfe-56d0-4531-9269-dd5a9bc94fa5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:14.572885 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.572621 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/36564dfe-56d0-4531-9269-dd5a9bc94fa5-config-out\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:14.572885 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.572734 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/36564dfe-56d0-4531-9269-dd5a9bc94fa5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:14.572885 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.572781 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:14.572885 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.572838 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:14.572885 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.572863 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfnxf\" (UniqueName: \"kubernetes.io/projected/36564dfe-56d0-4531-9269-dd5a9bc94fa5-kube-api-access-hfnxf\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:14.656640 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.656593 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-59zm6" event={"ID":"50df0bc0-8825-4d04-bc78-cf3f1ee8887e","Type":"ContainerStarted","Data":"3b7d1253614319136a1fb7258cfca9f70bbd7f18cdc70927fd261a4ecc632d86"}
Apr 16 18:12:14.674235 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.674202 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:14.674399 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.674247 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/36564dfe-56d0-4531-9269-dd5a9bc94fa5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:14.674399 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.674280 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-web-config\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:14.674399 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.674302 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/36564dfe-56d0-4531-9269-dd5a9bc94fa5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:14.674399 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.674338 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:14.674399 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.674366 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-config-volume\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:14.674399 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.674388 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36564dfe-56d0-4531-9269-dd5a9bc94fa5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:14.674683 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.674421 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/36564dfe-56d0-4531-9269-dd5a9bc94fa5-config-out\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:14.674683 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.674478 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume
\"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/36564dfe-56d0-4531-9269-dd5a9bc94fa5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:14.674683 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.674510 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:14.674683 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.674556 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:14.674683 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.674586 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hfnxf\" (UniqueName: \"kubernetes.io/projected/36564dfe-56d0-4531-9269-dd5a9bc94fa5-kube-api-access-hfnxf\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:14.674683 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.674646 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 
18:12:14.675713 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.675227 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/36564dfe-56d0-4531-9269-dd5a9bc94fa5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:14.675713 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:12:14.675316 2576 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 16 18:12:14.675713 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:12:14.675387 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-secret-alertmanager-main-tls podName:36564dfe-56d0-4531-9269-dd5a9bc94fa5 nodeName:}" failed. No retries permitted until 2026-04-16 18:12:15.175367768 +0000 UTC m=+94.376943019 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "36564dfe-56d0-4531-9269-dd5a9bc94fa5") : secret "alertmanager-main-tls" not found Apr 16 18:12:14.677443 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.677418 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:14.677554 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.677470 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:14.677845 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.677675 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:14.677845 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.677760 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/36564dfe-56d0-4531-9269-dd5a9bc94fa5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:14.677845 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:12:14.677818 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/36564dfe-56d0-4531-9269-dd5a9bc94fa5-alertmanager-trusted-ca-bundle podName:36564dfe-56d0-4531-9269-dd5a9bc94fa5 nodeName:}" failed. No retries permitted until 2026-04-16 18:12:15.177794379 +0000 UTC m=+94.379369626 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/36564dfe-56d0-4531-9269-dd5a9bc94fa5-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "36564dfe-56d0-4531-9269-dd5a9bc94fa5") : configmap references non-existent config key: ca-bundle.crt Apr 16 18:12:14.678370 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.678330 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/36564dfe-56d0-4531-9269-dd5a9bc94fa5-config-out\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:14.678863 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.678815 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/36564dfe-56d0-4531-9269-dd5a9bc94fa5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:14.678954 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.678902 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:14.680106 ip-10-0-139-96 
kubenswrapper[2576]: I0416 18:12:14.680083 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-config-volume\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:14.680229 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.680211 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-web-config\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:14.686498 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:14.686473 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfnxf\" (UniqueName: \"kubernetes.io/projected/36564dfe-56d0-4531-9269-dd5a9bc94fa5-kube-api-access-hfnxf\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:15.179076 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.179043 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:15.179192 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.179114 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36564dfe-56d0-4531-9269-dd5a9bc94fa5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 
18:12:15.180469 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.180444 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36564dfe-56d0-4531-9269-dd5a9bc94fa5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:15.181092 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.181072 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:15.344981 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.344911 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-66dc987d6f-rjrtv"] Apr 16 18:12:15.348407 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.348392 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-66dc987d6f-rjrtv" Apr 16 18:12:15.351177 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.351152 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-7cianos8mmhk1\"" Apr 16 18:12:15.351295 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.351259 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 16 18:12:15.351364 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.351290 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 16 18:12:15.351364 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.351259 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 16 18:12:15.351485 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.351472 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-5p7m5\"" Apr 16 18:12:15.351581 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.351564 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 16 18:12:15.352220 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.352204 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 16 18:12:15.358500 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.358480 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-66dc987d6f-rjrtv"] Apr 16 18:12:15.360721 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.360690 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:15.481679 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.481646 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/26885632-9180-4f43-8bf2-d905b1c6e357-secret-thanos-querier-tls\") pod \"thanos-querier-66dc987d6f-rjrtv\" (UID: \"26885632-9180-4f43-8bf2-d905b1c6e357\") " pod="openshift-monitoring/thanos-querier-66dc987d6f-rjrtv" Apr 16 18:12:15.482080 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.481783 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/26885632-9180-4f43-8bf2-d905b1c6e357-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-66dc987d6f-rjrtv\" (UID: \"26885632-9180-4f43-8bf2-d905b1c6e357\") " pod="openshift-monitoring/thanos-querier-66dc987d6f-rjrtv" Apr 16 18:12:15.482080 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.481820 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/26885632-9180-4f43-8bf2-d905b1c6e357-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-66dc987d6f-rjrtv\" (UID: \"26885632-9180-4f43-8bf2-d905b1c6e357\") " pod="openshift-monitoring/thanos-querier-66dc987d6f-rjrtv" Apr 16 18:12:15.482080 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.481859 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqhhd\" (UniqueName: \"kubernetes.io/projected/26885632-9180-4f43-8bf2-d905b1c6e357-kube-api-access-xqhhd\") pod \"thanos-querier-66dc987d6f-rjrtv\" (UID: \"26885632-9180-4f43-8bf2-d905b1c6e357\") " pod="openshift-monitoring/thanos-querier-66dc987d6f-rjrtv" Apr 16 18:12:15.482080 
ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.481903 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/26885632-9180-4f43-8bf2-d905b1c6e357-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-66dc987d6f-rjrtv\" (UID: \"26885632-9180-4f43-8bf2-d905b1c6e357\") " pod="openshift-monitoring/thanos-querier-66dc987d6f-rjrtv" Apr 16 18:12:15.482080 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.481939 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/26885632-9180-4f43-8bf2-d905b1c6e357-secret-grpc-tls\") pod \"thanos-querier-66dc987d6f-rjrtv\" (UID: \"26885632-9180-4f43-8bf2-d905b1c6e357\") " pod="openshift-monitoring/thanos-querier-66dc987d6f-rjrtv" Apr 16 18:12:15.482080 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.481993 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/26885632-9180-4f43-8bf2-d905b1c6e357-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-66dc987d6f-rjrtv\" (UID: \"26885632-9180-4f43-8bf2-d905b1c6e357\") " pod="openshift-monitoring/thanos-querier-66dc987d6f-rjrtv" Apr 16 18:12:15.482080 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.482034 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/26885632-9180-4f43-8bf2-d905b1c6e357-metrics-client-ca\") pod \"thanos-querier-66dc987d6f-rjrtv\" (UID: \"26885632-9180-4f43-8bf2-d905b1c6e357\") " pod="openshift-monitoring/thanos-querier-66dc987d6f-rjrtv" Apr 16 18:12:15.483743 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.483722 2576 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:12:15.487267 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:12:15.487238 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36564dfe_56d0_4531_9269_dd5a9bc94fa5.slice/crio-fbd354a53d3a86c020652d299f0e0666784df044fe2810336058a234f4cbe723 WatchSource:0}: Error finding container fbd354a53d3a86c020652d299f0e0666784df044fe2810336058a234f4cbe723: Status 404 returned error can't find the container with id fbd354a53d3a86c020652d299f0e0666784df044fe2810336058a234f4cbe723 Apr 16 18:12:15.582407 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.582383 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/26885632-9180-4f43-8bf2-d905b1c6e357-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-66dc987d6f-rjrtv\" (UID: \"26885632-9180-4f43-8bf2-d905b1c6e357\") " pod="openshift-monitoring/thanos-querier-66dc987d6f-rjrtv" Apr 16 18:12:15.582511 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.582431 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/26885632-9180-4f43-8bf2-d905b1c6e357-secret-grpc-tls\") pod \"thanos-querier-66dc987d6f-rjrtv\" (UID: \"26885632-9180-4f43-8bf2-d905b1c6e357\") " pod="openshift-monitoring/thanos-querier-66dc987d6f-rjrtv" Apr 16 18:12:15.582511 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.582472 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/26885632-9180-4f43-8bf2-d905b1c6e357-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-66dc987d6f-rjrtv\" (UID: \"26885632-9180-4f43-8bf2-d905b1c6e357\") " 
pod="openshift-monitoring/thanos-querier-66dc987d6f-rjrtv" Apr 16 18:12:15.582511 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.582502 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/26885632-9180-4f43-8bf2-d905b1c6e357-metrics-client-ca\") pod \"thanos-querier-66dc987d6f-rjrtv\" (UID: \"26885632-9180-4f43-8bf2-d905b1c6e357\") " pod="openshift-monitoring/thanos-querier-66dc987d6f-rjrtv" Apr 16 18:12:15.582666 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.582547 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/26885632-9180-4f43-8bf2-d905b1c6e357-secret-thanos-querier-tls\") pod \"thanos-querier-66dc987d6f-rjrtv\" (UID: \"26885632-9180-4f43-8bf2-d905b1c6e357\") " pod="openshift-monitoring/thanos-querier-66dc987d6f-rjrtv" Apr 16 18:12:15.582666 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.582574 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/26885632-9180-4f43-8bf2-d905b1c6e357-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-66dc987d6f-rjrtv\" (UID: \"26885632-9180-4f43-8bf2-d905b1c6e357\") " pod="openshift-monitoring/thanos-querier-66dc987d6f-rjrtv" Apr 16 18:12:15.582666 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.582590 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/26885632-9180-4f43-8bf2-d905b1c6e357-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-66dc987d6f-rjrtv\" (UID: \"26885632-9180-4f43-8bf2-d905b1c6e357\") " pod="openshift-monitoring/thanos-querier-66dc987d6f-rjrtv" Apr 16 18:12:15.582666 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.582612 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xqhhd\" (UniqueName: \"kubernetes.io/projected/26885632-9180-4f43-8bf2-d905b1c6e357-kube-api-access-xqhhd\") pod \"thanos-querier-66dc987d6f-rjrtv\" (UID: \"26885632-9180-4f43-8bf2-d905b1c6e357\") " pod="openshift-monitoring/thanos-querier-66dc987d6f-rjrtv" Apr 16 18:12:15.583315 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.583266 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/26885632-9180-4f43-8bf2-d905b1c6e357-metrics-client-ca\") pod \"thanos-querier-66dc987d6f-rjrtv\" (UID: \"26885632-9180-4f43-8bf2-d905b1c6e357\") " pod="openshift-monitoring/thanos-querier-66dc987d6f-rjrtv" Apr 16 18:12:15.584929 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.584897 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/26885632-9180-4f43-8bf2-d905b1c6e357-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-66dc987d6f-rjrtv\" (UID: \"26885632-9180-4f43-8bf2-d905b1c6e357\") " pod="openshift-monitoring/thanos-querier-66dc987d6f-rjrtv" Apr 16 18:12:15.585092 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.585073 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/26885632-9180-4f43-8bf2-d905b1c6e357-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-66dc987d6f-rjrtv\" (UID: \"26885632-9180-4f43-8bf2-d905b1c6e357\") " pod="openshift-monitoring/thanos-querier-66dc987d6f-rjrtv" Apr 16 18:12:15.585300 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.585282 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: 
\"kubernetes.io/secret/26885632-9180-4f43-8bf2-d905b1c6e357-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-66dc987d6f-rjrtv\" (UID: \"26885632-9180-4f43-8bf2-d905b1c6e357\") " pod="openshift-monitoring/thanos-querier-66dc987d6f-rjrtv" Apr 16 18:12:15.585361 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.585326 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/26885632-9180-4f43-8bf2-d905b1c6e357-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-66dc987d6f-rjrtv\" (UID: \"26885632-9180-4f43-8bf2-d905b1c6e357\") " pod="openshift-monitoring/thanos-querier-66dc987d6f-rjrtv" Apr 16 18:12:15.585495 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.585475 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/26885632-9180-4f43-8bf2-d905b1c6e357-secret-grpc-tls\") pod \"thanos-querier-66dc987d6f-rjrtv\" (UID: \"26885632-9180-4f43-8bf2-d905b1c6e357\") " pod="openshift-monitoring/thanos-querier-66dc987d6f-rjrtv" Apr 16 18:12:15.585528 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.585490 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/26885632-9180-4f43-8bf2-d905b1c6e357-secret-thanos-querier-tls\") pod \"thanos-querier-66dc987d6f-rjrtv\" (UID: \"26885632-9180-4f43-8bf2-d905b1c6e357\") " pod="openshift-monitoring/thanos-querier-66dc987d6f-rjrtv" Apr 16 18:12:15.590922 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.590905 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqhhd\" (UniqueName: \"kubernetes.io/projected/26885632-9180-4f43-8bf2-d905b1c6e357-kube-api-access-xqhhd\") pod \"thanos-querier-66dc987d6f-rjrtv\" (UID: \"26885632-9180-4f43-8bf2-d905b1c6e357\") " 
pod="openshift-monitoring/thanos-querier-66dc987d6f-rjrtv" Apr 16 18:12:15.657741 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.657722 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-66dc987d6f-rjrtv" Apr 16 18:12:15.660032 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.660006 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"36564dfe-56d0-4531-9269-dd5a9bc94fa5","Type":"ContainerStarted","Data":"fbd354a53d3a86c020652d299f0e0666784df044fe2810336058a234f4cbe723"} Apr 16 18:12:15.661474 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.661451 2576 generic.go:358] "Generic (PLEG): container finished" podID="50df0bc0-8825-4d04-bc78-cf3f1ee8887e" containerID="ceb88c809721578e7cb99454a4f4b3f2cac077e9f77a41ca562bb5abeb82d1a6" exitCode=0 Apr 16 18:12:15.661565 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.661484 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-59zm6" event={"ID":"50df0bc0-8825-4d04-bc78-cf3f1ee8887e","Type":"ContainerDied","Data":"ceb88c809721578e7cb99454a4f4b3f2cac077e9f77a41ca562bb5abeb82d1a6"} Apr 16 18:12:15.785107 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:15.785081 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-66dc987d6f-rjrtv"] Apr 16 18:12:15.787756 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:12:15.787730 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26885632_9180_4f43_8bf2_d905b1c6e357.slice/crio-318e4dd66855c4e04ef4a11f7a2ce41f0f0a8aa8546d39330c5e1cba0649d079 WatchSource:0}: Error finding container 318e4dd66855c4e04ef4a11f7a2ce41f0f0a8aa8546d39330c5e1cba0649d079: Status 404 returned error can't find the container with id 318e4dd66855c4e04ef4a11f7a2ce41f0f0a8aa8546d39330c5e1cba0649d079 Apr 16 18:12:16.227114 
ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:16.227067 2576 patch_prober.go:28] interesting pod/image-registry-75859bb697-t64g6 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 18:12:16.227288 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:16.227140 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-75859bb697-t64g6" podUID="de4c4df9-8036-4492-9274-9f47dc6c8180" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:12:16.666814 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:16.666768 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-59zm6" event={"ID":"50df0bc0-8825-4d04-bc78-cf3f1ee8887e","Type":"ContainerStarted","Data":"165d5de44dbd032e3d70b954271d09180c58814177cbffa3e72e9c7479ddae2f"} Apr 16 18:12:16.666814 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:16.666810 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-59zm6" event={"ID":"50df0bc0-8825-4d04-bc78-cf3f1ee8887e","Type":"ContainerStarted","Data":"f02d371628491ca1a698d24c47f30fa2605912f7f913b1454c0a14211637b36b"} Apr 16 18:12:16.668154 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:16.668129 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-66dc987d6f-rjrtv" event={"ID":"26885632-9180-4f43-8bf2-d905b1c6e357","Type":"ContainerStarted","Data":"318e4dd66855c4e04ef4a11f7a2ce41f0f0a8aa8546d39330c5e1cba0649d079"} Apr 16 18:12:16.669571 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:16.669538 2576 generic.go:358] "Generic (PLEG): container finished" podID="36564dfe-56d0-4531-9269-dd5a9bc94fa5" 
containerID="d27de8332b6c8f8f0cbbc36c0a88f70ee2dd21a630d58cc0876eb5b473c493ce" exitCode=0 Apr 16 18:12:16.669716 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:16.669585 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"36564dfe-56d0-4531-9269-dd5a9bc94fa5","Type":"ContainerDied","Data":"d27de8332b6c8f8f0cbbc36c0a88f70ee2dd21a630d58cc0876eb5b473c493ce"} Apr 16 18:12:16.687024 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:16.686988 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-59zm6" podStartSLOduration=3.031411376 podStartE2EDuration="3.686977553s" podCreationTimestamp="2026-04-16 18:12:13 +0000 UTC" firstStartedPulling="2026-04-16 18:12:14.317982586 +0000 UTC m=+93.519557832" lastFinishedPulling="2026-04-16 18:12:14.973548758 +0000 UTC m=+94.175124009" observedRunningTime="2026-04-16 18:12:16.686153382 +0000 UTC m=+95.887728650" watchObservedRunningTime="2026-04-16 18:12:16.686977553 +0000 UTC m=+95.888552887" Apr 16 18:12:18.159874 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:18.159839 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-dclv7"] Apr 16 18:12:18.185766 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:18.185730 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-dclv7"] Apr 16 18:12:18.185919 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:18.185780 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-dclv7" Apr 16 18:12:18.188451 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:18.188419 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 16 18:12:18.188591 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:18.188463 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-4w54b\"" Apr 16 18:12:18.311191 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:18.311155 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f3fc3a09-3d9e-4d57-89b8-557e15fa868c-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-dclv7\" (UID: \"f3fc3a09-3d9e-4d57-89b8-557e15fa868c\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-dclv7" Apr 16 18:12:18.411932 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:18.411856 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f3fc3a09-3d9e-4d57-89b8-557e15fa868c-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-dclv7\" (UID: \"f3fc3a09-3d9e-4d57-89b8-557e15fa868c\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-dclv7" Apr 16 18:12:18.414589 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:18.414548 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f3fc3a09-3d9e-4d57-89b8-557e15fa868c-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-dclv7\" (UID: \"f3fc3a09-3d9e-4d57-89b8-557e15fa868c\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-dclv7" Apr 16 18:12:18.496499 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:18.496471 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-dclv7" Apr 16 18:12:18.716847 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:18.716824 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-dclv7"] Apr 16 18:12:18.720023 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:12:18.719999 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3fc3a09_3d9e_4d57_89b8_557e15fa868c.slice/crio-60784eaf184413c6a6932b04e5ea4381ed264ee91ad9cad0a833f408081483a6 WatchSource:0}: Error finding container 60784eaf184413c6a6932b04e5ea4381ed264ee91ad9cad0a833f408081483a6: Status 404 returned error can't find the container with id 60784eaf184413c6a6932b04e5ea4381ed264ee91ad9cad0a833f408081483a6 Apr 16 18:12:19.600095 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.599998 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:12:19.605796 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.604633 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.607619 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.607594 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 18:12:19.607769 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.607594 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 18:12:19.607975 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.607952 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 18:12:19.609011 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.608992 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-8lhrgmnf08hm\"" Apr 16 18:12:19.609180 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.609159 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 18:12:19.609733 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.609319 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 18:12:19.609733 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.609414 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 18:12:19.609733 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.609535 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-84wx4\"" Apr 16 18:12:19.609733 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.609540 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 18:12:19.609733 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.609575 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 18:12:19.609733 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.609638 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 18:12:19.610197 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.610128 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 18:12:19.610262 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.610231 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 18:12:19.610323 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.610294 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 18:12:19.612059 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.611866 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 18:12:19.617355 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.617333 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:12:19.687039 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.687000 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"36564dfe-56d0-4531-9269-dd5a9bc94fa5","Type":"ContainerStarted","Data":"02ed20d1781a32826efc4cbb802274701e6c14b03c4514c3562dbd27ee3a806b"} Apr 16 18:12:19.687039 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.687041 2576 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"36564dfe-56d0-4531-9269-dd5a9bc94fa5","Type":"ContainerStarted","Data":"157121dd1a374a613817305a55ced989caecb8e0d1edf483f05d35681e85ffcd"} Apr 16 18:12:19.687259 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.687055 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"36564dfe-56d0-4531-9269-dd5a9bc94fa5","Type":"ContainerStarted","Data":"8ca555ea4d864bd01b7f3edb50c4f78133747af4c6f06b52cea7cfb89ab8db60"} Apr 16 18:12:19.687259 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.687067 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"36564dfe-56d0-4531-9269-dd5a9bc94fa5","Type":"ContainerStarted","Data":"339e7bbe65d2fd665818b9262134f1004fa78fd3ac20f1cc2d65885be82e80db"} Apr 16 18:12:19.687259 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.687078 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"36564dfe-56d0-4531-9269-dd5a9bc94fa5","Type":"ContainerStarted","Data":"d39148d2241fd5bc3140d746b01d4de56eeca89d6432869f24f95fb2b613a9a8"} Apr 16 18:12:19.688511 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.688483 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-dclv7" event={"ID":"f3fc3a09-3d9e-4d57-89b8-557e15fa868c","Type":"ContainerStarted","Data":"60784eaf184413c6a6932b04e5ea4381ed264ee91ad9cad0a833f408081483a6"} Apr 16 18:12:19.691742 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.691717 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-66dc987d6f-rjrtv" event={"ID":"26885632-9180-4f43-8bf2-d905b1c6e357","Type":"ContainerStarted","Data":"d9db81668ec63b4f630341ee8f2eaa55e295b263a2a5881956751dc8a438f496"} Apr 16 18:12:19.691858 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.691752 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-66dc987d6f-rjrtv" event={"ID":"26885632-9180-4f43-8bf2-d905b1c6e357","Type":"ContainerStarted","Data":"7d23973d9c948962065b58948c6797f28556347e3ad2d2b6d065e53a43f203ec"} Apr 16 18:12:19.691858 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.691766 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-66dc987d6f-rjrtv" event={"ID":"26885632-9180-4f43-8bf2-d905b1c6e357","Type":"ContainerStarted","Data":"ff25812c2be8c12c951eef836686178eb939479a6578b10ae27a33ea9a0f3618"} Apr 16 18:12:19.725274 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.725163 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.725274 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.725195 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.725274 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.725225 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.725513 ip-10-0-139-96 kubenswrapper[2576]: I0416 
18:12:19.725308 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.725513 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.725353 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-config\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.725513 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.725400 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.725513 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.725469 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7x28\" (UniqueName: \"kubernetes.io/projected/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-kube-api-access-d7x28\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.725738 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.725562 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: 
\"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.725738 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.725616 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-web-config\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.725738 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.725662 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.725738 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.725691 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.725738 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.725730 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.725973 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.725792 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.725973 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.725831 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.725973 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.725914 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-config-out\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.725973 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.725946 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.726152 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.725972 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 
18:12:19.726152 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.726034 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.826996 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.826954 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-config\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.827167 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.827023 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.827167 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.827054 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d7x28\" (UniqueName: \"kubernetes.io/projected/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-kube-api-access-d7x28\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.827167 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.827082 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.827167 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.827119 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-web-config\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.827167 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.827166 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.827399 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.827194 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.827399 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.827220 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.827399 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.827251 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-secret-metrics-client-certs\") pod 
\"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.827399 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.827275 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.827399 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.827325 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-config-out\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.827399 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.827352 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.827399 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.827378 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.827767 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.827440 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.827767 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.827466 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.827767 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.827472 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.827767 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.827535 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.827767 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.827570 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.827767 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.827636 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.830531 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.829087 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.832913 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.831935 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.832913 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.832582 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.839286 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.839070 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 
18:12:19.839625 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.839598 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.839813 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.839792 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.840610 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.840568 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.844085 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.843831 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-web-config\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.844183 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.844098 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 
18:12:19.844241 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.844222 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.844825 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.844579 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.844825 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.844658 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.844825 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.844759 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-config\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.844825 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.844783 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") 
" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.846039 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.846014 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.846186 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.846163 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7x28\" (UniqueName: \"kubernetes.io/projected/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-kube-api-access-d7x28\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.847356 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.847337 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-config-out\") pod \"prometheus-k8s-0\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:19.922254 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:19.921798 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:20.364296 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:20.364270 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:12:20.367293 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:12:20.367273 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb252c920_4abd_4d7c_bd5e_d74dcbcd643a.slice/crio-17000dd6c1a3c9aee9ec804ba604b3c652aec03d16c7035b1a6d4b8fc406247f WatchSource:0}: Error finding container 17000dd6c1a3c9aee9ec804ba604b3c652aec03d16c7035b1a6d4b8fc406247f: Status 404 returned error can't find the container with id 17000dd6c1a3c9aee9ec804ba604b3c652aec03d16c7035b1a6d4b8fc406247f Apr 16 18:12:20.532669 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:20.532642 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a54e43de-5a67-45f3-b403-4317caee2eca-metrics-tls\") pod \"dns-default-5mn52\" (UID: \"a54e43de-5a67-45f3-b403-4317caee2eca\") " pod="openshift-dns/dns-default-5mn52" Apr 16 18:12:20.534643 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:20.534621 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a54e43de-5a67-45f3-b403-4317caee2eca-metrics-tls\") pod \"dns-default-5mn52\" (UID: \"a54e43de-5a67-45f3-b403-4317caee2eca\") " pod="openshift-dns/dns-default-5mn52" Apr 16 18:12:20.586295 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:20.586273 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-n88vn\"" Apr 16 18:12:20.593523 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:20.593507 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-5mn52" Apr 16 18:12:20.633551 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:20.633517 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31b8e533-ba32-44f3-b6db-5c9e368510c6-cert\") pod \"ingress-canary-clw7f\" (UID: \"31b8e533-ba32-44f3-b6db-5c9e368510c6\") " pod="openshift-ingress-canary/ingress-canary-clw7f" Apr 16 18:12:20.635820 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:20.635794 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31b8e533-ba32-44f3-b6db-5c9e368510c6-cert\") pod \"ingress-canary-clw7f\" (UID: \"31b8e533-ba32-44f3-b6db-5c9e368510c6\") " pod="openshift-ingress-canary/ingress-canary-clw7f" Apr 16 18:12:20.697086 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:20.697054 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-66dc987d6f-rjrtv" event={"ID":"26885632-9180-4f43-8bf2-d905b1c6e357","Type":"ContainerStarted","Data":"8eee460297e09e3f235e159d4dcc50f98d2af1e8653d7c44ad7781d1431569b9"} Apr 16 18:12:20.697233 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:20.697093 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-66dc987d6f-rjrtv" event={"ID":"26885632-9180-4f43-8bf2-d905b1c6e357","Type":"ContainerStarted","Data":"2b1c0a5aaacd4e349c14d3c655ed0b29b0f0f1d4e3d46f0b6065b06239535284"} Apr 16 18:12:20.697233 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:20.697108 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-66dc987d6f-rjrtv" event={"ID":"26885632-9180-4f43-8bf2-d905b1c6e357","Type":"ContainerStarted","Data":"418327f89934ed62be0ffd50eb38bc82f954f5c1082b1f4bfa87894009003c05"} Apr 16 18:12:20.697233 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:20.697210 2576 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-66dc987d6f-rjrtv" Apr 16 18:12:20.698373 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:20.698348 2576 generic.go:358] "Generic (PLEG): container finished" podID="b252c920-4abd-4d7c-bd5e-d74dcbcd643a" containerID="dc0d3ea3f9ba8ca43960cc2c298db25726d4144cc930fb12accc66833d21b035" exitCode=0 Apr 16 18:12:20.698460 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:20.698431 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b252c920-4abd-4d7c-bd5e-d74dcbcd643a","Type":"ContainerDied","Data":"dc0d3ea3f9ba8ca43960cc2c298db25726d4144cc930fb12accc66833d21b035"} Apr 16 18:12:20.698460 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:20.698456 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b252c920-4abd-4d7c-bd5e-d74dcbcd643a","Type":"ContainerStarted","Data":"17000dd6c1a3c9aee9ec804ba604b3c652aec03d16c7035b1a6d4b8fc406247f"} Apr 16 18:12:20.701525 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:20.701505 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"36564dfe-56d0-4531-9269-dd5a9bc94fa5","Type":"ContainerStarted","Data":"4ed9ef8f1e425a269361828c5e1cebba2338653f8dca58446bfdcbcf7a04232d"} Apr 16 18:12:20.702734 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:20.702714 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-dclv7" event={"ID":"f3fc3a09-3d9e-4d57-89b8-557e15fa868c","Type":"ContainerStarted","Data":"cb153f29ff5c3adb9da1db9f677bc1016709f40dfbe5cd29100a36ec12f897a2"} Apr 16 18:12:20.702888 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:20.702871 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-dclv7" Apr 16 18:12:20.707442 ip-10-0-139-96 kubenswrapper[2576]: 
I0416 18:12:20.707358 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5mn52"] Apr 16 18:12:20.707904 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:20.707891 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-dclv7" Apr 16 18:12:20.709978 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:12:20.709957 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda54e43de_5a67_45f3_b403_4317caee2eca.slice/crio-080d18fe8515c5b537fe6cd2be8a52bc806b9bc61bc48cafba1b91eab7c9f3bc WatchSource:0}: Error finding container 080d18fe8515c5b537fe6cd2be8a52bc806b9bc61bc48cafba1b91eab7c9f3bc: Status 404 returned error can't find the container with id 080d18fe8515c5b537fe6cd2be8a52bc806b9bc61bc48cafba1b91eab7c9f3bc Apr 16 18:12:20.718388 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:20.718342 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-66dc987d6f-rjrtv" podStartSLOduration=1.2845620979999999 podStartE2EDuration="5.718327891s" podCreationTimestamp="2026-04-16 18:12:15 +0000 UTC" firstStartedPulling="2026-04-16 18:12:15.789617528 +0000 UTC m=+94.991192775" lastFinishedPulling="2026-04-16 18:12:20.223383322 +0000 UTC m=+99.424958568" observedRunningTime="2026-04-16 18:12:20.71679761 +0000 UTC m=+99.918372889" watchObservedRunningTime="2026-04-16 18:12:20.718327891 +0000 UTC m=+99.919903160" Apr 16 18:12:20.774846 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:20.774720 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.007283372 podStartE2EDuration="6.77468709s" podCreationTimestamp="2026-04-16 18:12:14 +0000 UTC" firstStartedPulling="2026-04-16 18:12:15.489059531 +0000 UTC m=+94.690634780" lastFinishedPulling="2026-04-16 18:12:20.256463243 
+0000 UTC m=+99.458038498" observedRunningTime="2026-04-16 18:12:20.774290879 +0000 UTC m=+99.975866158" watchObservedRunningTime="2026-04-16 18:12:20.77468709 +0000 UTC m=+99.976262432" Apr 16 18:12:20.792178 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:20.792140 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-dclv7" podStartSLOduration=1.293070132 podStartE2EDuration="2.792130174s" podCreationTimestamp="2026-04-16 18:12:18 +0000 UTC" firstStartedPulling="2026-04-16 18:12:18.723645946 +0000 UTC m=+97.925221193" lastFinishedPulling="2026-04-16 18:12:20.222705966 +0000 UTC m=+99.424281235" observedRunningTime="2026-04-16 18:12:20.791403282 +0000 UTC m=+99.992978546" watchObservedRunningTime="2026-04-16 18:12:20.792130174 +0000 UTC m=+99.993705442" Apr 16 18:12:20.903452 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:20.903429 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-kd8rq\"" Apr 16 18:12:20.910754 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:20.910739 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-clw7f" Apr 16 18:12:21.032050 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:21.031902 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-clw7f"] Apr 16 18:12:21.034864 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:12:21.034836 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31b8e533_ba32_44f3_b6db_5c9e368510c6.slice/crio-620aace4a1296879b34e3d433020627639546d50dc7dcaeb99a6d25563d520a3 WatchSource:0}: Error finding container 620aace4a1296879b34e3d433020627639546d50dc7dcaeb99a6d25563d520a3: Status 404 returned error can't find the container with id 620aace4a1296879b34e3d433020627639546d50dc7dcaeb99a6d25563d520a3 Apr 16 18:12:21.580744 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:21.580460 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-r2j4b" Apr 16 18:12:21.708665 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:21.708615 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5mn52" event={"ID":"a54e43de-5a67-45f3-b403-4317caee2eca","Type":"ContainerStarted","Data":"080d18fe8515c5b537fe6cd2be8a52bc806b9bc61bc48cafba1b91eab7c9f3bc"} Apr 16 18:12:21.709863 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:21.709764 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-clw7f" event={"ID":"31b8e533-ba32-44f3-b6db-5c9e368510c6","Type":"ContainerStarted","Data":"620aace4a1296879b34e3d433020627639546d50dc7dcaeb99a6d25563d520a3"} Apr 16 18:12:24.720369 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:24.720328 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-clw7f" 
event={"ID":"31b8e533-ba32-44f3-b6db-5c9e368510c6","Type":"ContainerStarted","Data":"13635a33e341527f77ded3fe2b73edc53d81daf073a4eb1b227ad74e899bd774"} Apr 16 18:12:24.723214 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:24.723189 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b252c920-4abd-4d7c-bd5e-d74dcbcd643a","Type":"ContainerStarted","Data":"fa16f3c670dba4369cd88186858c5b0d2e98ed9c73b5e9cd05fbffa3af8efc83"} Apr 16 18:12:24.723335 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:24.723219 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b252c920-4abd-4d7c-bd5e-d74dcbcd643a","Type":"ContainerStarted","Data":"ed184ce2f059dda12848bd8dabf54a96c498b4d984bbb6de6e0c7896ce6e751c"} Apr 16 18:12:24.723335 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:24.723234 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b252c920-4abd-4d7c-bd5e-d74dcbcd643a","Type":"ContainerStarted","Data":"10185443b88eb998125800b778892279ba793508a2ee767bf5b26ced2072c262"} Apr 16 18:12:24.723335 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:24.723247 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b252c920-4abd-4d7c-bd5e-d74dcbcd643a","Type":"ContainerStarted","Data":"2a85df2c06617580395339185bdbb016052a1e1569f79857cbff0ae3c8cc1054"} Apr 16 18:12:24.723335 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:24.723261 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b252c920-4abd-4d7c-bd5e-d74dcbcd643a","Type":"ContainerStarted","Data":"cf5369ba5d307d3665894b9ef4c9e9d336a9e36233b3e29ca9ac4757a315b6b1"} Apr 16 18:12:24.723335 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:24.723274 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"b252c920-4abd-4d7c-bd5e-d74dcbcd643a","Type":"ContainerStarted","Data":"5b3e5375ac3f006e74a4f228fee8e40dec23853a3b8b1d90b8f9e685eaa8f49a"} Apr 16 18:12:24.729249 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:24.729223 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5mn52" event={"ID":"a54e43de-5a67-45f3-b403-4317caee2eca","Type":"ContainerStarted","Data":"2e1bdd3f090549510d5ed0ab89b84fe21d5feefd44b2a4e0976ddb742a734f62"} Apr 16 18:12:24.729347 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:24.729255 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5mn52" event={"ID":"a54e43de-5a67-45f3-b403-4317caee2eca","Type":"ContainerStarted","Data":"0cf81e54054fe48292670f65e70d86a79bd64deba46c92ec1cf4aa778bb06d10"} Apr 16 18:12:24.729415 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:24.729400 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-5mn52" Apr 16 18:12:24.738082 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:24.738038 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-clw7f" podStartSLOduration=66.003559813 podStartE2EDuration="1m8.738024565s" podCreationTimestamp="2026-04-16 18:11:16 +0000 UTC" firstStartedPulling="2026-04-16 18:12:21.036673497 +0000 UTC m=+100.238248744" lastFinishedPulling="2026-04-16 18:12:23.771138235 +0000 UTC m=+102.972713496" observedRunningTime="2026-04-16 18:12:24.736556356 +0000 UTC m=+103.938131631" watchObservedRunningTime="2026-04-16 18:12:24.738024565 +0000 UTC m=+103.939599834" Apr 16 18:12:24.754672 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:24.754635 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-5mn52" podStartSLOduration=67.216240162 podStartE2EDuration="1m8.754626693s" podCreationTimestamp="2026-04-16 18:11:16 +0000 UTC" firstStartedPulling="2026-04-16 
18:12:20.711576971 +0000 UTC m=+99.913152217" lastFinishedPulling="2026-04-16 18:12:22.249963489 +0000 UTC m=+101.451538748" observedRunningTime="2026-04-16 18:12:24.752873478 +0000 UTC m=+103.954448746" watchObservedRunningTime="2026-04-16 18:12:24.754626693 +0000 UTC m=+103.956201962" Apr 16 18:12:24.782974 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:24.782916 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.705865175 podStartE2EDuration="5.782871633s" podCreationTimestamp="2026-04-16 18:12:19 +0000 UTC" firstStartedPulling="2026-04-16 18:12:20.699542283 +0000 UTC m=+99.901117529" lastFinishedPulling="2026-04-16 18:12:23.776548729 +0000 UTC m=+102.978123987" observedRunningTime="2026-04-16 18:12:24.780253495 +0000 UTC m=+103.981828776" watchObservedRunningTime="2026-04-16 18:12:24.782871633 +0000 UTC m=+103.984446901" Apr 16 18:12:24.922758 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:24.922733 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.225515 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:26.225480 2576 patch_prober.go:28] interesting pod/image-registry-75859bb697-t64g6 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 18:12:26.225891 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:26.225529 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-75859bb697-t64g6" podUID="de4c4df9-8036-4492-9274-9f47dc6c8180" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:12:26.715685 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:26.715661 2576 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-66dc987d6f-rjrtv" Apr 16 18:12:31.238907 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:31.238866 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-75859bb697-t64g6" podUID="de4c4df9-8036-4492-9274-9f47dc6c8180" containerName="registry" containerID="cri-o://d690e145291a29d60726f30e42ef2de324f2271ede4fb552a4ef3d77446b6ad0" gracePeriod=30 Apr 16 18:12:31.468081 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:31.468059 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-75859bb697-t64g6" Apr 16 18:12:31.636975 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:31.636945 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/de4c4df9-8036-4492-9274-9f47dc6c8180-image-registry-private-configuration\") pod \"de4c4df9-8036-4492-9274-9f47dc6c8180\" (UID: \"de4c4df9-8036-4492-9274-9f47dc6c8180\") " Apr 16 18:12:31.637133 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:31.636998 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de4c4df9-8036-4492-9274-9f47dc6c8180-registry-tls\") pod \"de4c4df9-8036-4492-9274-9f47dc6c8180\" (UID: \"de4c4df9-8036-4492-9274-9f47dc6c8180\") " Apr 16 18:12:31.637133 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:31.637014 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/de4c4df9-8036-4492-9274-9f47dc6c8180-installation-pull-secrets\") pod \"de4c4df9-8036-4492-9274-9f47dc6c8180\" (UID: \"de4c4df9-8036-4492-9274-9f47dc6c8180\") " Apr 16 18:12:31.637133 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:31.637061 
2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/de4c4df9-8036-4492-9274-9f47dc6c8180-registry-certificates\") pod \"de4c4df9-8036-4492-9274-9f47dc6c8180\" (UID: \"de4c4df9-8036-4492-9274-9f47dc6c8180\") " Apr 16 18:12:31.637133 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:31.637097 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/de4c4df9-8036-4492-9274-9f47dc6c8180-ca-trust-extracted\") pod \"de4c4df9-8036-4492-9274-9f47dc6c8180\" (UID: \"de4c4df9-8036-4492-9274-9f47dc6c8180\") " Apr 16 18:12:31.637133 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:31.637116 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5f5d\" (UniqueName: \"kubernetes.io/projected/de4c4df9-8036-4492-9274-9f47dc6c8180-kube-api-access-s5f5d\") pod \"de4c4df9-8036-4492-9274-9f47dc6c8180\" (UID: \"de4c4df9-8036-4492-9274-9f47dc6c8180\") " Apr 16 18:12:31.637398 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:31.637139 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de4c4df9-8036-4492-9274-9f47dc6c8180-bound-sa-token\") pod \"de4c4df9-8036-4492-9274-9f47dc6c8180\" (UID: \"de4c4df9-8036-4492-9274-9f47dc6c8180\") " Apr 16 18:12:31.637398 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:31.637212 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de4c4df9-8036-4492-9274-9f47dc6c8180-trusted-ca\") pod \"de4c4df9-8036-4492-9274-9f47dc6c8180\" (UID: \"de4c4df9-8036-4492-9274-9f47dc6c8180\") " Apr 16 18:12:31.637674 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:31.637616 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/de4c4df9-8036-4492-9274-9f47dc6c8180-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "de4c4df9-8036-4492-9274-9f47dc6c8180" (UID: "de4c4df9-8036-4492-9274-9f47dc6c8180"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:12:31.637821 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:31.637730 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de4c4df9-8036-4492-9274-9f47dc6c8180-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "de4c4df9-8036-4492-9274-9f47dc6c8180" (UID: "de4c4df9-8036-4492-9274-9f47dc6c8180"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:12:31.637884 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:31.637821 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/de4c4df9-8036-4492-9274-9f47dc6c8180-registry-certificates\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:12:31.637884 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:31.637843 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de4c4df9-8036-4492-9274-9f47dc6c8180-trusted-ca\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:12:31.639552 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:31.639499 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de4c4df9-8036-4492-9274-9f47dc6c8180-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "de4c4df9-8036-4492-9274-9f47dc6c8180" (UID: "de4c4df9-8036-4492-9274-9f47dc6c8180"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:12:31.639552 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:31.639532 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de4c4df9-8036-4492-9274-9f47dc6c8180-kube-api-access-s5f5d" (OuterVolumeSpecName: "kube-api-access-s5f5d") pod "de4c4df9-8036-4492-9274-9f47dc6c8180" (UID: "de4c4df9-8036-4492-9274-9f47dc6c8180"). InnerVolumeSpecName "kube-api-access-s5f5d". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:12:31.639732 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:31.639641 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de4c4df9-8036-4492-9274-9f47dc6c8180-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "de4c4df9-8036-4492-9274-9f47dc6c8180" (UID: "de4c4df9-8036-4492-9274-9f47dc6c8180"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:12:31.639732 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:31.639612 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de4c4df9-8036-4492-9274-9f47dc6c8180-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "de4c4df9-8036-4492-9274-9f47dc6c8180" (UID: "de4c4df9-8036-4492-9274-9f47dc6c8180"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:12:31.639841 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:31.639781 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de4c4df9-8036-4492-9274-9f47dc6c8180-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "de4c4df9-8036-4492-9274-9f47dc6c8180" (UID: "de4c4df9-8036-4492-9274-9f47dc6c8180"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:12:31.646010 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:31.645988 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de4c4df9-8036-4492-9274-9f47dc6c8180-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "de4c4df9-8036-4492-9274-9f47dc6c8180" (UID: "de4c4df9-8036-4492-9274-9f47dc6c8180"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:12:31.738350 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:31.738322 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/de4c4df9-8036-4492-9274-9f47dc6c8180-image-registry-private-configuration\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:12:31.738448 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:31.738347 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de4c4df9-8036-4492-9274-9f47dc6c8180-registry-tls\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:12:31.738448 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:31.738373 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/de4c4df9-8036-4492-9274-9f47dc6c8180-installation-pull-secrets\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:12:31.738448 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:31.738388 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/de4c4df9-8036-4492-9274-9f47dc6c8180-ca-trust-extracted\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:12:31.738448 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:31.738402 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s5f5d\" 
(UniqueName: \"kubernetes.io/projected/de4c4df9-8036-4492-9274-9f47dc6c8180-kube-api-access-s5f5d\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:12:31.738448 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:31.738415 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de4c4df9-8036-4492-9274-9f47dc6c8180-bound-sa-token\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:12:31.751584 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:31.751557 2576 generic.go:358] "Generic (PLEG): container finished" podID="de4c4df9-8036-4492-9274-9f47dc6c8180" containerID="d690e145291a29d60726f30e42ef2de324f2271ede4fb552a4ef3d77446b6ad0" exitCode=0 Apr 16 18:12:31.751672 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:31.751613 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-75859bb697-t64g6" Apr 16 18:12:31.751672 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:31.751646 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-75859bb697-t64g6" event={"ID":"de4c4df9-8036-4492-9274-9f47dc6c8180","Type":"ContainerDied","Data":"d690e145291a29d60726f30e42ef2de324f2271ede4fb552a4ef3d77446b6ad0"} Apr 16 18:12:31.751807 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:31.751689 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-75859bb697-t64g6" event={"ID":"de4c4df9-8036-4492-9274-9f47dc6c8180","Type":"ContainerDied","Data":"ade4050f2d1b6c049daaa281d0cf31fecb6f7fbabe792bf3ed9780d6be8f0c70"} Apr 16 18:12:31.751807 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:31.751734 2576 scope.go:117] "RemoveContainer" containerID="d690e145291a29d60726f30e42ef2de324f2271ede4fb552a4ef3d77446b6ad0" Apr 16 18:12:31.759299 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:31.759282 2576 scope.go:117] "RemoveContainer" 
containerID="d690e145291a29d60726f30e42ef2de324f2271ede4fb552a4ef3d77446b6ad0" Apr 16 18:12:31.759532 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:12:31.759515 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d690e145291a29d60726f30e42ef2de324f2271ede4fb552a4ef3d77446b6ad0\": container with ID starting with d690e145291a29d60726f30e42ef2de324f2271ede4fb552a4ef3d77446b6ad0 not found: ID does not exist" containerID="d690e145291a29d60726f30e42ef2de324f2271ede4fb552a4ef3d77446b6ad0" Apr 16 18:12:31.759577 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:31.759540 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d690e145291a29d60726f30e42ef2de324f2271ede4fb552a4ef3d77446b6ad0"} err="failed to get container status \"d690e145291a29d60726f30e42ef2de324f2271ede4fb552a4ef3d77446b6ad0\": rpc error: code = NotFound desc = could not find container \"d690e145291a29d60726f30e42ef2de324f2271ede4fb552a4ef3d77446b6ad0\": container with ID starting with d690e145291a29d60726f30e42ef2de324f2271ede4fb552a4ef3d77446b6ad0 not found: ID does not exist" Apr 16 18:12:31.771311 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:31.771288 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-75859bb697-t64g6"] Apr 16 18:12:31.775368 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:31.775347 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-75859bb697-t64g6"] Apr 16 18:12:33.326154 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:33.326117 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de4c4df9-8036-4492-9274-9f47dc6c8180" path="/var/lib/kubelet/pods/de4c4df9-8036-4492-9274-9f47dc6c8180/volumes" Apr 16 18:12:34.734882 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:34.734854 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-dns/dns-default-5mn52"
Apr 16 18:12:50.978648 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:50.978617 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/22eec1cf-b2b9-495f-9507-ee4b6c6a9204-metrics-certs\") pod \"network-metrics-daemon-fwqx5\" (UID: \"22eec1cf-b2b9-495f-9507-ee4b6c6a9204\") " pod="openshift-multus/network-metrics-daemon-fwqx5"
Apr 16 18:12:50.980689 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:50.980670 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/22eec1cf-b2b9-495f-9507-ee4b6c6a9204-metrics-certs\") pod \"network-metrics-daemon-fwqx5\" (UID: \"22eec1cf-b2b9-495f-9507-ee4b6c6a9204\") " pod="openshift-multus/network-metrics-daemon-fwqx5"
Apr 16 18:12:51.233762 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:51.233671 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xdm8g\""
Apr 16 18:12:51.241087 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:51.241065 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwqx5"
Apr 16 18:12:51.360605 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:51.360584 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fwqx5"]
Apr 16 18:12:51.364135 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:12:51.364104 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22eec1cf_b2b9_495f_9507_ee4b6c6a9204.slice/crio-34457b32f3bf828c645cb87476a65b7ee890b04d139c454b4185e3240881c7e8 WatchSource:0}: Error finding container 34457b32f3bf828c645cb87476a65b7ee890b04d139c454b4185e3240881c7e8: Status 404 returned error can't find the container with id 34457b32f3bf828c645cb87476a65b7ee890b04d139c454b4185e3240881c7e8
Apr 16 18:12:51.822926 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:51.822890 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fwqx5" event={"ID":"22eec1cf-b2b9-495f-9507-ee4b6c6a9204","Type":"ContainerStarted","Data":"34457b32f3bf828c645cb87476a65b7ee890b04d139c454b4185e3240881c7e8"}
Apr 16 18:12:52.826848 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:52.826809 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fwqx5" event={"ID":"22eec1cf-b2b9-495f-9507-ee4b6c6a9204","Type":"ContainerStarted","Data":"1e07719a5865931f973f0b0dbfecf966dfc4d0c5cff71606b6ac4b9817cfbb17"}
Apr 16 18:12:52.826848 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:52.826852 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fwqx5" event={"ID":"22eec1cf-b2b9-495f-9507-ee4b6c6a9204","Type":"ContainerStarted","Data":"df1d6c19fb52e38b992254fc484c55c648dc336d2f68be5c60f6475e76522368"}
Apr 16 18:12:52.843541 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:52.843498 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-fwqx5" podStartSLOduration=130.947585403 podStartE2EDuration="2m11.843484882s" podCreationTimestamp="2026-04-16 18:10:41 +0000 UTC" firstStartedPulling="2026-04-16 18:12:51.36606812 +0000 UTC m=+130.567643366" lastFinishedPulling="2026-04-16 18:12:52.261967595 +0000 UTC m=+131.463542845" observedRunningTime="2026-04-16 18:12:52.842974866 +0000 UTC m=+132.044550135" watchObservedRunningTime="2026-04-16 18:12:52.843484882 +0000 UTC m=+132.045060150"
Apr 16 18:12:55.837648 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:55.837612 2576 generic.go:358] "Generic (PLEG): container finished" podID="8d34f403-81ae-4142-98c8-5c0168280de0" containerID="538b5d437f09469af00f3443076e669e0947fce7befe48a2875de13038cd90d2" exitCode=0
Apr 16 18:12:55.838060 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:55.837682 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-f6kdh" event={"ID":"8d34f403-81ae-4142-98c8-5c0168280de0","Type":"ContainerDied","Data":"538b5d437f09469af00f3443076e669e0947fce7befe48a2875de13038cd90d2"}
Apr 16 18:12:55.838060 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:55.838002 2576 scope.go:117] "RemoveContainer" containerID="538b5d437f09469af00f3443076e669e0947fce7befe48a2875de13038cd90d2"
Apr 16 18:12:56.531113 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:56.531086 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-clw7f_31b8e533-ba32-44f3-b6db-5c9e368510c6/serve-healthcheck-canary/0.log"
Apr 16 18:12:56.841881 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:12:56.841811 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-f6kdh" event={"ID":"8d34f403-81ae-4142-98c8-5c0168280de0","Type":"ContainerStarted","Data":"6285ada4349a33da1e1a7fb78b17100cf0db19912be78a9a0740d080873cb56a"}
Apr 16 18:13:19.922502 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:19.922469 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:13:19.940943 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:19.940916 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:13:20.926761 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:20.926718 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:13:33.625457 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:33.625373 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 18:13:33.626017 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:33.625819 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="36564dfe-56d0-4531-9269-dd5a9bc94fa5" containerName="alertmanager" containerID="cri-o://d39148d2241fd5bc3140d746b01d4de56eeca89d6432869f24f95fb2b613a9a8" gracePeriod=120
Apr 16 18:13:33.626017 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:33.625877 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="36564dfe-56d0-4531-9269-dd5a9bc94fa5" containerName="kube-rbac-proxy-metric" containerID="cri-o://02ed20d1781a32826efc4cbb802274701e6c14b03c4514c3562dbd27ee3a806b" gracePeriod=120
Apr 16 18:13:33.626017 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:33.625929 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="36564dfe-56d0-4531-9269-dd5a9bc94fa5" containerName="prom-label-proxy" containerID="cri-o://4ed9ef8f1e425a269361828c5e1cebba2338653f8dca58446bfdcbcf7a04232d" gracePeriod=120
Apr 16 18:13:33.626017 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:33.625964 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="36564dfe-56d0-4531-9269-dd5a9bc94fa5" containerName="config-reloader" containerID="cri-o://339e7bbe65d2fd665818b9262134f1004fa78fd3ac20f1cc2d65885be82e80db" gracePeriod=120
Apr 16 18:13:33.626017 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:33.625981 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="36564dfe-56d0-4531-9269-dd5a9bc94fa5" containerName="kube-rbac-proxy" containerID="cri-o://157121dd1a374a613817305a55ced989caecb8e0d1edf483f05d35681e85ffcd" gracePeriod=120
Apr 16 18:13:33.626300 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:33.625929 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="36564dfe-56d0-4531-9269-dd5a9bc94fa5" containerName="kube-rbac-proxy-web" containerID="cri-o://8ca555ea4d864bd01b7f3edb50c4f78133747af4c6f06b52cea7cfb89ab8db60" gracePeriod=120
Apr 16 18:13:33.951404 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:33.951366 2576 generic.go:358] "Generic (PLEG): container finished" podID="36564dfe-56d0-4531-9269-dd5a9bc94fa5" containerID="4ed9ef8f1e425a269361828c5e1cebba2338653f8dca58446bfdcbcf7a04232d" exitCode=0
Apr 16 18:13:33.951404 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:33.951399 2576 generic.go:358] "Generic (PLEG): container finished" podID="36564dfe-56d0-4531-9269-dd5a9bc94fa5" containerID="157121dd1a374a613817305a55ced989caecb8e0d1edf483f05d35681e85ffcd" exitCode=0
Apr 16 18:13:33.951404 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:33.951406 2576 generic.go:358] "Generic (PLEG): container finished" podID="36564dfe-56d0-4531-9269-dd5a9bc94fa5" containerID="339e7bbe65d2fd665818b9262134f1004fa78fd3ac20f1cc2d65885be82e80db" exitCode=0
Apr 16 18:13:33.951619 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:33.951414 2576 generic.go:358] "Generic (PLEG): container finished" podID="36564dfe-56d0-4531-9269-dd5a9bc94fa5" containerID="d39148d2241fd5bc3140d746b01d4de56eeca89d6432869f24f95fb2b613a9a8" exitCode=0
Apr 16 18:13:33.951619 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:33.951449 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"36564dfe-56d0-4531-9269-dd5a9bc94fa5","Type":"ContainerDied","Data":"4ed9ef8f1e425a269361828c5e1cebba2338653f8dca58446bfdcbcf7a04232d"}
Apr 16 18:13:33.951619 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:33.951489 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"36564dfe-56d0-4531-9269-dd5a9bc94fa5","Type":"ContainerDied","Data":"157121dd1a374a613817305a55ced989caecb8e0d1edf483f05d35681e85ffcd"}
Apr 16 18:13:33.951619 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:33.951504 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"36564dfe-56d0-4531-9269-dd5a9bc94fa5","Type":"ContainerDied","Data":"339e7bbe65d2fd665818b9262134f1004fa78fd3ac20f1cc2d65885be82e80db"}
Apr 16 18:13:33.951619 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:33.951517 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"36564dfe-56d0-4531-9269-dd5a9bc94fa5","Type":"ContainerDied","Data":"d39148d2241fd5bc3140d746b01d4de56eeca89d6432869f24f95fb2b613a9a8"}
Apr 16 18:13:34.860300 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:34.860281 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:13:34.957077 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:34.957004 2576 generic.go:358] "Generic (PLEG): container finished" podID="36564dfe-56d0-4531-9269-dd5a9bc94fa5" containerID="02ed20d1781a32826efc4cbb802274701e6c14b03c4514c3562dbd27ee3a806b" exitCode=0
Apr 16 18:13:34.957077 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:34.957029 2576 generic.go:358] "Generic (PLEG): container finished" podID="36564dfe-56d0-4531-9269-dd5a9bc94fa5" containerID="8ca555ea4d864bd01b7f3edb50c4f78133747af4c6f06b52cea7cfb89ab8db60" exitCode=0
Apr 16 18:13:34.957077 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:34.957067 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"36564dfe-56d0-4531-9269-dd5a9bc94fa5","Type":"ContainerDied","Data":"02ed20d1781a32826efc4cbb802274701e6c14b03c4514c3562dbd27ee3a806b"}
Apr 16 18:13:34.957302 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:34.957101 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"36564dfe-56d0-4531-9269-dd5a9bc94fa5","Type":"ContainerDied","Data":"8ca555ea4d864bd01b7f3edb50c4f78133747af4c6f06b52cea7cfb89ab8db60"}
Apr 16 18:13:34.957302 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:34.957104 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:13:34.957302 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:34.957115 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"36564dfe-56d0-4531-9269-dd5a9bc94fa5","Type":"ContainerDied","Data":"fbd354a53d3a86c020652d299f0e0666784df044fe2810336058a234f4cbe723"}
Apr 16 18:13:34.957302 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:34.957137 2576 scope.go:117] "RemoveContainer" containerID="4ed9ef8f1e425a269361828c5e1cebba2338653f8dca58446bfdcbcf7a04232d"
Apr 16 18:13:34.964115 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:34.964096 2576 scope.go:117] "RemoveContainer" containerID="02ed20d1781a32826efc4cbb802274701e6c14b03c4514c3562dbd27ee3a806b"
Apr 16 18:13:34.970159 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:34.970144 2576 scope.go:117] "RemoveContainer" containerID="157121dd1a374a613817305a55ced989caecb8e0d1edf483f05d35681e85ffcd"
Apr 16 18:13:34.976028 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:34.976012 2576 scope.go:117] "RemoveContainer" containerID="8ca555ea4d864bd01b7f3edb50c4f78133747af4c6f06b52cea7cfb89ab8db60"
Apr 16 18:13:34.981902 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:34.981889 2576 scope.go:117] "RemoveContainer" containerID="339e7bbe65d2fd665818b9262134f1004fa78fd3ac20f1cc2d65885be82e80db"
Apr 16 18:13:34.987769 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:34.987751 2576 scope.go:117] "RemoveContainer" containerID="d39148d2241fd5bc3140d746b01d4de56eeca89d6432869f24f95fb2b613a9a8"
Apr 16 18:13:34.993687 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:34.993671 2576 scope.go:117] "RemoveContainer" containerID="d27de8332b6c8f8f0cbbc36c0a88f70ee2dd21a630d58cc0876eb5b473c493ce"
Apr 16 18:13:34.999524 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:34.999502 2576 scope.go:117] "RemoveContainer" containerID="4ed9ef8f1e425a269361828c5e1cebba2338653f8dca58446bfdcbcf7a04232d"
Apr 16 18:13:34.999828 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:13:34.999812 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ed9ef8f1e425a269361828c5e1cebba2338653f8dca58446bfdcbcf7a04232d\": container with ID starting with 4ed9ef8f1e425a269361828c5e1cebba2338653f8dca58446bfdcbcf7a04232d not found: ID does not exist" containerID="4ed9ef8f1e425a269361828c5e1cebba2338653f8dca58446bfdcbcf7a04232d"
Apr 16 18:13:34.999871 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:34.999836 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ed9ef8f1e425a269361828c5e1cebba2338653f8dca58446bfdcbcf7a04232d"} err="failed to get container status \"4ed9ef8f1e425a269361828c5e1cebba2338653f8dca58446bfdcbcf7a04232d\": rpc error: code = NotFound desc = could not find container \"4ed9ef8f1e425a269361828c5e1cebba2338653f8dca58446bfdcbcf7a04232d\": container with ID starting with 4ed9ef8f1e425a269361828c5e1cebba2338653f8dca58446bfdcbcf7a04232d not found: ID does not exist"
Apr 16 18:13:34.999871 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:34.999855 2576 scope.go:117] "RemoveContainer" containerID="02ed20d1781a32826efc4cbb802274701e6c14b03c4514c3562dbd27ee3a806b"
Apr 16 18:13:35.000091 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:13:35.000074 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02ed20d1781a32826efc4cbb802274701e6c14b03c4514c3562dbd27ee3a806b\": container with ID starting with 02ed20d1781a32826efc4cbb802274701e6c14b03c4514c3562dbd27ee3a806b not found: ID does not exist" containerID="02ed20d1781a32826efc4cbb802274701e6c14b03c4514c3562dbd27ee3a806b"
Apr 16 18:13:35.000131 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.000097 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02ed20d1781a32826efc4cbb802274701e6c14b03c4514c3562dbd27ee3a806b"} err="failed to get container status \"02ed20d1781a32826efc4cbb802274701e6c14b03c4514c3562dbd27ee3a806b\": rpc error: code = NotFound desc = could not find container \"02ed20d1781a32826efc4cbb802274701e6c14b03c4514c3562dbd27ee3a806b\": container with ID starting with 02ed20d1781a32826efc4cbb802274701e6c14b03c4514c3562dbd27ee3a806b not found: ID does not exist"
Apr 16 18:13:35.000131 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.000114 2576 scope.go:117] "RemoveContainer" containerID="157121dd1a374a613817305a55ced989caecb8e0d1edf483f05d35681e85ffcd"
Apr 16 18:13:35.000342 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:13:35.000324 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"157121dd1a374a613817305a55ced989caecb8e0d1edf483f05d35681e85ffcd\": container with ID starting with 157121dd1a374a613817305a55ced989caecb8e0d1edf483f05d35681e85ffcd not found: ID does not exist" containerID="157121dd1a374a613817305a55ced989caecb8e0d1edf483f05d35681e85ffcd"
Apr 16 18:13:35.000380 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.000347 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"157121dd1a374a613817305a55ced989caecb8e0d1edf483f05d35681e85ffcd"} err="failed to get container status \"157121dd1a374a613817305a55ced989caecb8e0d1edf483f05d35681e85ffcd\": rpc error: code = NotFound desc = could not find container \"157121dd1a374a613817305a55ced989caecb8e0d1edf483f05d35681e85ffcd\": container with ID starting with 157121dd1a374a613817305a55ced989caecb8e0d1edf483f05d35681e85ffcd not found: ID does not exist"
Apr 16 18:13:35.000380 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.000363 2576 scope.go:117] "RemoveContainer" containerID="8ca555ea4d864bd01b7f3edb50c4f78133747af4c6f06b52cea7cfb89ab8db60"
Apr 16 18:13:35.000559 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:13:35.000539 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ca555ea4d864bd01b7f3edb50c4f78133747af4c6f06b52cea7cfb89ab8db60\": container with ID starting with 8ca555ea4d864bd01b7f3edb50c4f78133747af4c6f06b52cea7cfb89ab8db60 not found: ID does not exist" containerID="8ca555ea4d864bd01b7f3edb50c4f78133747af4c6f06b52cea7cfb89ab8db60"
Apr 16 18:13:35.000604 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.000563 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ca555ea4d864bd01b7f3edb50c4f78133747af4c6f06b52cea7cfb89ab8db60"} err="failed to get container status \"8ca555ea4d864bd01b7f3edb50c4f78133747af4c6f06b52cea7cfb89ab8db60\": rpc error: code = NotFound desc = could not find container \"8ca555ea4d864bd01b7f3edb50c4f78133747af4c6f06b52cea7cfb89ab8db60\": container with ID starting with 8ca555ea4d864bd01b7f3edb50c4f78133747af4c6f06b52cea7cfb89ab8db60 not found: ID does not exist"
Apr 16 18:13:35.000604 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.000575 2576 scope.go:117] "RemoveContainer" containerID="339e7bbe65d2fd665818b9262134f1004fa78fd3ac20f1cc2d65885be82e80db"
Apr 16 18:13:35.001009 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:13:35.000976 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"339e7bbe65d2fd665818b9262134f1004fa78fd3ac20f1cc2d65885be82e80db\": container with ID starting with 339e7bbe65d2fd665818b9262134f1004fa78fd3ac20f1cc2d65885be82e80db not found: ID does not exist" containerID="339e7bbe65d2fd665818b9262134f1004fa78fd3ac20f1cc2d65885be82e80db"
Apr 16 18:13:35.001083 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.001017 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"339e7bbe65d2fd665818b9262134f1004fa78fd3ac20f1cc2d65885be82e80db"} err="failed to get container status \"339e7bbe65d2fd665818b9262134f1004fa78fd3ac20f1cc2d65885be82e80db\": rpc error: code = NotFound desc = could not find container \"339e7bbe65d2fd665818b9262134f1004fa78fd3ac20f1cc2d65885be82e80db\": container with ID starting with 339e7bbe65d2fd665818b9262134f1004fa78fd3ac20f1cc2d65885be82e80db not found: ID does not exist"
Apr 16 18:13:35.001083 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.001037 2576 scope.go:117] "RemoveContainer" containerID="d39148d2241fd5bc3140d746b01d4de56eeca89d6432869f24f95fb2b613a9a8"
Apr 16 18:13:35.001260 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:13:35.001241 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d39148d2241fd5bc3140d746b01d4de56eeca89d6432869f24f95fb2b613a9a8\": container with ID starting with d39148d2241fd5bc3140d746b01d4de56eeca89d6432869f24f95fb2b613a9a8 not found: ID does not exist" containerID="d39148d2241fd5bc3140d746b01d4de56eeca89d6432869f24f95fb2b613a9a8"
Apr 16 18:13:35.001299 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.001264 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d39148d2241fd5bc3140d746b01d4de56eeca89d6432869f24f95fb2b613a9a8"} err="failed to get container status \"d39148d2241fd5bc3140d746b01d4de56eeca89d6432869f24f95fb2b613a9a8\": rpc error: code = NotFound desc = could not find container \"d39148d2241fd5bc3140d746b01d4de56eeca89d6432869f24f95fb2b613a9a8\": container with ID starting with d39148d2241fd5bc3140d746b01d4de56eeca89d6432869f24f95fb2b613a9a8 not found: ID does not exist"
Apr 16 18:13:35.001299 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.001280 2576 scope.go:117] "RemoveContainer" containerID="d27de8332b6c8f8f0cbbc36c0a88f70ee2dd21a630d58cc0876eb5b473c493ce"
Apr 16 18:13:35.001442 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:13:35.001427 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d27de8332b6c8f8f0cbbc36c0a88f70ee2dd21a630d58cc0876eb5b473c493ce\": container with ID starting with d27de8332b6c8f8f0cbbc36c0a88f70ee2dd21a630d58cc0876eb5b473c493ce not found: ID does not exist" containerID="d27de8332b6c8f8f0cbbc36c0a88f70ee2dd21a630d58cc0876eb5b473c493ce"
Apr 16 18:13:35.001482 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.001446 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d27de8332b6c8f8f0cbbc36c0a88f70ee2dd21a630d58cc0876eb5b473c493ce"} err="failed to get container status \"d27de8332b6c8f8f0cbbc36c0a88f70ee2dd21a630d58cc0876eb5b473c493ce\": rpc error: code = NotFound desc = could not find container \"d27de8332b6c8f8f0cbbc36c0a88f70ee2dd21a630d58cc0876eb5b473c493ce\": container with ID starting with d27de8332b6c8f8f0cbbc36c0a88f70ee2dd21a630d58cc0876eb5b473c493ce not found: ID does not exist"
Apr 16 18:13:35.001482 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.001459 2576 scope.go:117] "RemoveContainer" containerID="4ed9ef8f1e425a269361828c5e1cebba2338653f8dca58446bfdcbcf7a04232d"
Apr 16 18:13:35.001646 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.001629 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ed9ef8f1e425a269361828c5e1cebba2338653f8dca58446bfdcbcf7a04232d"} err="failed to get container status \"4ed9ef8f1e425a269361828c5e1cebba2338653f8dca58446bfdcbcf7a04232d\": rpc error: code = NotFound desc = could not find container \"4ed9ef8f1e425a269361828c5e1cebba2338653f8dca58446bfdcbcf7a04232d\": container with ID starting with 4ed9ef8f1e425a269361828c5e1cebba2338653f8dca58446bfdcbcf7a04232d not found: ID does not exist"
Apr 16 18:13:35.001735 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.001649 2576 scope.go:117] "RemoveContainer" containerID="02ed20d1781a32826efc4cbb802274701e6c14b03c4514c3562dbd27ee3a806b"
Apr 16 18:13:35.001894 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.001876 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02ed20d1781a32826efc4cbb802274701e6c14b03c4514c3562dbd27ee3a806b"} err="failed to get container status \"02ed20d1781a32826efc4cbb802274701e6c14b03c4514c3562dbd27ee3a806b\": rpc error: code = NotFound desc = could not find container \"02ed20d1781a32826efc4cbb802274701e6c14b03c4514c3562dbd27ee3a806b\": container with ID starting with 02ed20d1781a32826efc4cbb802274701e6c14b03c4514c3562dbd27ee3a806b not found: ID does not exist"
Apr 16 18:13:35.001937 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.001895 2576 scope.go:117] "RemoveContainer" containerID="157121dd1a374a613817305a55ced989caecb8e0d1edf483f05d35681e85ffcd"
Apr 16 18:13:35.002090 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.002074 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"157121dd1a374a613817305a55ced989caecb8e0d1edf483f05d35681e85ffcd"} err="failed to get container status \"157121dd1a374a613817305a55ced989caecb8e0d1edf483f05d35681e85ffcd\": rpc error: code = NotFound desc = could not find container \"157121dd1a374a613817305a55ced989caecb8e0d1edf483f05d35681e85ffcd\": container with ID starting with 157121dd1a374a613817305a55ced989caecb8e0d1edf483f05d35681e85ffcd not found: ID does not exist"
Apr 16 18:13:35.002139 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.002090 2576 scope.go:117] "RemoveContainer" containerID="8ca555ea4d864bd01b7f3edb50c4f78133747af4c6f06b52cea7cfb89ab8db60"
Apr 16 18:13:35.002315 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.002295 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ca555ea4d864bd01b7f3edb50c4f78133747af4c6f06b52cea7cfb89ab8db60"} err="failed to get container status \"8ca555ea4d864bd01b7f3edb50c4f78133747af4c6f06b52cea7cfb89ab8db60\": rpc error: code = NotFound desc = could not find container \"8ca555ea4d864bd01b7f3edb50c4f78133747af4c6f06b52cea7cfb89ab8db60\": container with ID starting with 8ca555ea4d864bd01b7f3edb50c4f78133747af4c6f06b52cea7cfb89ab8db60 not found: ID does not exist"
Apr 16 18:13:35.002383 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.002318 2576 scope.go:117] "RemoveContainer" containerID="339e7bbe65d2fd665818b9262134f1004fa78fd3ac20f1cc2d65885be82e80db"
Apr 16 18:13:35.002514 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.002493 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"339e7bbe65d2fd665818b9262134f1004fa78fd3ac20f1cc2d65885be82e80db"} err="failed to get container status \"339e7bbe65d2fd665818b9262134f1004fa78fd3ac20f1cc2d65885be82e80db\": rpc error: code = NotFound desc = could not find container \"339e7bbe65d2fd665818b9262134f1004fa78fd3ac20f1cc2d65885be82e80db\": container with ID starting with 339e7bbe65d2fd665818b9262134f1004fa78fd3ac20f1cc2d65885be82e80db not found: ID does not exist"
Apr 16 18:13:35.002561 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.002516 2576 scope.go:117] "RemoveContainer" containerID="d39148d2241fd5bc3140d746b01d4de56eeca89d6432869f24f95fb2b613a9a8"
Apr 16 18:13:35.002714 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.002676 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d39148d2241fd5bc3140d746b01d4de56eeca89d6432869f24f95fb2b613a9a8"} err="failed to get container status \"d39148d2241fd5bc3140d746b01d4de56eeca89d6432869f24f95fb2b613a9a8\": rpc error: code = NotFound desc = could not find container \"d39148d2241fd5bc3140d746b01d4de56eeca89d6432869f24f95fb2b613a9a8\": container with ID starting with d39148d2241fd5bc3140d746b01d4de56eeca89d6432869f24f95fb2b613a9a8 not found: ID does not exist"
Apr 16 18:13:35.002714 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.002708 2576 scope.go:117] "RemoveContainer" containerID="d27de8332b6c8f8f0cbbc36c0a88f70ee2dd21a630d58cc0876eb5b473c493ce"
Apr 16 18:13:35.002894 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.002868 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d27de8332b6c8f8f0cbbc36c0a88f70ee2dd21a630d58cc0876eb5b473c493ce"} err="failed to get container status \"d27de8332b6c8f8f0cbbc36c0a88f70ee2dd21a630d58cc0876eb5b473c493ce\": rpc error: code = NotFound desc = could not find container \"d27de8332b6c8f8f0cbbc36c0a88f70ee2dd21a630d58cc0876eb5b473c493ce\": container with ID starting with d27de8332b6c8f8f0cbbc36c0a88f70ee2dd21a630d58cc0876eb5b473c493ce not found: ID does not exist"
Apr 16 18:13:35.020192 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.020174 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") "
Apr 16 18:13:35.020279 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.020201 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/36564dfe-56d0-4531-9269-dd5a9bc94fa5-tls-assets\") pod \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") "
Apr 16 18:13:35.020279 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.020227 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-web-config\") pod \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") "
Apr 16 18:13:35.020279 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.020251 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-secret-alertmanager-kube-rbac-proxy\") pod \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") "
Apr 16 18:13:35.020444 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.020322 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36564dfe-56d0-4531-9269-dd5a9bc94fa5-alertmanager-trusted-ca-bundle\") pod \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") "
Apr 16 18:13:35.020444 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.020364 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/36564dfe-56d0-4531-9269-dd5a9bc94fa5-metrics-client-ca\") pod \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") "
Apr 16 18:13:35.020444 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.020388 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-cluster-tls-config\") pod \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") "
Apr 16 18:13:35.020600 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.020459 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfnxf\" (UniqueName: \"kubernetes.io/projected/36564dfe-56d0-4531-9269-dd5a9bc94fa5-kube-api-access-hfnxf\") pod \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") "
Apr 16 18:13:35.020600 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.020496 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-secret-alertmanager-kube-rbac-proxy-web\") pod \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") "
Apr 16 18:13:35.021017 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.020738 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36564dfe-56d0-4531-9269-dd5a9bc94fa5-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "36564dfe-56d0-4531-9269-dd5a9bc94fa5" (UID: "36564dfe-56d0-4531-9269-dd5a9bc94fa5"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:13:35.021017 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.020857 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36564dfe-56d0-4531-9269-dd5a9bc94fa5-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "36564dfe-56d0-4531-9269-dd5a9bc94fa5" (UID: "36564dfe-56d0-4531-9269-dd5a9bc94fa5"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:13:35.021017 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.020904 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-config-volume\") pod \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") "
Apr 16 18:13:35.021017 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.020934 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/36564dfe-56d0-4531-9269-dd5a9bc94fa5-config-out\") pod \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") "
Apr 16 18:13:35.021017 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.020969 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-secret-alertmanager-main-tls\") pod \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") "
Apr 16 18:13:35.021017 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.021000 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/36564dfe-56d0-4531-9269-dd5a9bc94fa5-alertmanager-main-db\") pod \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\" (UID: \"36564dfe-56d0-4531-9269-dd5a9bc94fa5\") "
Apr 16 18:13:35.021358 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.021264 2576 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36564dfe-56d0-4531-9269-dd5a9bc94fa5-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\""
Apr 16 18:13:35.021358 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.021285 2576 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/36564dfe-56d0-4531-9269-dd5a9bc94fa5-metrics-client-ca\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\""
Apr 16 18:13:35.021744 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.021687 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36564dfe-56d0-4531-9269-dd5a9bc94fa5-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "36564dfe-56d0-4531-9269-dd5a9bc94fa5" (UID: "36564dfe-56d0-4531-9269-dd5a9bc94fa5"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:13:35.023220 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.023191 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "36564dfe-56d0-4531-9269-dd5a9bc94fa5" (UID: "36564dfe-56d0-4531-9269-dd5a9bc94fa5"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:13:35.023497 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.023473 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "36564dfe-56d0-4531-9269-dd5a9bc94fa5" (UID: "36564dfe-56d0-4531-9269-dd5a9bc94fa5"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:13:35.023621 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.023559 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36564dfe-56d0-4531-9269-dd5a9bc94fa5-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "36564dfe-56d0-4531-9269-dd5a9bc94fa5" (UID: "36564dfe-56d0-4531-9269-dd5a9bc94fa5"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:13:35.024093 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.024068 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "36564dfe-56d0-4531-9269-dd5a9bc94fa5" (UID: "36564dfe-56d0-4531-9269-dd5a9bc94fa5"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:13:35.024263 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.024244 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36564dfe-56d0-4531-9269-dd5a9bc94fa5-config-out" (OuterVolumeSpecName: "config-out") pod "36564dfe-56d0-4531-9269-dd5a9bc94fa5" (UID: "36564dfe-56d0-4531-9269-dd5a9bc94fa5"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:13:35.024345 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.024314 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36564dfe-56d0-4531-9269-dd5a9bc94fa5-kube-api-access-hfnxf" (OuterVolumeSpecName: "kube-api-access-hfnxf") pod "36564dfe-56d0-4531-9269-dd5a9bc94fa5" (UID: "36564dfe-56d0-4531-9269-dd5a9bc94fa5"). InnerVolumeSpecName "kube-api-access-hfnxf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:13:35.024634 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.024607 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-config-volume" (OuterVolumeSpecName: "config-volume") pod "36564dfe-56d0-4531-9269-dd5a9bc94fa5" (UID: "36564dfe-56d0-4531-9269-dd5a9bc94fa5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:13:35.025079 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.025063 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "36564dfe-56d0-4531-9269-dd5a9bc94fa5" (UID: "36564dfe-56d0-4531-9269-dd5a9bc94fa5"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:13:35.027203 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.027177 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "36564dfe-56d0-4531-9269-dd5a9bc94fa5" (UID: "36564dfe-56d0-4531-9269-dd5a9bc94fa5"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:13:35.033584 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.033566 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-web-config" (OuterVolumeSpecName: "web-config") pod "36564dfe-56d0-4531-9269-dd5a9bc94fa5" (UID: "36564dfe-56d0-4531-9269-dd5a9bc94fa5"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:13:35.121573 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.121552 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:13:35.121573 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.121572 2576 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-config-volume\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:13:35.121685 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.121581 2576 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/36564dfe-56d0-4531-9269-dd5a9bc94fa5-config-out\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:13:35.121685 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.121590 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-secret-alertmanager-main-tls\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:13:35.121685 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.121599 2576 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/36564dfe-56d0-4531-9269-dd5a9bc94fa5-alertmanager-main-db\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:13:35.121685 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.121609 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-secret-alertmanager-kube-rbac-proxy-metric\") on node 
\"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:13:35.121685 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.121618 2576 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/36564dfe-56d0-4531-9269-dd5a9bc94fa5-tls-assets\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:13:35.121685 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.121626 2576 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-web-config\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:13:35.121685 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.121635 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:13:35.121685 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.121643 2576 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/36564dfe-56d0-4531-9269-dd5a9bc94fa5-cluster-tls-config\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:13:35.121685 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.121652 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hfnxf\" (UniqueName: \"kubernetes.io/projected/36564dfe-56d0-4531-9269-dd5a9bc94fa5-kube-api-access-hfnxf\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:13:35.280073 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.280050 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:13:35.284391 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.284371 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:13:35.307243 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.307212 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:13:35.307542 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.307528 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36564dfe-56d0-4531-9269-dd5a9bc94fa5" containerName="config-reloader" Apr 16 18:13:35.307584 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.307544 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="36564dfe-56d0-4531-9269-dd5a9bc94fa5" containerName="config-reloader" Apr 16 18:13:35.307584 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.307563 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36564dfe-56d0-4531-9269-dd5a9bc94fa5" containerName="init-config-reloader" Apr 16 18:13:35.307584 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.307572 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="36564dfe-56d0-4531-9269-dd5a9bc94fa5" containerName="init-config-reloader" Apr 16 18:13:35.307584 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.307583 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36564dfe-56d0-4531-9269-dd5a9bc94fa5" containerName="kube-rbac-proxy" Apr 16 18:13:35.307744 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.307589 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="36564dfe-56d0-4531-9269-dd5a9bc94fa5" containerName="kube-rbac-proxy" Apr 16 18:13:35.307744 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.307600 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36564dfe-56d0-4531-9269-dd5a9bc94fa5" containerName="prom-label-proxy" Apr 16 18:13:35.307744 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.307605 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="36564dfe-56d0-4531-9269-dd5a9bc94fa5" 
containerName="prom-label-proxy" Apr 16 18:13:35.307744 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.307611 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de4c4df9-8036-4492-9274-9f47dc6c8180" containerName="registry" Apr 16 18:13:35.307744 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.307616 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="de4c4df9-8036-4492-9274-9f47dc6c8180" containerName="registry" Apr 16 18:13:35.307744 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.307623 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36564dfe-56d0-4531-9269-dd5a9bc94fa5" containerName="alertmanager" Apr 16 18:13:35.307744 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.307628 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="36564dfe-56d0-4531-9269-dd5a9bc94fa5" containerName="alertmanager" Apr 16 18:13:35.307744 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.307634 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36564dfe-56d0-4531-9269-dd5a9bc94fa5" containerName="kube-rbac-proxy-web" Apr 16 18:13:35.307744 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.307640 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="36564dfe-56d0-4531-9269-dd5a9bc94fa5" containerName="kube-rbac-proxy-web" Apr 16 18:13:35.307744 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.307653 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36564dfe-56d0-4531-9269-dd5a9bc94fa5" containerName="kube-rbac-proxy-metric" Apr 16 18:13:35.307744 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.307660 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="36564dfe-56d0-4531-9269-dd5a9bc94fa5" containerName="kube-rbac-proxy-metric" Apr 16 18:13:35.307744 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.307716 2576 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="de4c4df9-8036-4492-9274-9f47dc6c8180" containerName="registry" Apr 16 18:13:35.307744 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.307725 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="36564dfe-56d0-4531-9269-dd5a9bc94fa5" containerName="prom-label-proxy" Apr 16 18:13:35.307744 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.307734 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="36564dfe-56d0-4531-9269-dd5a9bc94fa5" containerName="alertmanager" Apr 16 18:13:35.307744 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.307744 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="36564dfe-56d0-4531-9269-dd5a9bc94fa5" containerName="kube-rbac-proxy-web" Apr 16 18:13:35.307744 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.307751 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="36564dfe-56d0-4531-9269-dd5a9bc94fa5" containerName="config-reloader" Apr 16 18:13:35.308244 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.307757 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="36564dfe-56d0-4531-9269-dd5a9bc94fa5" containerName="kube-rbac-proxy" Apr 16 18:13:35.308244 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.307763 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="36564dfe-56d0-4531-9269-dd5a9bc94fa5" containerName="kube-rbac-proxy-metric" Apr 16 18:13:35.314486 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.314466 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:13:35.317148 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.317128 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 18:13:35.317246 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.317156 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 18:13:35.317246 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.317132 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 18:13:35.317493 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.317476 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 18:13:35.317562 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.317547 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 18:13:35.318538 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.318065 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 18:13:35.319478 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.319455 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-w2b9x\"" Apr 16 18:13:35.319599 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.319578 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 18:13:35.319947 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.319922 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 18:13:35.328288 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.328267 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 18:13:35.331799 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.331766 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36564dfe-56d0-4531-9269-dd5a9bc94fa5" path="/var/lib/kubelet/pods/36564dfe-56d0-4531-9269-dd5a9bc94fa5/volumes" Apr 16 18:13:35.332305 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.332287 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:13:35.424763 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.424723 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/a6188869-8c94-4bfd-8639-66d342e6af7d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"a6188869-8c94-4bfd-8639-66d342e6af7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:13:35.424763 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.424774 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a6188869-8c94-4bfd-8639-66d342e6af7d-config-out\") pod \"alertmanager-main-0\" (UID: \"a6188869-8c94-4bfd-8639-66d342e6af7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:13:35.424936 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.424790 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a6188869-8c94-4bfd-8639-66d342e6af7d-web-config\") pod \"alertmanager-main-0\" (UID: \"a6188869-8c94-4bfd-8639-66d342e6af7d\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:13:35.424936 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.424810 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6hl2\" (UniqueName: \"kubernetes.io/projected/a6188869-8c94-4bfd-8639-66d342e6af7d-kube-api-access-l6hl2\") pod \"alertmanager-main-0\" (UID: \"a6188869-8c94-4bfd-8639-66d342e6af7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:13:35.424936 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.424827 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a6188869-8c94-4bfd-8639-66d342e6af7d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"a6188869-8c94-4bfd-8639-66d342e6af7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:13:35.424936 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.424844 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6188869-8c94-4bfd-8639-66d342e6af7d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"a6188869-8c94-4bfd-8639-66d342e6af7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:13:35.424936 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.424861 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a6188869-8c94-4bfd-8639-66d342e6af7d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"a6188869-8c94-4bfd-8639-66d342e6af7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:13:35.424936 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.424881 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/a6188869-8c94-4bfd-8639-66d342e6af7d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"a6188869-8c94-4bfd-8639-66d342e6af7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:13:35.424936 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.424897 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/a6188869-8c94-4bfd-8639-66d342e6af7d-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"a6188869-8c94-4bfd-8639-66d342e6af7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:13:35.424936 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.424917 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/a6188869-8c94-4bfd-8639-66d342e6af7d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"a6188869-8c94-4bfd-8639-66d342e6af7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:13:35.424936 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.424932 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a6188869-8c94-4bfd-8639-66d342e6af7d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"a6188869-8c94-4bfd-8639-66d342e6af7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:13:35.425224 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.424990 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a6188869-8c94-4bfd-8639-66d342e6af7d-config-volume\") pod \"alertmanager-main-0\" (UID: \"a6188869-8c94-4bfd-8639-66d342e6af7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:13:35.425224 ip-10-0-139-96 
kubenswrapper[2576]: I0416 18:13:35.425052 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a6188869-8c94-4bfd-8639-66d342e6af7d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"a6188869-8c94-4bfd-8639-66d342e6af7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:13:35.525856 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.525799 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/a6188869-8c94-4bfd-8639-66d342e6af7d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"a6188869-8c94-4bfd-8639-66d342e6af7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:13:35.525856 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.525832 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a6188869-8c94-4bfd-8639-66d342e6af7d-config-out\") pod \"alertmanager-main-0\" (UID: \"a6188869-8c94-4bfd-8639-66d342e6af7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:13:35.525856 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.525849 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a6188869-8c94-4bfd-8639-66d342e6af7d-web-config\") pod \"alertmanager-main-0\" (UID: \"a6188869-8c94-4bfd-8639-66d342e6af7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:13:35.526013 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.525912 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l6hl2\" (UniqueName: \"kubernetes.io/projected/a6188869-8c94-4bfd-8639-66d342e6af7d-kube-api-access-l6hl2\") pod \"alertmanager-main-0\" (UID: 
\"a6188869-8c94-4bfd-8639-66d342e6af7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:13:35.526013 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.525948 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a6188869-8c94-4bfd-8639-66d342e6af7d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"a6188869-8c94-4bfd-8639-66d342e6af7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:13:35.526013 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.525981 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6188869-8c94-4bfd-8639-66d342e6af7d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"a6188869-8c94-4bfd-8639-66d342e6af7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:13:35.526120 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.526012 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a6188869-8c94-4bfd-8639-66d342e6af7d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"a6188869-8c94-4bfd-8639-66d342e6af7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:13:35.526120 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.526040 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/a6188869-8c94-4bfd-8639-66d342e6af7d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"a6188869-8c94-4bfd-8639-66d342e6af7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:13:35.526120 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.526066 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/a6188869-8c94-4bfd-8639-66d342e6af7d-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"a6188869-8c94-4bfd-8639-66d342e6af7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:13:35.526120 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.526104 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/a6188869-8c94-4bfd-8639-66d342e6af7d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"a6188869-8c94-4bfd-8639-66d342e6af7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:13:35.526742 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.526689 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/a6188869-8c94-4bfd-8639-66d342e6af7d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"a6188869-8c94-4bfd-8639-66d342e6af7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:13:35.527062 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.527038 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6188869-8c94-4bfd-8639-66d342e6af7d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"a6188869-8c94-4bfd-8639-66d342e6af7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:13:35.527318 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.527295 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a6188869-8c94-4bfd-8639-66d342e6af7d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"a6188869-8c94-4bfd-8639-66d342e6af7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:13:35.527470 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.527453 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a6188869-8c94-4bfd-8639-66d342e6af7d-config-volume\") pod \"alertmanager-main-0\" (UID: \"a6188869-8c94-4bfd-8639-66d342e6af7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:13:35.527598 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.527582 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a6188869-8c94-4bfd-8639-66d342e6af7d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"a6188869-8c94-4bfd-8639-66d342e6af7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:13:35.528591 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.528560 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a6188869-8c94-4bfd-8639-66d342e6af7d-config-out\") pod \"alertmanager-main-0\" (UID: \"a6188869-8c94-4bfd-8639-66d342e6af7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:13:35.528716 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.527327 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a6188869-8c94-4bfd-8639-66d342e6af7d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"a6188869-8c94-4bfd-8639-66d342e6af7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:13:35.528951 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.528927 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/a6188869-8c94-4bfd-8639-66d342e6af7d-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"a6188869-8c94-4bfd-8639-66d342e6af7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:13:35.529026 ip-10-0-139-96 kubenswrapper[2576]: I0416 
18:13:35.528935 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/a6188869-8c94-4bfd-8639-66d342e6af7d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"a6188869-8c94-4bfd-8639-66d342e6af7d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:13:35.529086 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.529049 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a6188869-8c94-4bfd-8639-66d342e6af7d-web-config\") pod \"alertmanager-main-0\" (UID: \"a6188869-8c94-4bfd-8639-66d342e6af7d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:13:35.529324 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.529301 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a6188869-8c94-4bfd-8639-66d342e6af7d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"a6188869-8c94-4bfd-8639-66d342e6af7d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:13:35.529378 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.529341 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/a6188869-8c94-4bfd-8639-66d342e6af7d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"a6188869-8c94-4bfd-8639-66d342e6af7d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:13:35.529757 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.529736 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a6188869-8c94-4bfd-8639-66d342e6af7d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"a6188869-8c94-4bfd-8639-66d342e6af7d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:13:35.530086 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.530071 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a6188869-8c94-4bfd-8639-66d342e6af7d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"a6188869-8c94-4bfd-8639-66d342e6af7d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:13:35.530499 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.530475 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a6188869-8c94-4bfd-8639-66d342e6af7d-config-volume\") pod \"alertmanager-main-0\" (UID: \"a6188869-8c94-4bfd-8639-66d342e6af7d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:13:35.533669 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.533648 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6hl2\" (UniqueName: \"kubernetes.io/projected/a6188869-8c94-4bfd-8639-66d342e6af7d-kube-api-access-l6hl2\") pod \"alertmanager-main-0\" (UID: \"a6188869-8c94-4bfd-8639-66d342e6af7d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:13:35.628872 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.628835 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:13:35.753618 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.753587 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 18:13:35.757011 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:13:35.756982 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6188869_8c94_4bfd_8639_66d342e6af7d.slice/crio-b74d5b466e4928234f4706d06803a85447270f221fd3d2d953b150221736e185 WatchSource:0}: Error finding container b74d5b466e4928234f4706d06803a85447270f221fd3d2d953b150221736e185: Status 404 returned error can't find the container with id b74d5b466e4928234f4706d06803a85447270f221fd3d2d953b150221736e185
Apr 16 18:13:35.961349 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.961310 2576 generic.go:358] "Generic (PLEG): container finished" podID="a6188869-8c94-4bfd-8639-66d342e6af7d" containerID="85b135bbe3a0e0a967411e847a01001d3cd22664511b74ebbcb77d91f781e869" exitCode=0
Apr 16 18:13:35.961752 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.961400 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a6188869-8c94-4bfd-8639-66d342e6af7d","Type":"ContainerDied","Data":"85b135bbe3a0e0a967411e847a01001d3cd22664511b74ebbcb77d91f781e869"}
Apr 16 18:13:35.961752 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:35.961440 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a6188869-8c94-4bfd-8639-66d342e6af7d","Type":"ContainerStarted","Data":"b74d5b466e4928234f4706d06803a85447270f221fd3d2d953b150221736e185"}
Apr 16 18:13:36.967890 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:36.967813 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a6188869-8c94-4bfd-8639-66d342e6af7d","Type":"ContainerStarted","Data":"8e4185500bcb06ac649a1770e4a83a4a90f8bca11b5c0befb8e61c5422ee790a"}
Apr 16 18:13:36.967890 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:36.967846 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a6188869-8c94-4bfd-8639-66d342e6af7d","Type":"ContainerStarted","Data":"29caec14a23f74e367b5b296438b19aa2df2278c16899ae457b8a0bbcc61404d"}
Apr 16 18:13:36.967890 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:36.967856 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a6188869-8c94-4bfd-8639-66d342e6af7d","Type":"ContainerStarted","Data":"66e21d46bfde8854cff8ee310caca92f263a326ab55a685fd7814de80687ca75"}
Apr 16 18:13:36.967890 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:36.967865 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a6188869-8c94-4bfd-8639-66d342e6af7d","Type":"ContainerStarted","Data":"f9689313d7a57b8bd224a161ea9e13ac1ccb79c2be5a58fc940e2fe84a9afbb3"}
Apr 16 18:13:36.967890 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:36.967874 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a6188869-8c94-4bfd-8639-66d342e6af7d","Type":"ContainerStarted","Data":"6c9655204ae17b1c66d4081a260f2a8882932e427c58b646ddd202b4a67a978b"}
Apr 16 18:13:36.967890 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:36.967882 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a6188869-8c94-4bfd-8639-66d342e6af7d","Type":"ContainerStarted","Data":"0f658710ce1c91ed67750563e11f915b688c67deaffbd5259f508cf3e38e1773"}
Apr 16 18:13:36.994146 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:36.994107 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.994093029 podStartE2EDuration="1.994093029s" podCreationTimestamp="2026-04-16 18:13:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:13:36.992766377 +0000 UTC m=+176.194341644" watchObservedRunningTime="2026-04-16 18:13:36.994093029 +0000 UTC m=+176.195668297"
Apr 16 18:13:37.701455 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.701419 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-77cd4447c-rxvpr"]
Apr 16 18:13:37.705079 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.705054 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-77cd4447c-rxvpr"
Apr 16 18:13:37.707923 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.707896 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-c6qhb\""
Apr 16 18:13:37.708178 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.708134 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 16 18:13:37.708251 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.708203 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 16 18:13:37.708300 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.708282 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 16 18:13:37.708376 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.708284 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 16 18:13:37.708491 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.708425 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 16 18:13:37.713875 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.713851 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 16 18:13:37.716662 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.716637 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-77cd4447c-rxvpr"]
Apr 16 18:13:37.847237 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.847204 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/42657ce0-b347-4e89-84f7-5766710baf5f-secret-telemeter-client\") pod \"telemeter-client-77cd4447c-rxvpr\" (UID: \"42657ce0-b347-4e89-84f7-5766710baf5f\") " pod="openshift-monitoring/telemeter-client-77cd4447c-rxvpr"
Apr 16 18:13:37.847237 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.847238 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b44gz\" (UniqueName: \"kubernetes.io/projected/42657ce0-b347-4e89-84f7-5766710baf5f-kube-api-access-b44gz\") pod \"telemeter-client-77cd4447c-rxvpr\" (UID: \"42657ce0-b347-4e89-84f7-5766710baf5f\") " pod="openshift-monitoring/telemeter-client-77cd4447c-rxvpr"
Apr 16 18:13:37.847424 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.847267 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/42657ce0-b347-4e89-84f7-5766710baf5f-telemeter-client-tls\") pod \"telemeter-client-77cd4447c-rxvpr\" (UID: \"42657ce0-b347-4e89-84f7-5766710baf5f\") " pod="openshift-monitoring/telemeter-client-77cd4447c-rxvpr"
Apr 16 18:13:37.847424 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.847329 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/42657ce0-b347-4e89-84f7-5766710baf5f-federate-client-tls\") pod \"telemeter-client-77cd4447c-rxvpr\" (UID: \"42657ce0-b347-4e89-84f7-5766710baf5f\") " pod="openshift-monitoring/telemeter-client-77cd4447c-rxvpr"
Apr 16 18:13:37.847424 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.847370 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/42657ce0-b347-4e89-84f7-5766710baf5f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-77cd4447c-rxvpr\" (UID: \"42657ce0-b347-4e89-84f7-5766710baf5f\") " pod="openshift-monitoring/telemeter-client-77cd4447c-rxvpr"
Apr 16 18:13:37.847424 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.847397 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/42657ce0-b347-4e89-84f7-5766710baf5f-metrics-client-ca\") pod \"telemeter-client-77cd4447c-rxvpr\" (UID: \"42657ce0-b347-4e89-84f7-5766710baf5f\") " pod="openshift-monitoring/telemeter-client-77cd4447c-rxvpr"
Apr 16 18:13:37.847557 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.847457 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42657ce0-b347-4e89-84f7-5766710baf5f-serving-certs-ca-bundle\") pod \"telemeter-client-77cd4447c-rxvpr\" (UID: \"42657ce0-b347-4e89-84f7-5766710baf5f\") " pod="openshift-monitoring/telemeter-client-77cd4447c-rxvpr"
Apr 16 18:13:37.847557 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.847478 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42657ce0-b347-4e89-84f7-5766710baf5f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-77cd4447c-rxvpr\" (UID: \"42657ce0-b347-4e89-84f7-5766710baf5f\") " pod="openshift-monitoring/telemeter-client-77cd4447c-rxvpr"
Apr 16 18:13:37.948012 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.947965 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42657ce0-b347-4e89-84f7-5766710baf5f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-77cd4447c-rxvpr\" (UID: \"42657ce0-b347-4e89-84f7-5766710baf5f\") " pod="openshift-monitoring/telemeter-client-77cd4447c-rxvpr"
Apr 16 18:13:37.948012 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.948016 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/42657ce0-b347-4e89-84f7-5766710baf5f-secret-telemeter-client\") pod \"telemeter-client-77cd4447c-rxvpr\" (UID: \"42657ce0-b347-4e89-84f7-5766710baf5f\") " pod="openshift-monitoring/telemeter-client-77cd4447c-rxvpr"
Apr 16 18:13:37.948237 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.948041 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b44gz\" (UniqueName: \"kubernetes.io/projected/42657ce0-b347-4e89-84f7-5766710baf5f-kube-api-access-b44gz\") pod \"telemeter-client-77cd4447c-rxvpr\" (UID: \"42657ce0-b347-4e89-84f7-5766710baf5f\") " pod="openshift-monitoring/telemeter-client-77cd4447c-rxvpr"
Apr 16 18:13:37.948237 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.948102 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/42657ce0-b347-4e89-84f7-5766710baf5f-telemeter-client-tls\") pod \"telemeter-client-77cd4447c-rxvpr\" (UID: \"42657ce0-b347-4e89-84f7-5766710baf5f\") " pod="openshift-monitoring/telemeter-client-77cd4447c-rxvpr"
Apr 16 18:13:37.948237 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.948132 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/42657ce0-b347-4e89-84f7-5766710baf5f-federate-client-tls\") pod \"telemeter-client-77cd4447c-rxvpr\" (UID: \"42657ce0-b347-4e89-84f7-5766710baf5f\") " pod="openshift-monitoring/telemeter-client-77cd4447c-rxvpr"
Apr 16 18:13:37.948237 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.948167 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/42657ce0-b347-4e89-84f7-5766710baf5f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-77cd4447c-rxvpr\" (UID: \"42657ce0-b347-4e89-84f7-5766710baf5f\") " pod="openshift-monitoring/telemeter-client-77cd4447c-rxvpr"
Apr 16 18:13:37.948237 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.948207 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/42657ce0-b347-4e89-84f7-5766710baf5f-metrics-client-ca\") pod \"telemeter-client-77cd4447c-rxvpr\" (UID: \"42657ce0-b347-4e89-84f7-5766710baf5f\") " pod="openshift-monitoring/telemeter-client-77cd4447c-rxvpr"
Apr 16 18:13:37.948465 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.948282 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42657ce0-b347-4e89-84f7-5766710baf5f-serving-certs-ca-bundle\") pod \"telemeter-client-77cd4447c-rxvpr\" (UID: \"42657ce0-b347-4e89-84f7-5766710baf5f\") " pod="openshift-monitoring/telemeter-client-77cd4447c-rxvpr"
Apr 16 18:13:37.949181 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.949148 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42657ce0-b347-4e89-84f7-5766710baf5f-serving-certs-ca-bundle\") pod \"telemeter-client-77cd4447c-rxvpr\" (UID: \"42657ce0-b347-4e89-84f7-5766710baf5f\") " pod="openshift-monitoring/telemeter-client-77cd4447c-rxvpr"
Apr 16 18:13:37.949383 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.949287 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42657ce0-b347-4e89-84f7-5766710baf5f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-77cd4447c-rxvpr\" (UID: \"42657ce0-b347-4e89-84f7-5766710baf5f\") " pod="openshift-monitoring/telemeter-client-77cd4447c-rxvpr"
Apr 16 18:13:37.949885 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.949861 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 18:13:37.950167 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.950141 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/42657ce0-b347-4e89-84f7-5766710baf5f-metrics-client-ca\") pod \"telemeter-client-77cd4447c-rxvpr\" (UID: \"42657ce0-b347-4e89-84f7-5766710baf5f\") " pod="openshift-monitoring/telemeter-client-77cd4447c-rxvpr"
Apr 16 18:13:37.950810 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.950782 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="b252c920-4abd-4d7c-bd5e-d74dcbcd643a" containerName="prometheus" containerID="cri-o://5b3e5375ac3f006e74a4f228fee8e40dec23853a3b8b1d90b8f9e685eaa8f49a" gracePeriod=600
Apr 16 18:13:37.951249 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.950941 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="b252c920-4abd-4d7c-bd5e-d74dcbcd643a" containerName="kube-rbac-proxy-thanos" containerID="cri-o://fa16f3c670dba4369cd88186858c5b0d2e98ed9c73b5e9cd05fbffa3af8efc83" gracePeriod=600
Apr 16 18:13:37.951424 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.951046 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="b252c920-4abd-4d7c-bd5e-d74dcbcd643a" containerName="kube-rbac-proxy-web" containerID="cri-o://10185443b88eb998125800b778892279ba793508a2ee767bf5b26ced2072c262" gracePeriod=600
Apr 16 18:13:37.951570 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.951073 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="b252c920-4abd-4d7c-bd5e-d74dcbcd643a" containerName="kube-rbac-proxy" containerID="cri-o://ed184ce2f059dda12848bd8dabf54a96c498b4d984bbb6de6e0c7896ce6e751c" gracePeriod=600
Apr 16 18:13:37.951570 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.951111 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="b252c920-4abd-4d7c-bd5e-d74dcbcd643a" containerName="thanos-sidecar" containerID="cri-o://2a85df2c06617580395339185bdbb016052a1e1569f79857cbff0ae3c8cc1054" gracePeriod=600
Apr 16 18:13:37.951570 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.951136 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="b252c920-4abd-4d7c-bd5e-d74dcbcd643a" containerName="config-reloader" containerID="cri-o://cf5369ba5d307d3665894b9ef4c9e9d336a9e36233b3e29ca9ac4757a315b6b1" gracePeriod=600
Apr 16 18:13:37.952740 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.952681 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/42657ce0-b347-4e89-84f7-5766710baf5f-federate-client-tls\") pod \"telemeter-client-77cd4447c-rxvpr\" (UID: \"42657ce0-b347-4e89-84f7-5766710baf5f\") " pod="openshift-monitoring/telemeter-client-77cd4447c-rxvpr"
Apr 16 18:13:37.954326 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.954302 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/42657ce0-b347-4e89-84f7-5766710baf5f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-77cd4447c-rxvpr\" (UID: \"42657ce0-b347-4e89-84f7-5766710baf5f\") " pod="openshift-monitoring/telemeter-client-77cd4447c-rxvpr"
Apr 16 18:13:37.959618 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.958616 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/42657ce0-b347-4e89-84f7-5766710baf5f-secret-telemeter-client\") pod \"telemeter-client-77cd4447c-rxvpr\" (UID: \"42657ce0-b347-4e89-84f7-5766710baf5f\") " pod="openshift-monitoring/telemeter-client-77cd4447c-rxvpr"
Apr 16 18:13:37.960992 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.960923 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/42657ce0-b347-4e89-84f7-5766710baf5f-telemeter-client-tls\") pod \"telemeter-client-77cd4447c-rxvpr\" (UID: \"42657ce0-b347-4e89-84f7-5766710baf5f\") " pod="openshift-monitoring/telemeter-client-77cd4447c-rxvpr"
Apr 16 18:13:37.966276 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:37.966213 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b44gz\" (UniqueName: \"kubernetes.io/projected/42657ce0-b347-4e89-84f7-5766710baf5f-kube-api-access-b44gz\") pod \"telemeter-client-77cd4447c-rxvpr\" (UID: \"42657ce0-b347-4e89-84f7-5766710baf5f\") " pod="openshift-monitoring/telemeter-client-77cd4447c-rxvpr"
Apr 16 18:13:38.017689 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.017657 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-77cd4447c-rxvpr"
Apr 16 18:13:38.160437 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.160411 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-77cd4447c-rxvpr"]
Apr 16 18:13:38.163445 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:13:38.163395 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42657ce0_b347_4e89_84f7_5766710baf5f.slice/crio-e6bd164c0043eda9ade9743c326c5080ee397012d4a0bb81828bd6fbad1a99e1 WatchSource:0}: Error finding container e6bd164c0043eda9ade9743c326c5080ee397012d4a0bb81828bd6fbad1a99e1: Status 404 returned error can't find the container with id e6bd164c0043eda9ade9743c326c5080ee397012d4a0bb81828bd6fbad1a99e1
Apr 16 18:13:38.228380 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.228356 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:13:38.351938 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.351911 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-prometheus-k8s-rulefiles-0\") pod \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") "
Apr 16 18:13:38.351938 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.351942 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7x28\" (UniqueName: \"kubernetes.io/projected/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-kube-api-access-d7x28\") pod \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") "
Apr 16 18:13:38.352150 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.351957 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-config\") pod \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") "
Apr 16 18:13:38.352150 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.351989 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-tls-assets\") pod \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") "
Apr 16 18:13:38.352150 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.352021 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-configmap-serving-certs-ca-bundle\") pod \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") "
Apr 16 18:13:38.352150 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.352051 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-secret-prometheus-k8s-tls\") pod \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") "
Apr 16 18:13:38.352150 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.352089 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-secret-metrics-client-certs\") pod \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") "
Apr 16 18:13:38.352150 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.352144 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-secret-kube-rbac-proxy\") pod \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") "
Apr 16 18:13:38.352424 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.352176 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-thanos-prometheus-http-client-file\") pod \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") "
Apr 16 18:13:38.352424 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.352202 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-prometheus-k8s-db\") pod \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") "
Apr 16 18:13:38.352424 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.352225 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-configmap-metrics-client-ca\") pod \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") "
Apr 16 18:13:38.352614 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.352577 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-config-out\") pod \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") "
Apr 16 18:13:38.352614 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.352525 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "b252c920-4abd-4d7c-bd5e-d74dcbcd643a" (UID: "b252c920-4abd-4d7c-bd5e-d74dcbcd643a"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:13:38.352753 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.352627 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") "
Apr 16 18:13:38.352753 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.352671 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-secret-grpc-tls\") pod \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") "
Apr 16 18:13:38.352753 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.352721 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-prometheus-trusted-ca-bundle\") pod \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") "
Apr 16 18:13:38.352899 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.352743 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "b252c920-4abd-4d7c-bd5e-d74dcbcd643a" (UID: "b252c920-4abd-4d7c-bd5e-d74dcbcd643a"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:13:38.352899 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.352758 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-configmap-kubelet-serving-ca-bundle\") pod \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") "
Apr 16 18:13:38.352899 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.352794 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") "
Apr 16 18:13:38.352899 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.352834 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-web-config\") pod \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\" (UID: \"b252c920-4abd-4d7c-bd5e-d74dcbcd643a\") "
Apr 16 18:13:38.353449 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.353100 2576 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\""
Apr 16 18:13:38.353449 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.353132 2576 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-configmap-metrics-client-ca\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\""
Apr 16 18:13:38.353881 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.353752 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "b252c920-4abd-4d7c-bd5e-d74dcbcd643a" (UID: "b252c920-4abd-4d7c-bd5e-d74dcbcd643a"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:13:38.353988 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.353878 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "b252c920-4abd-4d7c-bd5e-d74dcbcd643a" (UID: "b252c920-4abd-4d7c-bd5e-d74dcbcd643a"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:13:38.355513 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.355384 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "b252c920-4abd-4d7c-bd5e-d74dcbcd643a" (UID: "b252c920-4abd-4d7c-bd5e-d74dcbcd643a"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:13:38.355598 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.355583 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "b252c920-4abd-4d7c-bd5e-d74dcbcd643a" (UID: "b252c920-4abd-4d7c-bd5e-d74dcbcd643a"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:13:38.356180 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.356139 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-config" (OuterVolumeSpecName: "config") pod "b252c920-4abd-4d7c-bd5e-d74dcbcd643a" (UID: "b252c920-4abd-4d7c-bd5e-d74dcbcd643a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:13:38.356180 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.356167 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "b252c920-4abd-4d7c-bd5e-d74dcbcd643a" (UID: "b252c920-4abd-4d7c-bd5e-d74dcbcd643a"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:13:38.356341 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.356239 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-kube-api-access-d7x28" (OuterVolumeSpecName: "kube-api-access-d7x28") pod "b252c920-4abd-4d7c-bd5e-d74dcbcd643a" (UID: "b252c920-4abd-4d7c-bd5e-d74dcbcd643a"). InnerVolumeSpecName "kube-api-access-d7x28". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:13:38.356489 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.356460 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "b252c920-4abd-4d7c-bd5e-d74dcbcd643a" (UID: "b252c920-4abd-4d7c-bd5e-d74dcbcd643a"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:13:38.356639 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.356614 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "b252c920-4abd-4d7c-bd5e-d74dcbcd643a" (UID: "b252c920-4abd-4d7c-bd5e-d74dcbcd643a"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:13:38.356774 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.356753 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "b252c920-4abd-4d7c-bd5e-d74dcbcd643a" (UID: "b252c920-4abd-4d7c-bd5e-d74dcbcd643a"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:13:38.356833 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.356774 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "b252c920-4abd-4d7c-bd5e-d74dcbcd643a" (UID: "b252c920-4abd-4d7c-bd5e-d74dcbcd643a"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:13:38.356978 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.356951 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-config-out" (OuterVolumeSpecName: "config-out") pod "b252c920-4abd-4d7c-bd5e-d74dcbcd643a" (UID: "b252c920-4abd-4d7c-bd5e-d74dcbcd643a"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:13:38.357041 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.357008 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "b252c920-4abd-4d7c-bd5e-d74dcbcd643a" (UID: "b252c920-4abd-4d7c-bd5e-d74dcbcd643a"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:13:38.357572 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.357556 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "b252c920-4abd-4d7c-bd5e-d74dcbcd643a" (UID: "b252c920-4abd-4d7c-bd5e-d74dcbcd643a"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:13:38.358069 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.358050 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "b252c920-4abd-4d7c-bd5e-d74dcbcd643a" (UID: "b252c920-4abd-4d7c-bd5e-d74dcbcd643a"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:13:38.366338 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.366319 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-web-config" (OuterVolumeSpecName: "web-config") pod "b252c920-4abd-4d7c-bd5e-d74dcbcd643a" (UID: "b252c920-4abd-4d7c-bd5e-d74dcbcd643a"). InnerVolumeSpecName "web-config".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:13:38.454293 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.454211 2576 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:13:38.454293 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.454237 2576 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-web-config\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:13:38.454293 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.454254 2576 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:13:38.454293 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.454270 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d7x28\" (UniqueName: \"kubernetes.io/projected/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-kube-api-access-d7x28\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:13:38.454293 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.454295 2576 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-config\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:13:38.454548 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.454310 2576 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-tls-assets\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:13:38.454548 ip-10-0-139-96 
kubenswrapper[2576]: I0416 18:13:38.454324 2576 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-secret-prometheus-k8s-tls\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:13:38.454548 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.454340 2576 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-secret-metrics-client-certs\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:13:38.454548 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.454354 2576 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-secret-kube-rbac-proxy\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:13:38.454548 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.454369 2576 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-thanos-prometheus-http-client-file\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:13:38.454548 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.454381 2576 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-prometheus-k8s-db\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:13:38.454548 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.454390 2576 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-config-out\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:13:38.454548 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.454399 2576 
reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:13:38.454548 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.454408 2576 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-secret-grpc-tls\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:13:38.454548 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.454416 2576 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-prometheus-trusted-ca-bundle\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:13:38.454548 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.454426 2576 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b252c920-4abd-4d7c-bd5e-d74dcbcd643a-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:13:38.978123 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.978090 2576 generic.go:358] "Generic (PLEG): container finished" podID="b252c920-4abd-4d7c-bd5e-d74dcbcd643a" containerID="fa16f3c670dba4369cd88186858c5b0d2e98ed9c73b5e9cd05fbffa3af8efc83" exitCode=0 Apr 16 18:13:38.978123 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.978123 2576 generic.go:358] "Generic (PLEG): container finished" podID="b252c920-4abd-4d7c-bd5e-d74dcbcd643a" containerID="ed184ce2f059dda12848bd8dabf54a96c498b4d984bbb6de6e0c7896ce6e751c" exitCode=0 Apr 16 18:13:38.978319 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.978135 2576 generic.go:358] "Generic (PLEG): container finished" 
podID="b252c920-4abd-4d7c-bd5e-d74dcbcd643a" containerID="10185443b88eb998125800b778892279ba793508a2ee767bf5b26ced2072c262" exitCode=0 Apr 16 18:13:38.978319 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.978145 2576 generic.go:358] "Generic (PLEG): container finished" podID="b252c920-4abd-4d7c-bd5e-d74dcbcd643a" containerID="2a85df2c06617580395339185bdbb016052a1e1569f79857cbff0ae3c8cc1054" exitCode=0 Apr 16 18:13:38.978319 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.978153 2576 generic.go:358] "Generic (PLEG): container finished" podID="b252c920-4abd-4d7c-bd5e-d74dcbcd643a" containerID="cf5369ba5d307d3665894b9ef4c9e9d336a9e36233b3e29ca9ac4757a315b6b1" exitCode=0 Apr 16 18:13:38.978319 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.978162 2576 generic.go:358] "Generic (PLEG): container finished" podID="b252c920-4abd-4d7c-bd5e-d74dcbcd643a" containerID="5b3e5375ac3f006e74a4f228fee8e40dec23853a3b8b1d90b8f9e685eaa8f49a" exitCode=0 Apr 16 18:13:38.978319 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.978180 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b252c920-4abd-4d7c-bd5e-d74dcbcd643a","Type":"ContainerDied","Data":"fa16f3c670dba4369cd88186858c5b0d2e98ed9c73b5e9cd05fbffa3af8efc83"} Apr 16 18:13:38.978319 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.978204 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:38.978319 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.978224 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b252c920-4abd-4d7c-bd5e-d74dcbcd643a","Type":"ContainerDied","Data":"ed184ce2f059dda12848bd8dabf54a96c498b4d984bbb6de6e0c7896ce6e751c"} Apr 16 18:13:38.978319 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.978241 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b252c920-4abd-4d7c-bd5e-d74dcbcd643a","Type":"ContainerDied","Data":"10185443b88eb998125800b778892279ba793508a2ee767bf5b26ced2072c262"} Apr 16 18:13:38.978319 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.978254 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b252c920-4abd-4d7c-bd5e-d74dcbcd643a","Type":"ContainerDied","Data":"2a85df2c06617580395339185bdbb016052a1e1569f79857cbff0ae3c8cc1054"} Apr 16 18:13:38.978319 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.978268 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b252c920-4abd-4d7c-bd5e-d74dcbcd643a","Type":"ContainerDied","Data":"cf5369ba5d307d3665894b9ef4c9e9d336a9e36233b3e29ca9ac4757a315b6b1"} Apr 16 18:13:38.978319 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.978281 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b252c920-4abd-4d7c-bd5e-d74dcbcd643a","Type":"ContainerDied","Data":"5b3e5375ac3f006e74a4f228fee8e40dec23853a3b8b1d90b8f9e685eaa8f49a"} Apr 16 18:13:38.978319 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.978297 2576 scope.go:117] "RemoveContainer" containerID="fa16f3c670dba4369cd88186858c5b0d2e98ed9c73b5e9cd05fbffa3af8efc83" Apr 16 18:13:38.978319 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.978306 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b252c920-4abd-4d7c-bd5e-d74dcbcd643a","Type":"ContainerDied","Data":"17000dd6c1a3c9aee9ec804ba604b3c652aec03d16c7035b1a6d4b8fc406247f"} Apr 16 18:13:38.979730 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.979422 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-77cd4447c-rxvpr" event={"ID":"42657ce0-b347-4e89-84f7-5766710baf5f","Type":"ContainerStarted","Data":"e6bd164c0043eda9ade9743c326c5080ee397012d4a0bb81828bd6fbad1a99e1"} Apr 16 18:13:38.987667 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.987498 2576 scope.go:117] "RemoveContainer" containerID="ed184ce2f059dda12848bd8dabf54a96c498b4d984bbb6de6e0c7896ce6e751c" Apr 16 18:13:38.995483 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:38.995461 2576 scope.go:117] "RemoveContainer" containerID="10185443b88eb998125800b778892279ba793508a2ee767bf5b26ced2072c262" Apr 16 18:13:39.005110 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.005092 2576 scope.go:117] "RemoveContainer" containerID="2a85df2c06617580395339185bdbb016052a1e1569f79857cbff0ae3c8cc1054" Apr 16 18:13:39.010006 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.009984 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:13:39.014249 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.014221 2576 scope.go:117] "RemoveContainer" containerID="cf5369ba5d307d3665894b9ef4c9e9d336a9e36233b3e29ca9ac4757a315b6b1" Apr 16 18:13:39.014954 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.014919 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:13:39.022310 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.022290 2576 scope.go:117] "RemoveContainer" containerID="5b3e5375ac3f006e74a4f228fee8e40dec23853a3b8b1d90b8f9e685eaa8f49a" Apr 16 18:13:39.029897 ip-10-0-139-96 kubenswrapper[2576]: 
I0416 18:13:39.029860 2576 scope.go:117] "RemoveContainer" containerID="dc0d3ea3f9ba8ca43960cc2c298db25726d4144cc930fb12accc66833d21b035" Apr 16 18:13:39.036753 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.036733 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:13:39.037168 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.037149 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b252c920-4abd-4d7c-bd5e-d74dcbcd643a" containerName="kube-rbac-proxy-web" Apr 16 18:13:39.037168 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.037170 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b252c920-4abd-4d7c-bd5e-d74dcbcd643a" containerName="kube-rbac-proxy-web" Apr 16 18:13:39.037311 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.037183 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b252c920-4abd-4d7c-bd5e-d74dcbcd643a" containerName="kube-rbac-proxy-thanos" Apr 16 18:13:39.037311 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.037191 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b252c920-4abd-4d7c-bd5e-d74dcbcd643a" containerName="kube-rbac-proxy-thanos" Apr 16 18:13:39.037311 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.037207 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b252c920-4abd-4d7c-bd5e-d74dcbcd643a" containerName="kube-rbac-proxy" Apr 16 18:13:39.037311 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.037216 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b252c920-4abd-4d7c-bd5e-d74dcbcd643a" containerName="kube-rbac-proxy" Apr 16 18:13:39.037311 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.037228 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b252c920-4abd-4d7c-bd5e-d74dcbcd643a" containerName="init-config-reloader" Apr 16 18:13:39.037311 ip-10-0-139-96 kubenswrapper[2576]: I0416 
18:13:39.037237 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b252c920-4abd-4d7c-bd5e-d74dcbcd643a" containerName="init-config-reloader" Apr 16 18:13:39.037311 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.037246 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b252c920-4abd-4d7c-bd5e-d74dcbcd643a" containerName="config-reloader" Apr 16 18:13:39.037311 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.037255 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b252c920-4abd-4d7c-bd5e-d74dcbcd643a" containerName="config-reloader" Apr 16 18:13:39.037311 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.037268 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b252c920-4abd-4d7c-bd5e-d74dcbcd643a" containerName="prometheus" Apr 16 18:13:39.037311 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.037276 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b252c920-4abd-4d7c-bd5e-d74dcbcd643a" containerName="prometheus" Apr 16 18:13:39.037311 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.037290 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b252c920-4abd-4d7c-bd5e-d74dcbcd643a" containerName="thanos-sidecar" Apr 16 18:13:39.037311 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.037298 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b252c920-4abd-4d7c-bd5e-d74dcbcd643a" containerName="thanos-sidecar" Apr 16 18:13:39.038176 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.037379 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b252c920-4abd-4d7c-bd5e-d74dcbcd643a" containerName="kube-rbac-proxy-web" Apr 16 18:13:39.038176 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.037394 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b252c920-4abd-4d7c-bd5e-d74dcbcd643a" containerName="config-reloader" Apr 16 18:13:39.038176 ip-10-0-139-96 kubenswrapper[2576]: I0416 
18:13:39.037406 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b252c920-4abd-4d7c-bd5e-d74dcbcd643a" containerName="kube-rbac-proxy-thanos" Apr 16 18:13:39.038176 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.037418 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b252c920-4abd-4d7c-bd5e-d74dcbcd643a" containerName="thanos-sidecar" Apr 16 18:13:39.038176 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.037428 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b252c920-4abd-4d7c-bd5e-d74dcbcd643a" containerName="prometheus" Apr 16 18:13:39.038176 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.037437 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b252c920-4abd-4d7c-bd5e-d74dcbcd643a" containerName="kube-rbac-proxy" Apr 16 18:13:39.038176 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.037654 2576 scope.go:117] "RemoveContainer" containerID="fa16f3c670dba4369cd88186858c5b0d2e98ed9c73b5e9cd05fbffa3af8efc83" Apr 16 18:13:39.038176 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:13:39.037993 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa16f3c670dba4369cd88186858c5b0d2e98ed9c73b5e9cd05fbffa3af8efc83\": container with ID starting with fa16f3c670dba4369cd88186858c5b0d2e98ed9c73b5e9cd05fbffa3af8efc83 not found: ID does not exist" containerID="fa16f3c670dba4369cd88186858c5b0d2e98ed9c73b5e9cd05fbffa3af8efc83" Apr 16 18:13:39.038176 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.038026 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa16f3c670dba4369cd88186858c5b0d2e98ed9c73b5e9cd05fbffa3af8efc83"} err="failed to get container status \"fa16f3c670dba4369cd88186858c5b0d2e98ed9c73b5e9cd05fbffa3af8efc83\": rpc error: code = NotFound desc = could not find container \"fa16f3c670dba4369cd88186858c5b0d2e98ed9c73b5e9cd05fbffa3af8efc83\": container with 
ID starting with fa16f3c670dba4369cd88186858c5b0d2e98ed9c73b5e9cd05fbffa3af8efc83 not found: ID does not exist" Apr 16 18:13:39.038176 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.038050 2576 scope.go:117] "RemoveContainer" containerID="ed184ce2f059dda12848bd8dabf54a96c498b4d984bbb6de6e0c7896ce6e751c" Apr 16 18:13:39.038867 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:13:39.038839 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed184ce2f059dda12848bd8dabf54a96c498b4d984bbb6de6e0c7896ce6e751c\": container with ID starting with ed184ce2f059dda12848bd8dabf54a96c498b4d984bbb6de6e0c7896ce6e751c not found: ID does not exist" containerID="ed184ce2f059dda12848bd8dabf54a96c498b4d984bbb6de6e0c7896ce6e751c" Apr 16 18:13:39.038974 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.038892 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed184ce2f059dda12848bd8dabf54a96c498b4d984bbb6de6e0c7896ce6e751c"} err="failed to get container status \"ed184ce2f059dda12848bd8dabf54a96c498b4d984bbb6de6e0c7896ce6e751c\": rpc error: code = NotFound desc = could not find container \"ed184ce2f059dda12848bd8dabf54a96c498b4d984bbb6de6e0c7896ce6e751c\": container with ID starting with ed184ce2f059dda12848bd8dabf54a96c498b4d984bbb6de6e0c7896ce6e751c not found: ID does not exist" Apr 16 18:13:39.038974 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.038917 2576 scope.go:117] "RemoveContainer" containerID="10185443b88eb998125800b778892279ba793508a2ee767bf5b26ced2072c262" Apr 16 18:13:39.040723 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:13:39.040404 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10185443b88eb998125800b778892279ba793508a2ee767bf5b26ced2072c262\": container with ID starting with 10185443b88eb998125800b778892279ba793508a2ee767bf5b26ced2072c262 not found: ID does not 
exist" containerID="10185443b88eb998125800b778892279ba793508a2ee767bf5b26ced2072c262" Apr 16 18:13:39.040723 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.040444 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10185443b88eb998125800b778892279ba793508a2ee767bf5b26ced2072c262"} err="failed to get container status \"10185443b88eb998125800b778892279ba793508a2ee767bf5b26ced2072c262\": rpc error: code = NotFound desc = could not find container \"10185443b88eb998125800b778892279ba793508a2ee767bf5b26ced2072c262\": container with ID starting with 10185443b88eb998125800b778892279ba793508a2ee767bf5b26ced2072c262 not found: ID does not exist" Apr 16 18:13:39.040723 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.040468 2576 scope.go:117] "RemoveContainer" containerID="2a85df2c06617580395339185bdbb016052a1e1569f79857cbff0ae3c8cc1054" Apr 16 18:13:39.040911 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:13:39.040786 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a85df2c06617580395339185bdbb016052a1e1569f79857cbff0ae3c8cc1054\": container with ID starting with 2a85df2c06617580395339185bdbb016052a1e1569f79857cbff0ae3c8cc1054 not found: ID does not exist" containerID="2a85df2c06617580395339185bdbb016052a1e1569f79857cbff0ae3c8cc1054" Apr 16 18:13:39.040911 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.040827 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a85df2c06617580395339185bdbb016052a1e1569f79857cbff0ae3c8cc1054"} err="failed to get container status \"2a85df2c06617580395339185bdbb016052a1e1569f79857cbff0ae3c8cc1054\": rpc error: code = NotFound desc = could not find container \"2a85df2c06617580395339185bdbb016052a1e1569f79857cbff0ae3c8cc1054\": container with ID starting with 2a85df2c06617580395339185bdbb016052a1e1569f79857cbff0ae3c8cc1054 not found: ID does not exist" Apr 16 
18:13:39.040911 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.040850 2576 scope.go:117] "RemoveContainer" containerID="cf5369ba5d307d3665894b9ef4c9e9d336a9e36233b3e29ca9ac4757a315b6b1" Apr 16 18:13:39.044196 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:13:39.041137 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf5369ba5d307d3665894b9ef4c9e9d336a9e36233b3e29ca9ac4757a315b6b1\": container with ID starting with cf5369ba5d307d3665894b9ef4c9e9d336a9e36233b3e29ca9ac4757a315b6b1 not found: ID does not exist" containerID="cf5369ba5d307d3665894b9ef4c9e9d336a9e36233b3e29ca9ac4757a315b6b1" Apr 16 18:13:39.044196 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.041173 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf5369ba5d307d3665894b9ef4c9e9d336a9e36233b3e29ca9ac4757a315b6b1"} err="failed to get container status \"cf5369ba5d307d3665894b9ef4c9e9d336a9e36233b3e29ca9ac4757a315b6b1\": rpc error: code = NotFound desc = could not find container \"cf5369ba5d307d3665894b9ef4c9e9d336a9e36233b3e29ca9ac4757a315b6b1\": container with ID starting with cf5369ba5d307d3665894b9ef4c9e9d336a9e36233b3e29ca9ac4757a315b6b1 not found: ID does not exist" Apr 16 18:13:39.044196 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.041194 2576 scope.go:117] "RemoveContainer" containerID="5b3e5375ac3f006e74a4f228fee8e40dec23853a3b8b1d90b8f9e685eaa8f49a" Apr 16 18:13:39.044196 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:13:39.041592 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b3e5375ac3f006e74a4f228fee8e40dec23853a3b8b1d90b8f9e685eaa8f49a\": container with ID starting with 5b3e5375ac3f006e74a4f228fee8e40dec23853a3b8b1d90b8f9e685eaa8f49a not found: ID does not exist" containerID="5b3e5375ac3f006e74a4f228fee8e40dec23853a3b8b1d90b8f9e685eaa8f49a" Apr 16 18:13:39.044196 
ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.041622 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b3e5375ac3f006e74a4f228fee8e40dec23853a3b8b1d90b8f9e685eaa8f49a"} err="failed to get container status \"5b3e5375ac3f006e74a4f228fee8e40dec23853a3b8b1d90b8f9e685eaa8f49a\": rpc error: code = NotFound desc = could not find container \"5b3e5375ac3f006e74a4f228fee8e40dec23853a3b8b1d90b8f9e685eaa8f49a\": container with ID starting with 5b3e5375ac3f006e74a4f228fee8e40dec23853a3b8b1d90b8f9e685eaa8f49a not found: ID does not exist" Apr 16 18:13:39.044196 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.041641 2576 scope.go:117] "RemoveContainer" containerID="dc0d3ea3f9ba8ca43960cc2c298db25726d4144cc930fb12accc66833d21b035" Apr 16 18:13:39.044794 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:13:39.044765 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc0d3ea3f9ba8ca43960cc2c298db25726d4144cc930fb12accc66833d21b035\": container with ID starting with dc0d3ea3f9ba8ca43960cc2c298db25726d4144cc930fb12accc66833d21b035 not found: ID does not exist" containerID="dc0d3ea3f9ba8ca43960cc2c298db25726d4144cc930fb12accc66833d21b035" Apr 16 18:13:39.044868 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.044802 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc0d3ea3f9ba8ca43960cc2c298db25726d4144cc930fb12accc66833d21b035"} err="failed to get container status \"dc0d3ea3f9ba8ca43960cc2c298db25726d4144cc930fb12accc66833d21b035\": rpc error: code = NotFound desc = could not find container \"dc0d3ea3f9ba8ca43960cc2c298db25726d4144cc930fb12accc66833d21b035\": container with ID starting with dc0d3ea3f9ba8ca43960cc2c298db25726d4144cc930fb12accc66833d21b035 not found: ID does not exist" Apr 16 18:13:39.044868 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.044821 2576 scope.go:117] "RemoveContainer" 
containerID="fa16f3c670dba4369cd88186858c5b0d2e98ed9c73b5e9cd05fbffa3af8efc83"
Apr 16 18:13:39.045150 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.045120 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa16f3c670dba4369cd88186858c5b0d2e98ed9c73b5e9cd05fbffa3af8efc83"} err="failed to get container status \"fa16f3c670dba4369cd88186858c5b0d2e98ed9c73b5e9cd05fbffa3af8efc83\": rpc error: code = NotFound desc = could not find container \"fa16f3c670dba4369cd88186858c5b0d2e98ed9c73b5e9cd05fbffa3af8efc83\": container with ID starting with fa16f3c670dba4369cd88186858c5b0d2e98ed9c73b5e9cd05fbffa3af8efc83 not found: ID does not exist"
Apr 16 18:13:39.045236 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.045151 2576 scope.go:117] "RemoveContainer" containerID="ed184ce2f059dda12848bd8dabf54a96c498b4d984bbb6de6e0c7896ce6e751c"
Apr 16 18:13:39.045371 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.045349 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:13:39.045428 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.045396 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed184ce2f059dda12848bd8dabf54a96c498b4d984bbb6de6e0c7896ce6e751c"} err="failed to get container status \"ed184ce2f059dda12848bd8dabf54a96c498b4d984bbb6de6e0c7896ce6e751c\": rpc error: code = NotFound desc = could not find container \"ed184ce2f059dda12848bd8dabf54a96c498b4d984bbb6de6e0c7896ce6e751c\": container with ID starting with ed184ce2f059dda12848bd8dabf54a96c498b4d984bbb6de6e0c7896ce6e751c not found: ID does not exist"
Apr 16 18:13:39.045428 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.045418 2576 scope.go:117] "RemoveContainer" containerID="10185443b88eb998125800b778892279ba793508a2ee767bf5b26ced2072c262"
Apr 16 18:13:39.046552 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.046514 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10185443b88eb998125800b778892279ba793508a2ee767bf5b26ced2072c262"} err="failed to get container status \"10185443b88eb998125800b778892279ba793508a2ee767bf5b26ced2072c262\": rpc error: code = NotFound desc = could not find container \"10185443b88eb998125800b778892279ba793508a2ee767bf5b26ced2072c262\": container with ID starting with 10185443b88eb998125800b778892279ba793508a2ee767bf5b26ced2072c262 not found: ID does not exist"
Apr 16 18:13:39.046639 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.046551 2576 scope.go:117] "RemoveContainer" containerID="2a85df2c06617580395339185bdbb016052a1e1569f79857cbff0ae3c8cc1054"
Apr 16 18:13:39.046896 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.046858 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a85df2c06617580395339185bdbb016052a1e1569f79857cbff0ae3c8cc1054"} err="failed to get container status \"2a85df2c06617580395339185bdbb016052a1e1569f79857cbff0ae3c8cc1054\": rpc error: code = NotFound desc = could not find container \"2a85df2c06617580395339185bdbb016052a1e1569f79857cbff0ae3c8cc1054\": container with ID starting with 2a85df2c06617580395339185bdbb016052a1e1569f79857cbff0ae3c8cc1054 not found: ID does not exist"
Apr 16 18:13:39.046992 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.046901 2576 scope.go:117] "RemoveContainer" containerID="cf5369ba5d307d3665894b9ef4c9e9d336a9e36233b3e29ca9ac4757a315b6b1"
Apr 16 18:13:39.047341 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.047308 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf5369ba5d307d3665894b9ef4c9e9d336a9e36233b3e29ca9ac4757a315b6b1"} err="failed to get container status \"cf5369ba5d307d3665894b9ef4c9e9d336a9e36233b3e29ca9ac4757a315b6b1\": rpc error: code = NotFound desc = could not find container \"cf5369ba5d307d3665894b9ef4c9e9d336a9e36233b3e29ca9ac4757a315b6b1\": container with ID starting with cf5369ba5d307d3665894b9ef4c9e9d336a9e36233b3e29ca9ac4757a315b6b1 not found: ID does not exist"
Apr 16 18:13:39.047420 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.047343 2576 scope.go:117] "RemoveContainer" containerID="5b3e5375ac3f006e74a4f228fee8e40dec23853a3b8b1d90b8f9e685eaa8f49a"
Apr 16 18:13:39.047614 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.047585 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b3e5375ac3f006e74a4f228fee8e40dec23853a3b8b1d90b8f9e685eaa8f49a"} err="failed to get container status \"5b3e5375ac3f006e74a4f228fee8e40dec23853a3b8b1d90b8f9e685eaa8f49a\": rpc error: code = NotFound desc = could not find container \"5b3e5375ac3f006e74a4f228fee8e40dec23853a3b8b1d90b8f9e685eaa8f49a\": container with ID starting with 5b3e5375ac3f006e74a4f228fee8e40dec23853a3b8b1d90b8f9e685eaa8f49a not found: ID does not exist"
Apr 16 18:13:39.047614 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.047611 2576 scope.go:117] "RemoveContainer" containerID="dc0d3ea3f9ba8ca43960cc2c298db25726d4144cc930fb12accc66833d21b035"
Apr 16 18:13:39.047852 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.047824 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc0d3ea3f9ba8ca43960cc2c298db25726d4144cc930fb12accc66833d21b035"} err="failed to get container status \"dc0d3ea3f9ba8ca43960cc2c298db25726d4144cc930fb12accc66833d21b035\": rpc error: code = NotFound desc = could not find container \"dc0d3ea3f9ba8ca43960cc2c298db25726d4144cc930fb12accc66833d21b035\": container with ID starting with dc0d3ea3f9ba8ca43960cc2c298db25726d4144cc930fb12accc66833d21b035 not found: ID does not exist"
Apr 16 18:13:39.047916 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.047854 2576 scope.go:117] "RemoveContainer" containerID="fa16f3c670dba4369cd88186858c5b0d2e98ed9c73b5e9cd05fbffa3af8efc83"
Apr 16 18:13:39.048062 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.048040 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa16f3c670dba4369cd88186858c5b0d2e98ed9c73b5e9cd05fbffa3af8efc83"} err="failed to get container status \"fa16f3c670dba4369cd88186858c5b0d2e98ed9c73b5e9cd05fbffa3af8efc83\": rpc error: code = NotFound desc = could not find container \"fa16f3c670dba4369cd88186858c5b0d2e98ed9c73b5e9cd05fbffa3af8efc83\": container with ID starting with fa16f3c670dba4369cd88186858c5b0d2e98ed9c73b5e9cd05fbffa3af8efc83 not found: ID does not exist"
Apr 16 18:13:39.048062 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.048063 2576 scope.go:117] "RemoveContainer" containerID="ed184ce2f059dda12848bd8dabf54a96c498b4d984bbb6de6e0c7896ce6e751c"
Apr 16 18:13:39.048379 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.048355 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed184ce2f059dda12848bd8dabf54a96c498b4d984bbb6de6e0c7896ce6e751c"} err="failed to get container status \"ed184ce2f059dda12848bd8dabf54a96c498b4d984bbb6de6e0c7896ce6e751c\": rpc error: code = NotFound desc = could not find container \"ed184ce2f059dda12848bd8dabf54a96c498b4d984bbb6de6e0c7896ce6e751c\": container with ID starting with ed184ce2f059dda12848bd8dabf54a96c498b4d984bbb6de6e0c7896ce6e751c not found: ID does not exist"
Apr 16 18:13:39.048470 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.048382 2576 scope.go:117] "RemoveContainer" containerID="10185443b88eb998125800b778892279ba793508a2ee767bf5b26ced2072c262"
Apr 16 18:13:39.048560 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.048541 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 16 18:13:39.048617 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.048560 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-8lhrgmnf08hm\""
Apr 16 18:13:39.048617 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.048560 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 16 18:13:39.048744 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.048619 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 16 18:13:39.048744 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.048667 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 16 18:13:39.048843 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.048788 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 16 18:13:39.048938 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.048914 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10185443b88eb998125800b778892279ba793508a2ee767bf5b26ced2072c262"} err="failed to get container status \"10185443b88eb998125800b778892279ba793508a2ee767bf5b26ced2072c262\": rpc error: code = NotFound desc = could not find container \"10185443b88eb998125800b778892279ba793508a2ee767bf5b26ced2072c262\": container with ID starting with 10185443b88eb998125800b778892279ba793508a2ee767bf5b26ced2072c262 not found: ID does not exist"
Apr 16 18:13:39.049001 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.048970 2576 scope.go:117] "RemoveContainer" containerID="2a85df2c06617580395339185bdbb016052a1e1569f79857cbff0ae3c8cc1054"
Apr 16 18:13:39.049001 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.048986 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 16 18:13:39.049090 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.049002 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 16 18:13:39.049490 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.049226 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 16 18:13:39.049490 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.049234 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 16 18:13:39.049490 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.049234 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 18:13:39.049490 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.049301 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-84wx4\""
Apr 16 18:13:39.049490 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.049320 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 16 18:13:39.049490 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.049365 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a85df2c06617580395339185bdbb016052a1e1569f79857cbff0ae3c8cc1054"} err="failed to get container status \"2a85df2c06617580395339185bdbb016052a1e1569f79857cbff0ae3c8cc1054\": rpc error: code = NotFound desc = could not find container \"2a85df2c06617580395339185bdbb016052a1e1569f79857cbff0ae3c8cc1054\": container with ID starting with 2a85df2c06617580395339185bdbb016052a1e1569f79857cbff0ae3c8cc1054 not found: ID does not exist"
Apr 16 18:13:39.049490 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.049387 2576 scope.go:117] "RemoveContainer" containerID="cf5369ba5d307d3665894b9ef4c9e9d336a9e36233b3e29ca9ac4757a315b6b1"
Apr 16 18:13:39.050106 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.049608 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf5369ba5d307d3665894b9ef4c9e9d336a9e36233b3e29ca9ac4757a315b6b1"} err="failed to get container status \"cf5369ba5d307d3665894b9ef4c9e9d336a9e36233b3e29ca9ac4757a315b6b1\": rpc error: code = NotFound desc = could not find container \"cf5369ba5d307d3665894b9ef4c9e9d336a9e36233b3e29ca9ac4757a315b6b1\": container with ID starting with cf5369ba5d307d3665894b9ef4c9e9d336a9e36233b3e29ca9ac4757a315b6b1 not found: ID does not exist"
Apr 16 18:13:39.050106 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.049623 2576 scope.go:117] "RemoveContainer" containerID="5b3e5375ac3f006e74a4f228fee8e40dec23853a3b8b1d90b8f9e685eaa8f49a"
Apr 16 18:13:39.050106 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.049824 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b3e5375ac3f006e74a4f228fee8e40dec23853a3b8b1d90b8f9e685eaa8f49a"} err="failed to get container status \"5b3e5375ac3f006e74a4f228fee8e40dec23853a3b8b1d90b8f9e685eaa8f49a\": rpc error: code = NotFound desc = could not find container \"5b3e5375ac3f006e74a4f228fee8e40dec23853a3b8b1d90b8f9e685eaa8f49a\": container with ID starting with 5b3e5375ac3f006e74a4f228fee8e40dec23853a3b8b1d90b8f9e685eaa8f49a not found: ID does not exist"
Apr 16 18:13:39.050106 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.049846 2576 scope.go:117] "RemoveContainer" containerID="dc0d3ea3f9ba8ca43960cc2c298db25726d4144cc930fb12accc66833d21b035"
Apr 16 18:13:39.050346 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.050280 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc0d3ea3f9ba8ca43960cc2c298db25726d4144cc930fb12accc66833d21b035"} err="failed to get container status \"dc0d3ea3f9ba8ca43960cc2c298db25726d4144cc930fb12accc66833d21b035\": rpc error: code = NotFound desc = could not find container \"dc0d3ea3f9ba8ca43960cc2c298db25726d4144cc930fb12accc66833d21b035\": container with ID starting with dc0d3ea3f9ba8ca43960cc2c298db25726d4144cc930fb12accc66833d21b035 not found: ID does not exist"
Apr 16 18:13:39.050346 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.050311 2576 scope.go:117] "RemoveContainer" containerID="fa16f3c670dba4369cd88186858c5b0d2e98ed9c73b5e9cd05fbffa3af8efc83"
Apr 16 18:13:39.050692 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.050659 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa16f3c670dba4369cd88186858c5b0d2e98ed9c73b5e9cd05fbffa3af8efc83"} err="failed to get container status \"fa16f3c670dba4369cd88186858c5b0d2e98ed9c73b5e9cd05fbffa3af8efc83\": rpc error: code = NotFound desc = could not find container \"fa16f3c670dba4369cd88186858c5b0d2e98ed9c73b5e9cd05fbffa3af8efc83\": container with ID starting with fa16f3c670dba4369cd88186858c5b0d2e98ed9c73b5e9cd05fbffa3af8efc83 not found: ID does not exist"
Apr 16 18:13:39.050692 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.050685 2576 scope.go:117] "RemoveContainer" containerID="ed184ce2f059dda12848bd8dabf54a96c498b4d984bbb6de6e0c7896ce6e751c"
Apr 16 18:13:39.050997 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.050971 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed184ce2f059dda12848bd8dabf54a96c498b4d984bbb6de6e0c7896ce6e751c"} err="failed to get container status \"ed184ce2f059dda12848bd8dabf54a96c498b4d984bbb6de6e0c7896ce6e751c\": rpc error: code = NotFound desc = could not find container \"ed184ce2f059dda12848bd8dabf54a96c498b4d984bbb6de6e0c7896ce6e751c\": container with ID starting with ed184ce2f059dda12848bd8dabf54a96c498b4d984bbb6de6e0c7896ce6e751c not found: ID does not exist"
Apr 16 18:13:39.051085 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.051000 2576 scope.go:117] "RemoveContainer" containerID="10185443b88eb998125800b778892279ba793508a2ee767bf5b26ced2072c262"
Apr 16 18:13:39.051409 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.051365 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10185443b88eb998125800b778892279ba793508a2ee767bf5b26ced2072c262"} err="failed to get container status \"10185443b88eb998125800b778892279ba793508a2ee767bf5b26ced2072c262\": rpc error: code = NotFound desc = could not find container \"10185443b88eb998125800b778892279ba793508a2ee767bf5b26ced2072c262\": container with ID starting with 10185443b88eb998125800b778892279ba793508a2ee767bf5b26ced2072c262 not found: ID does not exist"
Apr 16 18:13:39.051409 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.051391 2576 scope.go:117] "RemoveContainer" containerID="2a85df2c06617580395339185bdbb016052a1e1569f79857cbff0ae3c8cc1054"
Apr 16 18:13:39.051740 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.051626 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a85df2c06617580395339185bdbb016052a1e1569f79857cbff0ae3c8cc1054"} err="failed to get container status \"2a85df2c06617580395339185bdbb016052a1e1569f79857cbff0ae3c8cc1054\": rpc error: code = NotFound desc = could not find container \"2a85df2c06617580395339185bdbb016052a1e1569f79857cbff0ae3c8cc1054\": container with ID starting with 2a85df2c06617580395339185bdbb016052a1e1569f79857cbff0ae3c8cc1054 not found: ID does not exist"
Apr 16 18:13:39.051740 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.051654 2576 scope.go:117] "RemoveContainer" containerID="cf5369ba5d307d3665894b9ef4c9e9d336a9e36233b3e29ca9ac4757a315b6b1"
Apr 16 18:13:39.052013 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.051986 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf5369ba5d307d3665894b9ef4c9e9d336a9e36233b3e29ca9ac4757a315b6b1"} err="failed to get container status \"cf5369ba5d307d3665894b9ef4c9e9d336a9e36233b3e29ca9ac4757a315b6b1\": rpc error: code = NotFound desc = could not find container \"cf5369ba5d307d3665894b9ef4c9e9d336a9e36233b3e29ca9ac4757a315b6b1\": container with ID starting with cf5369ba5d307d3665894b9ef4c9e9d336a9e36233b3e29ca9ac4757a315b6b1 not found: ID does not exist"
Apr 16 18:13:39.052116 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.052016 2576 scope.go:117] "RemoveContainer" containerID="5b3e5375ac3f006e74a4f228fee8e40dec23853a3b8b1d90b8f9e685eaa8f49a"
Apr 16 18:13:39.052473 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.052410 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b3e5375ac3f006e74a4f228fee8e40dec23853a3b8b1d90b8f9e685eaa8f49a"} err="failed to get container status \"5b3e5375ac3f006e74a4f228fee8e40dec23853a3b8b1d90b8f9e685eaa8f49a\": rpc error: code = NotFound desc = could not find container \"5b3e5375ac3f006e74a4f228fee8e40dec23853a3b8b1d90b8f9e685eaa8f49a\": container with ID starting with 5b3e5375ac3f006e74a4f228fee8e40dec23853a3b8b1d90b8f9e685eaa8f49a not found: ID does not exist"
Apr 16 18:13:39.052473 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.052461 2576 scope.go:117] "RemoveContainer" containerID="dc0d3ea3f9ba8ca43960cc2c298db25726d4144cc930fb12accc66833d21b035"
Apr 16 18:13:39.052687 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.052672 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 16 18:13:39.054166 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.054021 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc0d3ea3f9ba8ca43960cc2c298db25726d4144cc930fb12accc66833d21b035"} err="failed to get container status \"dc0d3ea3f9ba8ca43960cc2c298db25726d4144cc930fb12accc66833d21b035\": rpc error: code = NotFound desc = could not find container \"dc0d3ea3f9ba8ca43960cc2c298db25726d4144cc930fb12accc66833d21b035\": container with ID starting with dc0d3ea3f9ba8ca43960cc2c298db25726d4144cc930fb12accc66833d21b035 not found: ID does not exist"
Apr 16 18:13:39.054166 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.054047 2576 scope.go:117] "RemoveContainer" containerID="fa16f3c670dba4369cd88186858c5b0d2e98ed9c73b5e9cd05fbffa3af8efc83"
Apr 16 18:13:39.054949 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.054853 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa16f3c670dba4369cd88186858c5b0d2e98ed9c73b5e9cd05fbffa3af8efc83"} err="failed to get container status \"fa16f3c670dba4369cd88186858c5b0d2e98ed9c73b5e9cd05fbffa3af8efc83\": rpc error: code = NotFound desc = could not find container \"fa16f3c670dba4369cd88186858c5b0d2e98ed9c73b5e9cd05fbffa3af8efc83\": container with ID starting with fa16f3c670dba4369cd88186858c5b0d2e98ed9c73b5e9cd05fbffa3af8efc83 not found: ID does not exist"
Apr 16 18:13:39.054949 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.054878 2576 scope.go:117] "RemoveContainer" containerID="ed184ce2f059dda12848bd8dabf54a96c498b4d984bbb6de6e0c7896ce6e751c"
Apr 16 18:13:39.055387 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.055361 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed184ce2f059dda12848bd8dabf54a96c498b4d984bbb6de6e0c7896ce6e751c"} err="failed to get container status \"ed184ce2f059dda12848bd8dabf54a96c498b4d984bbb6de6e0c7896ce6e751c\": rpc error: code = NotFound desc = could not find container \"ed184ce2f059dda12848bd8dabf54a96c498b4d984bbb6de6e0c7896ce6e751c\": container with ID starting with ed184ce2f059dda12848bd8dabf54a96c498b4d984bbb6de6e0c7896ce6e751c not found: ID does not exist"
Apr 16 18:13:39.055500 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.055484 2576 scope.go:117] "RemoveContainer" containerID="10185443b88eb998125800b778892279ba793508a2ee767bf5b26ced2072c262"
Apr 16 18:13:39.055763 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.055744 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 18:13:39.055873 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.055846 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10185443b88eb998125800b778892279ba793508a2ee767bf5b26ced2072c262"} err="failed to get container status \"10185443b88eb998125800b778892279ba793508a2ee767bf5b26ced2072c262\": rpc error: code = NotFound desc = could not find container \"10185443b88eb998125800b778892279ba793508a2ee767bf5b26ced2072c262\": container with ID starting with 10185443b88eb998125800b778892279ba793508a2ee767bf5b26ced2072c262 not found: ID does not exist"
Apr 16 18:13:39.055963 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.055875 2576 scope.go:117] "RemoveContainer" containerID="2a85df2c06617580395339185bdbb016052a1e1569f79857cbff0ae3c8cc1054"
Apr 16 18:13:39.056222 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.056199 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a85df2c06617580395339185bdbb016052a1e1569f79857cbff0ae3c8cc1054"} err="failed to get container status \"2a85df2c06617580395339185bdbb016052a1e1569f79857cbff0ae3c8cc1054\": rpc error: code = NotFound desc = could not find container \"2a85df2c06617580395339185bdbb016052a1e1569f79857cbff0ae3c8cc1054\": container with ID starting with 2a85df2c06617580395339185bdbb016052a1e1569f79857cbff0ae3c8cc1054 not found: ID does not exist"
Apr 16 18:13:39.056300 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.056224 2576 scope.go:117] "RemoveContainer" containerID="cf5369ba5d307d3665894b9ef4c9e9d336a9e36233b3e29ca9ac4757a315b6b1"
Apr 16 18:13:39.056585 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.056460 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf5369ba5d307d3665894b9ef4c9e9d336a9e36233b3e29ca9ac4757a315b6b1"} err="failed to get container status \"cf5369ba5d307d3665894b9ef4c9e9d336a9e36233b3e29ca9ac4757a315b6b1\": rpc error: code = NotFound desc = could not find container \"cf5369ba5d307d3665894b9ef4c9e9d336a9e36233b3e29ca9ac4757a315b6b1\": container with ID starting with cf5369ba5d307d3665894b9ef4c9e9d336a9e36233b3e29ca9ac4757a315b6b1 not found: ID does not exist"
Apr 16 18:13:39.056585 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.056516 2576 scope.go:117] "RemoveContainer" containerID="5b3e5375ac3f006e74a4f228fee8e40dec23853a3b8b1d90b8f9e685eaa8f49a"
Apr 16 18:13:39.056786 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.056760 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b3e5375ac3f006e74a4f228fee8e40dec23853a3b8b1d90b8f9e685eaa8f49a"} err="failed to get container status \"5b3e5375ac3f006e74a4f228fee8e40dec23853a3b8b1d90b8f9e685eaa8f49a\": rpc error: code = NotFound desc = could not find container \"5b3e5375ac3f006e74a4f228fee8e40dec23853a3b8b1d90b8f9e685eaa8f49a\": container with ID starting with 5b3e5375ac3f006e74a4f228fee8e40dec23853a3b8b1d90b8f9e685eaa8f49a not found: ID does not exist"
Apr 16 18:13:39.056786 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.056784 2576 scope.go:117] "RemoveContainer" containerID="dc0d3ea3f9ba8ca43960cc2c298db25726d4144cc930fb12accc66833d21b035"
Apr 16 18:13:39.057007 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.056985 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 16 18:13:39.057123 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.057035 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc0d3ea3f9ba8ca43960cc2c298db25726d4144cc930fb12accc66833d21b035"} err="failed to get container status \"dc0d3ea3f9ba8ca43960cc2c298db25726d4144cc930fb12accc66833d21b035\": rpc error: code = NotFound desc = could not find container \"dc0d3ea3f9ba8ca43960cc2c298db25726d4144cc930fb12accc66833d21b035\": container with ID starting with dc0d3ea3f9ba8ca43960cc2c298db25726d4144cc930fb12accc66833d21b035 not found: ID does not exist"
Apr 16 18:13:39.057123 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.057058 2576 scope.go:117] "RemoveContainer" containerID="fa16f3c670dba4369cd88186858c5b0d2e98ed9c73b5e9cd05fbffa3af8efc83"
Apr 16 18:13:39.057451 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.057405 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa16f3c670dba4369cd88186858c5b0d2e98ed9c73b5e9cd05fbffa3af8efc83"} err="failed to get container status \"fa16f3c670dba4369cd88186858c5b0d2e98ed9c73b5e9cd05fbffa3af8efc83\": rpc error: code = NotFound desc = could not find container \"fa16f3c670dba4369cd88186858c5b0d2e98ed9c73b5e9cd05fbffa3af8efc83\": container with ID starting with fa16f3c670dba4369cd88186858c5b0d2e98ed9c73b5e9cd05fbffa3af8efc83 not found: ID does not exist"
Apr 16 18:13:39.057538 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.057453 2576 scope.go:117] "RemoveContainer" containerID="ed184ce2f059dda12848bd8dabf54a96c498b4d984bbb6de6e0c7896ce6e751c"
Apr 16 18:13:39.057907 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.057878 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed184ce2f059dda12848bd8dabf54a96c498b4d984bbb6de6e0c7896ce6e751c"} err="failed to get container status \"ed184ce2f059dda12848bd8dabf54a96c498b4d984bbb6de6e0c7896ce6e751c\": rpc error: code = NotFound desc = could not find container \"ed184ce2f059dda12848bd8dabf54a96c498b4d984bbb6de6e0c7896ce6e751c\": container with ID starting with ed184ce2f059dda12848bd8dabf54a96c498b4d984bbb6de6e0c7896ce6e751c not found: ID does not exist"
Apr 16 18:13:39.057907 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.057902 2576 scope.go:117] "RemoveContainer" containerID="10185443b88eb998125800b778892279ba793508a2ee767bf5b26ced2072c262"
Apr 16 18:13:39.058250 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.058229 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10185443b88eb998125800b778892279ba793508a2ee767bf5b26ced2072c262"} err="failed to get container status \"10185443b88eb998125800b778892279ba793508a2ee767bf5b26ced2072c262\": rpc error: code = NotFound desc = could not find container \"10185443b88eb998125800b778892279ba793508a2ee767bf5b26ced2072c262\": container with ID starting with 10185443b88eb998125800b778892279ba793508a2ee767bf5b26ced2072c262 not found: ID does not exist"
Apr 16 18:13:39.058250 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.058249 2576 scope.go:117] "RemoveContainer" containerID="2a85df2c06617580395339185bdbb016052a1e1569f79857cbff0ae3c8cc1054"
Apr 16 18:13:39.058664 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.058637 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a85df2c06617580395339185bdbb016052a1e1569f79857cbff0ae3c8cc1054"} err="failed to get container status \"2a85df2c06617580395339185bdbb016052a1e1569f79857cbff0ae3c8cc1054\": rpc error: code = NotFound desc = could not find container \"2a85df2c06617580395339185bdbb016052a1e1569f79857cbff0ae3c8cc1054\": container with ID starting with 2a85df2c06617580395339185bdbb016052a1e1569f79857cbff0ae3c8cc1054 not found: ID does not exist"
Apr 16 18:13:39.058664 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.058663 2576 scope.go:117] "RemoveContainer" containerID="cf5369ba5d307d3665894b9ef4c9e9d336a9e36233b3e29ca9ac4757a315b6b1"
Apr 16 18:13:39.058957 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.058928 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf5369ba5d307d3665894b9ef4c9e9d336a9e36233b3e29ca9ac4757a315b6b1"} err="failed to get container status \"cf5369ba5d307d3665894b9ef4c9e9d336a9e36233b3e29ca9ac4757a315b6b1\": rpc error: code = NotFound desc = could not find container \"cf5369ba5d307d3665894b9ef4c9e9d336a9e36233b3e29ca9ac4757a315b6b1\": container with ID starting with cf5369ba5d307d3665894b9ef4c9e9d336a9e36233b3e29ca9ac4757a315b6b1 not found: ID does not exist"
Apr 16 18:13:39.059053 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.058962 2576 scope.go:117] "RemoveContainer" containerID="5b3e5375ac3f006e74a4f228fee8e40dec23853a3b8b1d90b8f9e685eaa8f49a"
Apr 16 18:13:39.059465 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.059429 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b3e5375ac3f006e74a4f228fee8e40dec23853a3b8b1d90b8f9e685eaa8f49a"} err="failed to get container status \"5b3e5375ac3f006e74a4f228fee8e40dec23853a3b8b1d90b8f9e685eaa8f49a\": rpc error: code = NotFound desc = could not find container \"5b3e5375ac3f006e74a4f228fee8e40dec23853a3b8b1d90b8f9e685eaa8f49a\": container with ID starting with 5b3e5375ac3f006e74a4f228fee8e40dec23853a3b8b1d90b8f9e685eaa8f49a not found: ID does not exist"
Apr 16 18:13:39.059465 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.059456 2576 scope.go:117] "RemoveContainer" containerID="dc0d3ea3f9ba8ca43960cc2c298db25726d4144cc930fb12accc66833d21b035"
Apr 16 18:13:39.059755 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.059727 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc0d3ea3f9ba8ca43960cc2c298db25726d4144cc930fb12accc66833d21b035"} err="failed to get container status \"dc0d3ea3f9ba8ca43960cc2c298db25726d4144cc930fb12accc66833d21b035\": rpc error: code = NotFound desc = could not find container \"dc0d3ea3f9ba8ca43960cc2c298db25726d4144cc930fb12accc66833d21b035\": container with ID starting with dc0d3ea3f9ba8ca43960cc2c298db25726d4144cc930fb12accc66833d21b035 not found: ID does not exist"
Apr 16 18:13:39.159259 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.159222 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0a371e60-be7a-4ac0-ae72-3b033c2a5433-config\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:13:39.159259 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.159265 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0a371e60-be7a-4ac0-ae72-3b033c2a5433-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:13:39.159556 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.159291 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0a371e60-be7a-4ac0-ae72-3b033c2a5433-web-config\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:13:39.159556 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.159327 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0a371e60-be7a-4ac0-ae72-3b033c2a5433-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:13:39.159556 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.159344 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a371e60-be7a-4ac0-ae72-3b033c2a5433-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:13:39.159556 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.159369 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0a371e60-be7a-4ac0-ae72-3b033c2a5433-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:13:39.159556 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.159400 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0a371e60-be7a-4ac0-ae72-3b033c2a5433-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:13:39.159556 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.159416 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5km94\" (UniqueName: \"kubernetes.io/projected/0a371e60-be7a-4ac0-ae72-3b033c2a5433-kube-api-access-5km94\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:13:39.159556 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.159448 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0a371e60-be7a-4ac0-ae72-3b033c2a5433-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:13:39.159556 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.159542 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0a371e60-be7a-4ac0-ae72-3b033c2a5433-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:13:39.159887 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.159580 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0a371e60-be7a-4ac0-ae72-3b033c2a5433-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:13:39.159887 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.159632 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a371e60-be7a-4ac0-ae72-3b033c2a5433-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:13:39.159887 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.159658 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a371e60-be7a-4ac0-ae72-3b033c2a5433-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:13:39.159887 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.159729 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0a371e60-be7a-4ac0-ae72-3b033c2a5433-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:13:39.159887 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.159759 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0a371e60-be7a-4ac0-ae72-3b033c2a5433-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:13:39.159887 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.159792 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0a371e60-be7a-4ac0-ae72-3b033c2a5433-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:13:39.159887 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.159814 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0a371e60-be7a-4ac0-ae72-3b033c2a5433-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:13:39.159887 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.159851 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0a371e60-be7a-4ac0-ae72-3b033c2a5433-config-out\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:13:39.260271 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.260186 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0a371e60-be7a-4ac0-ae72-3b033c2a5433-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:13:39.260271 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.260228 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0a371e60-be7a-4ac0-ae72-3b033c2a5433-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:13:39.260271 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.260249 2576
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0a371e60-be7a-4ac0-ae72-3b033c2a5433-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:39.260530 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.260283 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0a371e60-be7a-4ac0-ae72-3b033c2a5433-config-out\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:39.260530 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.260316 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0a371e60-be7a-4ac0-ae72-3b033c2a5433-config\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:39.260530 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.260338 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0a371e60-be7a-4ac0-ae72-3b033c2a5433-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:39.260530 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.260360 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0a371e60-be7a-4ac0-ae72-3b033c2a5433-web-config\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:39.260530 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.260389 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0a371e60-be7a-4ac0-ae72-3b033c2a5433-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:39.260530 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.260413 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a371e60-be7a-4ac0-ae72-3b033c2a5433-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:39.260530 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.260437 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0a371e60-be7a-4ac0-ae72-3b033c2a5433-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:39.260530 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.260475 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0a371e60-be7a-4ac0-ae72-3b033c2a5433-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:39.260530 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.260498 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5km94\" (UniqueName: \"kubernetes.io/projected/0a371e60-be7a-4ac0-ae72-3b033c2a5433-kube-api-access-5km94\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:39.261066 ip-10-0-139-96 
kubenswrapper[2576]: I0416 18:13:39.260536 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0a371e60-be7a-4ac0-ae72-3b033c2a5433-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:39.261066 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.260579 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0a371e60-be7a-4ac0-ae72-3b033c2a5433-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:39.261066 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.260603 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0a371e60-be7a-4ac0-ae72-3b033c2a5433-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:39.261066 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.260640 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a371e60-be7a-4ac0-ae72-3b033c2a5433-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:39.261066 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.260672 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a371e60-be7a-4ac0-ae72-3b033c2a5433-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:39.261066 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.260724 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0a371e60-be7a-4ac0-ae72-3b033c2a5433-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:39.261924 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.261564 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a371e60-be7a-4ac0-ae72-3b033c2a5433-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:39.261924 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.261617 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0a371e60-be7a-4ac0-ae72-3b033c2a5433-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:39.263919 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.263528 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0a371e60-be7a-4ac0-ae72-3b033c2a5433-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:39.263919 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.263790 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/0a371e60-be7a-4ac0-ae72-3b033c2a5433-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:39.263919 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.263841 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0a371e60-be7a-4ac0-ae72-3b033c2a5433-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:39.265397 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.264374 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0a371e60-be7a-4ac0-ae72-3b033c2a5433-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:39.265397 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.264450 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a371e60-be7a-4ac0-ae72-3b033c2a5433-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:39.265397 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.264604 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0a371e60-be7a-4ac0-ae72-3b033c2a5433-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:39.265397 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.264869 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0a371e60-be7a-4ac0-ae72-3b033c2a5433-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:39.265397 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.265306 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0a371e60-be7a-4ac0-ae72-3b033c2a5433-config\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:39.265397 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.265364 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0a371e60-be7a-4ac0-ae72-3b033c2a5433-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:39.265866 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.265841 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0a371e60-be7a-4ac0-ae72-3b033c2a5433-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:39.266352 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.266308 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a371e60-be7a-4ac0-ae72-3b033c2a5433-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:39.266448 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.266409 
2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0a371e60-be7a-4ac0-ae72-3b033c2a5433-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:39.266795 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.266771 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0a371e60-be7a-4ac0-ae72-3b033c2a5433-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:39.266875 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.266827 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0a371e60-be7a-4ac0-ae72-3b033c2a5433-web-config\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:39.268377 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.268355 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0a371e60-be7a-4ac0-ae72-3b033c2a5433-config-out\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:39.269586 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.269568 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5km94\" (UniqueName: \"kubernetes.io/projected/0a371e60-be7a-4ac0-ae72-3b033c2a5433-kube-api-access-5km94\") pod \"prometheus-k8s-0\" (UID: \"0a371e60-be7a-4ac0-ae72-3b033c2a5433\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:39.327311 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.327281 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b252c920-4abd-4d7c-bd5e-d74dcbcd643a" path="/var/lib/kubelet/pods/b252c920-4abd-4d7c-bd5e-d74dcbcd643a/volumes" Apr 16 18:13:39.358293 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.358254 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:39.494141 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.494077 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:13:39.810049 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:13:39.810015 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a371e60_be7a_4ac0_ae72_3b033c2a5433.slice/crio-6173821d29252bf85acb0ee54535f5b240df05d61bc153b7cb98cffaff7d9d2b WatchSource:0}: Error finding container 6173821d29252bf85acb0ee54535f5b240df05d61bc153b7cb98cffaff7d9d2b: Status 404 returned error can't find the container with id 6173821d29252bf85acb0ee54535f5b240df05d61bc153b7cb98cffaff7d9d2b Apr 16 18:13:39.985888 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.985858 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-77cd4447c-rxvpr" event={"ID":"42657ce0-b347-4e89-84f7-5766710baf5f","Type":"ContainerStarted","Data":"0c20f0443ae07a691d91e528709e0bae4a3ac72de298f218eb1879ed64faa895"} Apr 16 18:13:39.987335 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.987308 2576 generic.go:358] "Generic (PLEG): container finished" podID="0a371e60-be7a-4ac0-ae72-3b033c2a5433" containerID="4677852b50f5718ce1fd37e2a8f2ee0cfd8ff1c544ad93ddc6a505cefb92fb2a" exitCode=0 Apr 16 18:13:39.987442 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.987344 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"0a371e60-be7a-4ac0-ae72-3b033c2a5433","Type":"ContainerDied","Data":"4677852b50f5718ce1fd37e2a8f2ee0cfd8ff1c544ad93ddc6a505cefb92fb2a"} Apr 16 18:13:39.987442 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:39.987363 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0a371e60-be7a-4ac0-ae72-3b033c2a5433","Type":"ContainerStarted","Data":"6173821d29252bf85acb0ee54535f5b240df05d61bc153b7cb98cffaff7d9d2b"} Apr 16 18:13:40.993565 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:40.993523 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-77cd4447c-rxvpr" event={"ID":"42657ce0-b347-4e89-84f7-5766710baf5f","Type":"ContainerStarted","Data":"a754bc90424558aa68a3aa775864bbc6a851fc43774d012d61535a2e1a8902f6"} Apr 16 18:13:40.993565 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:40.993567 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-77cd4447c-rxvpr" event={"ID":"42657ce0-b347-4e89-84f7-5766710baf5f","Type":"ContainerStarted","Data":"f4fa183341f2c517ca0b4bcdeef8bb5bbd0c13507ce9c5a362095f14b1a921ca"} Apr 16 18:13:40.996583 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:40.996556 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0a371e60-be7a-4ac0-ae72-3b033c2a5433","Type":"ContainerStarted","Data":"eb2e4796133cb5c4af749317bedfbf3824c62a5f47d3daf22c50cde783fe4b92"} Apr 16 18:13:40.996692 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:40.996591 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0a371e60-be7a-4ac0-ae72-3b033c2a5433","Type":"ContainerStarted","Data":"5b1087353ba8777d0a4a9b7efad1519c4743afdef23be4c121d67d5d04dbc30a"} Apr 16 18:13:40.996692 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:40.996604 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0a371e60-be7a-4ac0-ae72-3b033c2a5433","Type":"ContainerStarted","Data":"f32542295a2ec606f04b3f1ef21aac9d555be213ce0705a61f733ee75dd815af"} Apr 16 18:13:40.996692 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:40.996616 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0a371e60-be7a-4ac0-ae72-3b033c2a5433","Type":"ContainerStarted","Data":"277deba9464375ebb8a277a369d61298476bdaaea9596ce5a308560d913f0916"} Apr 16 18:13:40.996692 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:40.996628 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0a371e60-be7a-4ac0-ae72-3b033c2a5433","Type":"ContainerStarted","Data":"1dbde67957b78535b52f7995689b3b9d9184b0d7778980b973e69e9d1c84845e"} Apr 16 18:13:40.996692 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:40.996639 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0a371e60-be7a-4ac0-ae72-3b033c2a5433","Type":"ContainerStarted","Data":"5f8c2f1c5932b4ca051b8ebf1ca80adcec6c345dea9059492c4f4c49bde842db"} Apr 16 18:13:41.017877 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:41.017835 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-77cd4447c-rxvpr" podStartSLOduration=2.323013409 podStartE2EDuration="4.01781888s" podCreationTimestamp="2026-04-16 18:13:37 +0000 UTC" firstStartedPulling="2026-04-16 18:13:38.165853129 +0000 UTC m=+177.367428376" lastFinishedPulling="2026-04-16 18:13:39.860658594 +0000 UTC m=+179.062233847" observedRunningTime="2026-04-16 18:13:41.014883108 +0000 UTC m=+180.216458375" watchObservedRunningTime="2026-04-16 18:13:41.01781888 +0000 UTC m=+180.219394148" Apr 16 18:13:41.040846 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:41.040805 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.040791996 podStartE2EDuration="2.040791996s" podCreationTimestamp="2026-04-16 18:13:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:13:41.038512437 +0000 UTC m=+180.240087701" watchObservedRunningTime="2026-04-16 18:13:41.040791996 +0000 UTC m=+180.242367263" Apr 16 18:13:44.358734 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:13:44.358680 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:14:39.358657 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:14:39.358622 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:14:39.373470 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:14:39.373444 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:14:40.199205 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:14:40.199180 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:15:35.092305 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:15:35.092268 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-8c5pl"] Apr 16 18:15:35.095806 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:15:35.095782 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-8c5pl" Apr 16 18:15:35.098223 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:15:35.098204 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 18:15:35.102880 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:15:35.102855 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8c5pl"] Apr 16 18:15:35.255464 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:15:35.255432 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2446905a-5c73-4161-8c69-995947142aa0-dbus\") pod \"global-pull-secret-syncer-8c5pl\" (UID: \"2446905a-5c73-4161-8c69-995947142aa0\") " pod="kube-system/global-pull-secret-syncer-8c5pl" Apr 16 18:15:35.255587 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:15:35.255477 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2446905a-5c73-4161-8c69-995947142aa0-original-pull-secret\") pod \"global-pull-secret-syncer-8c5pl\" (UID: \"2446905a-5c73-4161-8c69-995947142aa0\") " pod="kube-system/global-pull-secret-syncer-8c5pl" Apr 16 18:15:35.255587 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:15:35.255504 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2446905a-5c73-4161-8c69-995947142aa0-kubelet-config\") pod \"global-pull-secret-syncer-8c5pl\" (UID: \"2446905a-5c73-4161-8c69-995947142aa0\") " pod="kube-system/global-pull-secret-syncer-8c5pl" Apr 16 18:15:35.356861 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:15:35.356800 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/2446905a-5c73-4161-8c69-995947142aa0-dbus\") pod \"global-pull-secret-syncer-8c5pl\" (UID: \"2446905a-5c73-4161-8c69-995947142aa0\") " pod="kube-system/global-pull-secret-syncer-8c5pl" Apr 16 18:15:35.356861 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:15:35.356839 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2446905a-5c73-4161-8c69-995947142aa0-original-pull-secret\") pod \"global-pull-secret-syncer-8c5pl\" (UID: \"2446905a-5c73-4161-8c69-995947142aa0\") " pod="kube-system/global-pull-secret-syncer-8c5pl" Apr 16 18:15:35.356999 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:15:35.356865 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2446905a-5c73-4161-8c69-995947142aa0-kubelet-config\") pod \"global-pull-secret-syncer-8c5pl\" (UID: \"2446905a-5c73-4161-8c69-995947142aa0\") " pod="kube-system/global-pull-secret-syncer-8c5pl" Apr 16 18:15:35.356999 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:15:35.356979 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2446905a-5c73-4161-8c69-995947142aa0-kubelet-config\") pod \"global-pull-secret-syncer-8c5pl\" (UID: \"2446905a-5c73-4161-8c69-995947142aa0\") " pod="kube-system/global-pull-secret-syncer-8c5pl" Apr 16 18:15:35.356999 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:15:35.356988 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2446905a-5c73-4161-8c69-995947142aa0-dbus\") pod \"global-pull-secret-syncer-8c5pl\" (UID: \"2446905a-5c73-4161-8c69-995947142aa0\") " pod="kube-system/global-pull-secret-syncer-8c5pl" Apr 16 18:15:35.358896 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:15:35.358873 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2446905a-5c73-4161-8c69-995947142aa0-original-pull-secret\") pod \"global-pull-secret-syncer-8c5pl\" (UID: \"2446905a-5c73-4161-8c69-995947142aa0\") " pod="kube-system/global-pull-secret-syncer-8c5pl" Apr 16 18:15:35.405476 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:15:35.405452 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8c5pl" Apr 16 18:15:35.519745 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:15:35.519718 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8c5pl"] Apr 16 18:15:35.522035 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:15:35.522003 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2446905a_5c73_4161_8c69_995947142aa0.slice/crio-5c4926d944d6513d69b9e124e1cfceb0567a456ce73d082389223872d0aa3da8 WatchSource:0}: Error finding container 5c4926d944d6513d69b9e124e1cfceb0567a456ce73d082389223872d0aa3da8: Status 404 returned error can't find the container with id 5c4926d944d6513d69b9e124e1cfceb0567a456ce73d082389223872d0aa3da8 Apr 16 18:15:36.345086 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:15:36.345031 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8c5pl" event={"ID":"2446905a-5c73-4161-8c69-995947142aa0","Type":"ContainerStarted","Data":"5c4926d944d6513d69b9e124e1cfceb0567a456ce73d082389223872d0aa3da8"} Apr 16 18:15:39.355353 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:15:39.355318 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8c5pl" event={"ID":"2446905a-5c73-4161-8c69-995947142aa0","Type":"ContainerStarted","Data":"d312d6bf6c5e24c764b4533eaa413e0bbf82258481043ee87e611f83fa3a6de4"} Apr 16 18:15:39.372295 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:15:39.372242 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-8c5pl" podStartSLOduration=0.675299174 podStartE2EDuration="4.372225113s" podCreationTimestamp="2026-04-16 18:15:35 +0000 UTC" firstStartedPulling="2026-04-16 18:15:35.523822823 +0000 UTC m=+294.725398070" lastFinishedPulling="2026-04-16 18:15:39.220748764 +0000 UTC m=+298.422324009" observedRunningTime="2026-04-16 18:15:39.370573771 +0000 UTC m=+298.572149040" watchObservedRunningTime="2026-04-16 18:15:39.372225113 +0000 UTC m=+298.573800381" Apr 16 18:15:41.199784 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:15:41.199757 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-w28dq_11e9383c-4bf6-4c5c-9dec-a8f2b642aff1/console-operator/1.log" Apr 16 18:15:41.200166 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:15:41.199801 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-w28dq_11e9383c-4bf6-4c5c-9dec-a8f2b642aff1/console-operator/1.log" Apr 16 18:15:41.208059 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:15:41.208040 2576 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 18:18:59.777349 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:18:59.777316 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-f7ljs"] Apr 16 18:18:59.780569 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:18:59.780552 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-f7ljs" Apr 16 18:18:59.783224 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:18:59.783196 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-qsxc9\"" Apr 16 18:18:59.783224 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:18:59.783218 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 18:18:59.784450 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:18:59.784428 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 18:18:59.784560 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:18:59.784445 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 18:18:59.786817 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:18:59.786794 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-f7ljs"] Apr 16 18:18:59.939856 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:18:59.939824 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzmzc\" (UniqueName: \"kubernetes.io/projected/1bfeb418-2d02-407e-86ed-7f42a2bc6938-kube-api-access-bzmzc\") pod \"s3-init-f7ljs\" (UID: \"1bfeb418-2d02-407e-86ed-7f42a2bc6938\") " pod="kserve/s3-init-f7ljs" Apr 16 18:19:00.040544 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:00.040473 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bzmzc\" (UniqueName: \"kubernetes.io/projected/1bfeb418-2d02-407e-86ed-7f42a2bc6938-kube-api-access-bzmzc\") pod \"s3-init-f7ljs\" (UID: \"1bfeb418-2d02-407e-86ed-7f42a2bc6938\") " pod="kserve/s3-init-f7ljs" Apr 16 18:19:00.050724 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:00.050677 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzmzc\" (UniqueName: 
\"kubernetes.io/projected/1bfeb418-2d02-407e-86ed-7f42a2bc6938-kube-api-access-bzmzc\") pod \"s3-init-f7ljs\" (UID: \"1bfeb418-2d02-407e-86ed-7f42a2bc6938\") " pod="kserve/s3-init-f7ljs" Apr 16 18:19:00.100601 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:00.100562 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-f7ljs" Apr 16 18:19:00.218507 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:00.218421 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-f7ljs"] Apr 16 18:19:00.220808 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:19:00.220778 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1bfeb418_2d02_407e_86ed_7f42a2bc6938.slice/crio-1fad3f3c146bba798d76291d465621035a83af090a446e0fc3d570396a9c30cd WatchSource:0}: Error finding container 1fad3f3c146bba798d76291d465621035a83af090a446e0fc3d570396a9c30cd: Status 404 returned error can't find the container with id 1fad3f3c146bba798d76291d465621035a83af090a446e0fc3d570396a9c30cd Apr 16 18:19:00.222871 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:00.222855 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:19:00.928229 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:00.928187 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-f7ljs" event={"ID":"1bfeb418-2d02-407e-86ed-7f42a2bc6938","Type":"ContainerStarted","Data":"1fad3f3c146bba798d76291d465621035a83af090a446e0fc3d570396a9c30cd"} Apr 16 18:19:04.944124 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:04.944089 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-f7ljs" event={"ID":"1bfeb418-2d02-407e-86ed-7f42a2bc6938","Type":"ContainerStarted","Data":"3d42ddbaee4966c5e72fee99a469bd7dc155e43416f4296ceedc5809de9f2b6c"} Apr 16 18:19:04.958764 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:04.958718 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-f7ljs" podStartSLOduration=1.638433961 podStartE2EDuration="5.958686178s" podCreationTimestamp="2026-04-16 18:18:59 +0000 UTC" firstStartedPulling="2026-04-16 18:19:00.222978189 +0000 UTC m=+499.424553436" lastFinishedPulling="2026-04-16 18:19:04.543230407 +0000 UTC m=+503.744805653" observedRunningTime="2026-04-16 18:19:04.957305471 +0000 UTC m=+504.158880740" watchObservedRunningTime="2026-04-16 18:19:04.958686178 +0000 UTC m=+504.160261446" Apr 16 18:19:07.953689 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:07.953656 2576 generic.go:358] "Generic (PLEG): container finished" podID="1bfeb418-2d02-407e-86ed-7f42a2bc6938" containerID="3d42ddbaee4966c5e72fee99a469bd7dc155e43416f4296ceedc5809de9f2b6c" exitCode=0 Apr 16 18:19:07.954061 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:07.953735 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-f7ljs" event={"ID":"1bfeb418-2d02-407e-86ed-7f42a2bc6938","Type":"ContainerDied","Data":"3d42ddbaee4966c5e72fee99a469bd7dc155e43416f4296ceedc5809de9f2b6c"} Apr 16 18:19:09.081276 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:09.081257 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-f7ljs" Apr 16 18:19:09.223472 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:09.223366 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzmzc\" (UniqueName: \"kubernetes.io/projected/1bfeb418-2d02-407e-86ed-7f42a2bc6938-kube-api-access-bzmzc\") pod \"1bfeb418-2d02-407e-86ed-7f42a2bc6938\" (UID: \"1bfeb418-2d02-407e-86ed-7f42a2bc6938\") " Apr 16 18:19:09.225588 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:09.225552 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bfeb418-2d02-407e-86ed-7f42a2bc6938-kube-api-access-bzmzc" (OuterVolumeSpecName: "kube-api-access-bzmzc") pod "1bfeb418-2d02-407e-86ed-7f42a2bc6938" (UID: "1bfeb418-2d02-407e-86ed-7f42a2bc6938"). InnerVolumeSpecName "kube-api-access-bzmzc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:19:09.324061 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:09.324034 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bzmzc\" (UniqueName: \"kubernetes.io/projected/1bfeb418-2d02-407e-86ed-7f42a2bc6938-kube-api-access-bzmzc\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:19:09.961804 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:09.961770 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-f7ljs" event={"ID":"1bfeb418-2d02-407e-86ed-7f42a2bc6938","Type":"ContainerDied","Data":"1fad3f3c146bba798d76291d465621035a83af090a446e0fc3d570396a9c30cd"} Apr 16 18:19:09.961804 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:09.961798 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-f7ljs" Apr 16 18:19:09.962017 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:09.961803 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fad3f3c146bba798d76291d465621035a83af090a446e0fc3d570396a9c30cd" Apr 16 18:19:10.692165 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:10.692138 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-8m2kw"] Apr 16 18:19:10.692531 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:10.692494 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1bfeb418-2d02-407e-86ed-7f42a2bc6938" containerName="s3-init" Apr 16 18:19:10.692531 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:10.692507 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bfeb418-2d02-407e-86ed-7f42a2bc6938" containerName="s3-init" Apr 16 18:19:10.692614 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:10.692559 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="1bfeb418-2d02-407e-86ed-7f42a2bc6938" containerName="s3-init" Apr 16 18:19:10.695600 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:10.695585 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-8m2kw" Apr 16 18:19:10.698418 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:10.698390 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 18:19:10.698418 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:10.698397 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 16 18:19:10.698418 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:10.698408 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 18:19:10.699553 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:10.699535 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-qsxc9\"" Apr 16 18:19:10.703217 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:10.703198 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-8m2kw"] Apr 16 18:19:10.836609 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:10.836585 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/c5d0aa9d-5039-4261-a920-0772a9dd6b8f-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-8m2kw\" (UID: \"c5d0aa9d-5039-4261-a920-0772a9dd6b8f\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-8m2kw" Apr 16 18:19:10.836728 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:10.836631 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6csc\" (UniqueName: \"kubernetes.io/projected/c5d0aa9d-5039-4261-a920-0772a9dd6b8f-kube-api-access-k6csc\") pod \"seaweedfs-tls-custom-ddd4dbfd-8m2kw\" (UID: \"c5d0aa9d-5039-4261-a920-0772a9dd6b8f\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-8m2kw" Apr 16 18:19:10.937937 ip-10-0-139-96 kubenswrapper[2576]: I0416 
18:19:10.937913 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k6csc\" (UniqueName: \"kubernetes.io/projected/c5d0aa9d-5039-4261-a920-0772a9dd6b8f-kube-api-access-k6csc\") pod \"seaweedfs-tls-custom-ddd4dbfd-8m2kw\" (UID: \"c5d0aa9d-5039-4261-a920-0772a9dd6b8f\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-8m2kw" Apr 16 18:19:10.938029 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:10.937977 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/c5d0aa9d-5039-4261-a920-0772a9dd6b8f-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-8m2kw\" (UID: \"c5d0aa9d-5039-4261-a920-0772a9dd6b8f\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-8m2kw" Apr 16 18:19:10.938274 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:10.938257 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/c5d0aa9d-5039-4261-a920-0772a9dd6b8f-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-8m2kw\" (UID: \"c5d0aa9d-5039-4261-a920-0772a9dd6b8f\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-8m2kw" Apr 16 18:19:10.946300 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:10.946242 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6csc\" (UniqueName: \"kubernetes.io/projected/c5d0aa9d-5039-4261-a920-0772a9dd6b8f-kube-api-access-k6csc\") pod \"seaweedfs-tls-custom-ddd4dbfd-8m2kw\" (UID: \"c5d0aa9d-5039-4261-a920-0772a9dd6b8f\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-8m2kw" Apr 16 18:19:11.005788 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:11.005768 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-8m2kw" Apr 16 18:19:11.119273 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:11.119242 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-8m2kw"] Apr 16 18:19:11.122614 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:19:11.122583 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5d0aa9d_5039_4261_a920_0772a9dd6b8f.slice/crio-f617a8ecb069ba1dab7d4dace139269044c430b30de539e03eee81dbf8a2e21b WatchSource:0}: Error finding container f617a8ecb069ba1dab7d4dace139269044c430b30de539e03eee81dbf8a2e21b: Status 404 returned error can't find the container with id f617a8ecb069ba1dab7d4dace139269044c430b30de539e03eee81dbf8a2e21b Apr 16 18:19:11.969068 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:11.969031 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-8m2kw" event={"ID":"c5d0aa9d-5039-4261-a920-0772a9dd6b8f","Type":"ContainerStarted","Data":"f617a8ecb069ba1dab7d4dace139269044c430b30de539e03eee81dbf8a2e21b"} Apr 16 18:19:13.975539 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:13.975508 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-8m2kw" event={"ID":"c5d0aa9d-5039-4261-a920-0772a9dd6b8f","Type":"ContainerStarted","Data":"49a9fb07b9e1fe5082fc1c64c58c77711e1278bfa599c9adfd808cdfc16c1c5c"} Apr 16 18:19:14.856909 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:14.856851 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-8m2kw" podStartSLOduration=2.377952794 podStartE2EDuration="4.856834989s" podCreationTimestamp="2026-04-16 18:19:10 +0000 UTC" firstStartedPulling="2026-04-16 18:19:11.12400773 +0000 UTC m=+510.325582981" lastFinishedPulling="2026-04-16 18:19:13.602889926 +0000 UTC m=+512.804465176" 
observedRunningTime="2026-04-16 18:19:13.990651425 +0000 UTC m=+513.192226707" watchObservedRunningTime="2026-04-16 18:19:14.856834989 +0000 UTC m=+514.058410257" Apr 16 18:19:14.857863 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:14.857844 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-8m2kw"] Apr 16 18:19:15.981241 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:15.981206 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-8m2kw" podUID="c5d0aa9d-5039-4261-a920-0772a9dd6b8f" containerName="seaweedfs-tls-custom" containerID="cri-o://49a9fb07b9e1fe5082fc1c64c58c77711e1278bfa599c9adfd808cdfc16c1c5c" gracePeriod=30 Apr 16 18:19:17.217782 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:17.217754 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-8m2kw" Apr 16 18:19:17.283464 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:17.283397 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/c5d0aa9d-5039-4261-a920-0772a9dd6b8f-data\") pod \"c5d0aa9d-5039-4261-a920-0772a9dd6b8f\" (UID: \"c5d0aa9d-5039-4261-a920-0772a9dd6b8f\") " Apr 16 18:19:17.283588 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:17.283464 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6csc\" (UniqueName: \"kubernetes.io/projected/c5d0aa9d-5039-4261-a920-0772a9dd6b8f-kube-api-access-k6csc\") pod \"c5d0aa9d-5039-4261-a920-0772a9dd6b8f\" (UID: \"c5d0aa9d-5039-4261-a920-0772a9dd6b8f\") " Apr 16 18:19:17.284590 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:17.284558 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5d0aa9d-5039-4261-a920-0772a9dd6b8f-data" (OuterVolumeSpecName: "data") pod "c5d0aa9d-5039-4261-a920-0772a9dd6b8f" (UID: 
"c5d0aa9d-5039-4261-a920-0772a9dd6b8f"). InnerVolumeSpecName "data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:19:17.285486 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:17.285454 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5d0aa9d-5039-4261-a920-0772a9dd6b8f-kube-api-access-k6csc" (OuterVolumeSpecName: "kube-api-access-k6csc") pod "c5d0aa9d-5039-4261-a920-0772a9dd6b8f" (UID: "c5d0aa9d-5039-4261-a920-0772a9dd6b8f"). InnerVolumeSpecName "kube-api-access-k6csc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:19:17.384115 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:17.384088 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k6csc\" (UniqueName: \"kubernetes.io/projected/c5d0aa9d-5039-4261-a920-0772a9dd6b8f-kube-api-access-k6csc\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:19:17.384115 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:17.384113 2576 reconciler_common.go:299] "Volume detached for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/c5d0aa9d-5039-4261-a920-0772a9dd6b8f-data\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:19:17.992240 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:17.992206 2576 generic.go:358] "Generic (PLEG): container finished" podID="c5d0aa9d-5039-4261-a920-0772a9dd6b8f" containerID="49a9fb07b9e1fe5082fc1c64c58c77711e1278bfa599c9adfd808cdfc16c1c5c" exitCode=0 Apr 16 18:19:17.992392 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:17.992266 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-8m2kw" Apr 16 18:19:17.992392 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:17.992288 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-8m2kw" event={"ID":"c5d0aa9d-5039-4261-a920-0772a9dd6b8f","Type":"ContainerDied","Data":"49a9fb07b9e1fe5082fc1c64c58c77711e1278bfa599c9adfd808cdfc16c1c5c"} Apr 16 18:19:17.992392 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:17.992325 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-8m2kw" event={"ID":"c5d0aa9d-5039-4261-a920-0772a9dd6b8f","Type":"ContainerDied","Data":"f617a8ecb069ba1dab7d4dace139269044c430b30de539e03eee81dbf8a2e21b"} Apr 16 18:19:17.992392 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:17.992341 2576 scope.go:117] "RemoveContainer" containerID="49a9fb07b9e1fe5082fc1c64c58c77711e1278bfa599c9adfd808cdfc16c1c5c" Apr 16 18:19:18.001319 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:18.001304 2576 scope.go:117] "RemoveContainer" containerID="49a9fb07b9e1fe5082fc1c64c58c77711e1278bfa599c9adfd808cdfc16c1c5c" Apr 16 18:19:18.001572 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:19:18.001551 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49a9fb07b9e1fe5082fc1c64c58c77711e1278bfa599c9adfd808cdfc16c1c5c\": container with ID starting with 49a9fb07b9e1fe5082fc1c64c58c77711e1278bfa599c9adfd808cdfc16c1c5c not found: ID does not exist" containerID="49a9fb07b9e1fe5082fc1c64c58c77711e1278bfa599c9adfd808cdfc16c1c5c" Apr 16 18:19:18.001631 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:18.001578 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49a9fb07b9e1fe5082fc1c64c58c77711e1278bfa599c9adfd808cdfc16c1c5c"} err="failed to get container status \"49a9fb07b9e1fe5082fc1c64c58c77711e1278bfa599c9adfd808cdfc16c1c5c\": rpc error: code = 
NotFound desc = could not find container \"49a9fb07b9e1fe5082fc1c64c58c77711e1278bfa599c9adfd808cdfc16c1c5c\": container with ID starting with 49a9fb07b9e1fe5082fc1c64c58c77711e1278bfa599c9adfd808cdfc16c1c5c not found: ID does not exist" Apr 16 18:19:18.008140 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:18.008121 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-8m2kw"] Apr 16 18:19:18.012807 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:18.012790 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-8m2kw"] Apr 16 18:19:19.326622 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:19.326591 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5d0aa9d-5039-4261-a920-0772a9dd6b8f" path="/var/lib/kubelet/pods/c5d0aa9d-5039-4261-a920-0772a9dd6b8f/volumes" Apr 16 18:19:19.336502 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:19.336476 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-custom-db5j8"] Apr 16 18:19:19.336829 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:19.336815 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5d0aa9d-5039-4261-a920-0772a9dd6b8f" containerName="seaweedfs-tls-custom" Apr 16 18:19:19.336888 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:19.336832 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5d0aa9d-5039-4261-a920-0772a9dd6b8f" containerName="seaweedfs-tls-custom" Apr 16 18:19:19.336925 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:19.336887 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c5d0aa9d-5039-4261-a920-0772a9dd6b8f" containerName="seaweedfs-tls-custom" Apr 16 18:19:19.341229 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:19.341211 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-db5j8" Apr 16 18:19:19.343926 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:19.343902 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 16 18:19:19.344027 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:19.343956 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 18:19:19.344027 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:19.343996 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-qsxc9\"" Apr 16 18:19:19.344118 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:19.344079 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 18:19:19.347083 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:19.347061 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-db5j8"] Apr 16 18:19:19.398492 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:19.398465 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62vbw\" (UniqueName: \"kubernetes.io/projected/72534c43-fc4a-4296-a820-9a02a374f5c0-kube-api-access-62vbw\") pod \"s3-tls-init-custom-db5j8\" (UID: \"72534c43-fc4a-4296-a820-9a02a374f5c0\") " pod="kserve/s3-tls-init-custom-db5j8" Apr 16 18:19:19.499245 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:19.499220 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-62vbw\" (UniqueName: \"kubernetes.io/projected/72534c43-fc4a-4296-a820-9a02a374f5c0-kube-api-access-62vbw\") pod \"s3-tls-init-custom-db5j8\" (UID: \"72534c43-fc4a-4296-a820-9a02a374f5c0\") " pod="kserve/s3-tls-init-custom-db5j8" Apr 16 18:19:19.506262 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:19.506242 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-62vbw\" (UniqueName: \"kubernetes.io/projected/72534c43-fc4a-4296-a820-9a02a374f5c0-kube-api-access-62vbw\") pod \"s3-tls-init-custom-db5j8\" (UID: \"72534c43-fc4a-4296-a820-9a02a374f5c0\") " pod="kserve/s3-tls-init-custom-db5j8" Apr 16 18:19:19.662840 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:19.662804 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-db5j8" Apr 16 18:19:19.777178 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:19.777154 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-db5j8"] Apr 16 18:19:19.779558 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:19:19.779529 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72534c43_fc4a_4296_a820_9a02a374f5c0.slice/crio-6eb7b2e9b3abbadd76273b8afbdaf12b7758020aa7df075dc4c33293fda8c6a3 WatchSource:0}: Error finding container 6eb7b2e9b3abbadd76273b8afbdaf12b7758020aa7df075dc4c33293fda8c6a3: Status 404 returned error can't find the container with id 6eb7b2e9b3abbadd76273b8afbdaf12b7758020aa7df075dc4c33293fda8c6a3 Apr 16 18:19:20.000631 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:20.000551 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-db5j8" event={"ID":"72534c43-fc4a-4296-a820-9a02a374f5c0","Type":"ContainerStarted","Data":"3137b847e8e5d79bf5a12e8ab12d1659992136b00d7a2f57fd8fc101601e0bc6"} Apr 16 18:19:20.000631 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:20.000590 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-db5j8" event={"ID":"72534c43-fc4a-4296-a820-9a02a374f5c0","Type":"ContainerStarted","Data":"6eb7b2e9b3abbadd76273b8afbdaf12b7758020aa7df075dc4c33293fda8c6a3"} Apr 16 18:19:20.016797 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:20.016748 2576 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="kserve/s3-tls-init-custom-db5j8" podStartSLOduration=1.016729334 podStartE2EDuration="1.016729334s" podCreationTimestamp="2026-04-16 18:19:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:19:20.014630575 +0000 UTC m=+519.216205844" watchObservedRunningTime="2026-04-16 18:19:20.016729334 +0000 UTC m=+519.218304604" Apr 16 18:19:26.021441 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:26.021402 2576 generic.go:358] "Generic (PLEG): container finished" podID="72534c43-fc4a-4296-a820-9a02a374f5c0" containerID="3137b847e8e5d79bf5a12e8ab12d1659992136b00d7a2f57fd8fc101601e0bc6" exitCode=0 Apr 16 18:19:26.021441 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:26.021444 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-db5j8" event={"ID":"72534c43-fc4a-4296-a820-9a02a374f5c0","Type":"ContainerDied","Data":"3137b847e8e5d79bf5a12e8ab12d1659992136b00d7a2f57fd8fc101601e0bc6"} Apr 16 18:19:27.158349 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:27.158326 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-db5j8" Apr 16 18:19:27.258541 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:27.258517 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62vbw\" (UniqueName: \"kubernetes.io/projected/72534c43-fc4a-4296-a820-9a02a374f5c0-kube-api-access-62vbw\") pod \"72534c43-fc4a-4296-a820-9a02a374f5c0\" (UID: \"72534c43-fc4a-4296-a820-9a02a374f5c0\") " Apr 16 18:19:27.260462 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:27.260441 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72534c43-fc4a-4296-a820-9a02a374f5c0-kube-api-access-62vbw" (OuterVolumeSpecName: "kube-api-access-62vbw") pod "72534c43-fc4a-4296-a820-9a02a374f5c0" (UID: "72534c43-fc4a-4296-a820-9a02a374f5c0"). InnerVolumeSpecName "kube-api-access-62vbw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:19:27.359616 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:27.359589 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-62vbw\" (UniqueName: \"kubernetes.io/projected/72534c43-fc4a-4296-a820-9a02a374f5c0-kube-api-access-62vbw\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:19:28.029060 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:28.029031 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-db5j8" Apr 16 18:19:28.029060 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:28.029038 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-db5j8" event={"ID":"72534c43-fc4a-4296-a820-9a02a374f5c0","Type":"ContainerDied","Data":"6eb7b2e9b3abbadd76273b8afbdaf12b7758020aa7df075dc4c33293fda8c6a3"} Apr 16 18:19:28.029060 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:28.029063 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6eb7b2e9b3abbadd76273b8afbdaf12b7758020aa7df075dc4c33293fda8c6a3" Apr 16 18:19:28.617902 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:28.617869 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-djz97"] Apr 16 18:19:28.618242 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:28.618199 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72534c43-fc4a-4296-a820-9a02a374f5c0" containerName="s3-tls-init-custom" Apr 16 18:19:28.618242 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:28.618210 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="72534c43-fc4a-4296-a820-9a02a374f5c0" containerName="s3-tls-init-custom" Apr 16 18:19:28.618318 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:28.618257 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="72534c43-fc4a-4296-a820-9a02a374f5c0" containerName="s3-tls-init-custom" Apr 16 18:19:28.621335 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:28.621317 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-djz97" Apr 16 18:19:28.623664 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:28.623642 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 18:19:28.624908 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:28.624893 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving\"" Apr 16 18:19:28.625005 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:28.624910 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-qsxc9\"" Apr 16 18:19:28.625005 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:28.624971 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 16 18:19:28.625005 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:28.624982 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 18:19:28.629360 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:28.629339 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-djz97"] Apr 16 18:19:28.669649 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:28.669623 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zms5x\" (UniqueName: \"kubernetes.io/projected/092f6923-5390-47c7-b719-0874730015f9-kube-api-access-zms5x\") pod \"seaweedfs-tls-serving-7fd5766db9-djz97\" (UID: \"092f6923-5390-47c7-b719-0874730015f9\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-djz97" Apr 16 18:19:28.669754 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:28.669666 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: 
\"kubernetes.io/projected/092f6923-5390-47c7-b719-0874730015f9-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-djz97\" (UID: \"092f6923-5390-47c7-b719-0874730015f9\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-djz97" Apr 16 18:19:28.669796 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:28.669776 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/092f6923-5390-47c7-b719-0874730015f9-data\") pod \"seaweedfs-tls-serving-7fd5766db9-djz97\" (UID: \"092f6923-5390-47c7-b719-0874730015f9\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-djz97" Apr 16 18:19:28.770225 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:28.770200 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/092f6923-5390-47c7-b719-0874730015f9-data\") pod \"seaweedfs-tls-serving-7fd5766db9-djz97\" (UID: \"092f6923-5390-47c7-b719-0874730015f9\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-djz97" Apr 16 18:19:28.770319 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:28.770245 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zms5x\" (UniqueName: \"kubernetes.io/projected/092f6923-5390-47c7-b719-0874730015f9-kube-api-access-zms5x\") pod \"seaweedfs-tls-serving-7fd5766db9-djz97\" (UID: \"092f6923-5390-47c7-b719-0874730015f9\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-djz97" Apr 16 18:19:28.770319 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:28.770299 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/092f6923-5390-47c7-b719-0874730015f9-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-djz97\" (UID: \"092f6923-5390-47c7-b719-0874730015f9\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-djz97" Apr 16 18:19:28.770445 ip-10-0-139-96 kubenswrapper[2576]: 
E0416 18:19:28.770429 2576 projected.go:264] Couldn't get secret kserve/seaweedfs-tls-serving: secret "seaweedfs-tls-serving" not found Apr 16 18:19:28.770481 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:19:28.770450 2576 projected.go:194] Error preparing data for projected volume seaweedfs-tls-serving for pod kserve/seaweedfs-tls-serving-7fd5766db9-djz97: secret "seaweedfs-tls-serving" not found Apr 16 18:19:28.770524 ip-10-0-139-96 kubenswrapper[2576]: E0416 18:19:28.770514 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/092f6923-5390-47c7-b719-0874730015f9-seaweedfs-tls-serving podName:092f6923-5390-47c7-b719-0874730015f9 nodeName:}" failed. No retries permitted until 2026-04-16 18:19:29.270494078 +0000 UTC m=+528.472069328 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "seaweedfs-tls-serving" (UniqueName: "kubernetes.io/projected/092f6923-5390-47c7-b719-0874730015f9-seaweedfs-tls-serving") pod "seaweedfs-tls-serving-7fd5766db9-djz97" (UID: "092f6923-5390-47c7-b719-0874730015f9") : secret "seaweedfs-tls-serving" not found Apr 16 18:19:28.770745 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:28.770727 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/092f6923-5390-47c7-b719-0874730015f9-data\") pod \"seaweedfs-tls-serving-7fd5766db9-djz97\" (UID: \"092f6923-5390-47c7-b719-0874730015f9\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-djz97" Apr 16 18:19:28.780747 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:28.780726 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zms5x\" (UniqueName: \"kubernetes.io/projected/092f6923-5390-47c7-b719-0874730015f9-kube-api-access-zms5x\") pod \"seaweedfs-tls-serving-7fd5766db9-djz97\" (UID: \"092f6923-5390-47c7-b719-0874730015f9\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-djz97" Apr 16 18:19:29.275042 ip-10-0-139-96 kubenswrapper[2576]: I0416 
18:19:29.274953 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/092f6923-5390-47c7-b719-0874730015f9-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-djz97\" (UID: \"092f6923-5390-47c7-b719-0874730015f9\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-djz97" Apr 16 18:19:29.277241 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:29.277210 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/092f6923-5390-47c7-b719-0874730015f9-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-djz97\" (UID: \"092f6923-5390-47c7-b719-0874730015f9\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-djz97" Apr 16 18:19:29.531077 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:29.531009 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-djz97" Apr 16 18:19:29.644614 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:29.644584 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-djz97"] Apr 16 18:19:29.647166 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:19:29.647138 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod092f6923_5390_47c7_b719_0874730015f9.slice/crio-a80ee1a6f526a60e6fc8705cf363b79483998595eed8a48b508e2d6e6ae79854 WatchSource:0}: Error finding container a80ee1a6f526a60e6fc8705cf363b79483998595eed8a48b508e2d6e6ae79854: Status 404 returned error can't find the container with id a80ee1a6f526a60e6fc8705cf363b79483998595eed8a48b508e2d6e6ae79854 Apr 16 18:19:30.036681 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:30.036644 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-djz97" 
event={"ID":"092f6923-5390-47c7-b719-0874730015f9","Type":"ContainerStarted","Data":"9bbbdd826719bb6e8a5aa42406e85f98ba69b881d8356dc16d27edcd384c037c"} Apr 16 18:19:30.036681 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:30.036681 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-djz97" event={"ID":"092f6923-5390-47c7-b719-0874730015f9","Type":"ContainerStarted","Data":"a80ee1a6f526a60e6fc8705cf363b79483998595eed8a48b508e2d6e6ae79854"} Apr 16 18:19:30.052347 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:30.052298 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-serving-7fd5766db9-djz97" podStartSLOduration=1.7665745940000002 podStartE2EDuration="2.052282255s" podCreationTimestamp="2026-04-16 18:19:28 +0000 UTC" firstStartedPulling="2026-04-16 18:19:29.64828883 +0000 UTC m=+528.849864076" lastFinishedPulling="2026-04-16 18:19:29.933996476 +0000 UTC m=+529.135571737" observedRunningTime="2026-04-16 18:19:30.051129811 +0000 UTC m=+529.252705079" watchObservedRunningTime="2026-04-16 18:19:30.052282255 +0000 UTC m=+529.253857524" Apr 16 18:19:30.558465 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:30.558430 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-serving-fvsj9"] Apr 16 18:19:30.561711 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:30.561682 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-fvsj9" Apr 16 18:19:30.567645 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:30.567618 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-fvsj9"] Apr 16 18:19:30.591978 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:30.591956 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbpsx\" (UniqueName: \"kubernetes.io/projected/e8addbcd-a056-46a5-b838-372d06344702-kube-api-access-zbpsx\") pod \"s3-tls-init-serving-fvsj9\" (UID: \"e8addbcd-a056-46a5-b838-372d06344702\") " pod="kserve/s3-tls-init-serving-fvsj9" Apr 16 18:19:30.693234 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:30.693201 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zbpsx\" (UniqueName: \"kubernetes.io/projected/e8addbcd-a056-46a5-b838-372d06344702-kube-api-access-zbpsx\") pod \"s3-tls-init-serving-fvsj9\" (UID: \"e8addbcd-a056-46a5-b838-372d06344702\") " pod="kserve/s3-tls-init-serving-fvsj9" Apr 16 18:19:30.701743 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:30.701716 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbpsx\" (UniqueName: \"kubernetes.io/projected/e8addbcd-a056-46a5-b838-372d06344702-kube-api-access-zbpsx\") pod \"s3-tls-init-serving-fvsj9\" (UID: \"e8addbcd-a056-46a5-b838-372d06344702\") " pod="kserve/s3-tls-init-serving-fvsj9" Apr 16 18:19:30.878523 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:30.878472 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-fvsj9" Apr 16 18:19:30.992586 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:30.992563 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-fvsj9"] Apr 16 18:19:30.994595 ip-10-0-139-96 kubenswrapper[2576]: W0416 18:19:30.994572 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8addbcd_a056_46a5_b838_372d06344702.slice/crio-56203e415a0fe1113ecbbb6fc2ae6853fe3cff4b5ea1ef4e6c26983cf94efbd7 WatchSource:0}: Error finding container 56203e415a0fe1113ecbbb6fc2ae6853fe3cff4b5ea1ef4e6c26983cf94efbd7: Status 404 returned error can't find the container with id 56203e415a0fe1113ecbbb6fc2ae6853fe3cff4b5ea1ef4e6c26983cf94efbd7 Apr 16 18:19:31.040186 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:31.040159 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-fvsj9" event={"ID":"e8addbcd-a056-46a5-b838-372d06344702","Type":"ContainerStarted","Data":"56203e415a0fe1113ecbbb6fc2ae6853fe3cff4b5ea1ef4e6c26983cf94efbd7"} Apr 16 18:19:32.044083 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:32.044040 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-fvsj9" event={"ID":"e8addbcd-a056-46a5-b838-372d06344702","Type":"ContainerStarted","Data":"5838b12012e4b01a09652c47625862471ef50acd2c4a66855088b6560f7d5adb"} Apr 16 18:19:32.060189 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:32.060144 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-serving-fvsj9" podStartSLOduration=2.060131056 podStartE2EDuration="2.060131056s" podCreationTimestamp="2026-04-16 18:19:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:19:32.059720872 +0000 UTC m=+531.261296137" watchObservedRunningTime="2026-04-16 
18:19:32.060131056 +0000 UTC m=+531.261706324" Apr 16 18:19:36.058647 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:36.058614 2576 generic.go:358] "Generic (PLEG): container finished" podID="e8addbcd-a056-46a5-b838-372d06344702" containerID="5838b12012e4b01a09652c47625862471ef50acd2c4a66855088b6560f7d5adb" exitCode=0 Apr 16 18:19:36.058996 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:36.058685 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-fvsj9" event={"ID":"e8addbcd-a056-46a5-b838-372d06344702","Type":"ContainerDied","Data":"5838b12012e4b01a09652c47625862471ef50acd2c4a66855088b6560f7d5adb"} Apr 16 18:19:37.193448 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:37.193428 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-fvsj9" Apr 16 18:19:37.242988 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:37.242963 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbpsx\" (UniqueName: \"kubernetes.io/projected/e8addbcd-a056-46a5-b838-372d06344702-kube-api-access-zbpsx\") pod \"e8addbcd-a056-46a5-b838-372d06344702\" (UID: \"e8addbcd-a056-46a5-b838-372d06344702\") " Apr 16 18:19:37.244875 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:37.244853 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8addbcd-a056-46a5-b838-372d06344702-kube-api-access-zbpsx" (OuterVolumeSpecName: "kube-api-access-zbpsx") pod "e8addbcd-a056-46a5-b838-372d06344702" (UID: "e8addbcd-a056-46a5-b838-372d06344702"). InnerVolumeSpecName "kube-api-access-zbpsx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:19:37.343650 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:37.343626 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zbpsx\" (UniqueName: \"kubernetes.io/projected/e8addbcd-a056-46a5-b838-372d06344702-kube-api-access-zbpsx\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 16 18:19:38.065380 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:38.065352 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-fvsj9" Apr 16 18:19:38.065547 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:38.065352 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-fvsj9" event={"ID":"e8addbcd-a056-46a5-b838-372d06344702","Type":"ContainerDied","Data":"56203e415a0fe1113ecbbb6fc2ae6853fe3cff4b5ea1ef4e6c26983cf94efbd7"} Apr 16 18:19:38.065547 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:19:38.065472 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56203e415a0fe1113ecbbb6fc2ae6853fe3cff4b5ea1ef4e6c26983cf94efbd7" Apr 16 18:20:41.229392 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:20:41.229359 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-w28dq_11e9383c-4bf6-4c5c-9dec-a8f2b642aff1/console-operator/1.log" Apr 16 18:20:41.229966 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:20:41.229947 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-w28dq_11e9383c-4bf6-4c5c-9dec-a8f2b642aff1/console-operator/1.log" Apr 16 18:25:41.255953 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:25:41.255875 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-w28dq_11e9383c-4bf6-4c5c-9dec-a8f2b642aff1/console-operator/1.log" Apr 16 18:25:41.258800 ip-10-0-139-96 kubenswrapper[2576]: 
I0416 18:25:41.258778 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-w28dq_11e9383c-4bf6-4c5c-9dec-a8f2b642aff1/console-operator/1.log" Apr 16 18:30:41.283284 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:30:41.283253 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-w28dq_11e9383c-4bf6-4c5c-9dec-a8f2b642aff1/console-operator/1.log" Apr 16 18:30:41.286235 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:30:41.286213 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-w28dq_11e9383c-4bf6-4c5c-9dec-a8f2b642aff1/console-operator/1.log" Apr 16 18:35:41.308769 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:35:41.308748 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-w28dq_11e9383c-4bf6-4c5c-9dec-a8f2b642aff1/console-operator/1.log" Apr 16 18:35:41.312251 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:35:41.312225 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-w28dq_11e9383c-4bf6-4c5c-9dec-a8f2b642aff1/console-operator/1.log" Apr 16 18:40:41.334127 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:40:41.334097 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-w28dq_11e9383c-4bf6-4c5c-9dec-a8f2b642aff1/console-operator/1.log" Apr 16 18:40:41.337570 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:40:41.337549 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-w28dq_11e9383c-4bf6-4c5c-9dec-a8f2b642aff1/console-operator/1.log" Apr 16 18:45:41.357466 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:45:41.357440 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-w28dq_11e9383c-4bf6-4c5c-9dec-a8f2b642aff1/console-operator/1.log" Apr 16 18:45:41.361731 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:45:41.361692 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-w28dq_11e9383c-4bf6-4c5c-9dec-a8f2b642aff1/console-operator/1.log" Apr 16 18:50:41.381030 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:50:41.381002 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-w28dq_11e9383c-4bf6-4c5c-9dec-a8f2b642aff1/console-operator/1.log" Apr 16 18:50:41.385174 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:50:41.385154 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-w28dq_11e9383c-4bf6-4c5c-9dec-a8f2b642aff1/console-operator/1.log" Apr 16 18:55:41.404429 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:55:41.404357 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-w28dq_11e9383c-4bf6-4c5c-9dec-a8f2b642aff1/console-operator/1.log" Apr 16 18:55:41.415923 ip-10-0-139-96 kubenswrapper[2576]: I0416 18:55:41.415899 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-w28dq_11e9383c-4bf6-4c5c-9dec-a8f2b642aff1/console-operator/1.log" Apr 16 19:00:41.433527 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:00:41.433422 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-w28dq_11e9383c-4bf6-4c5c-9dec-a8f2b642aff1/console-operator/1.log" Apr 16 19:00:41.439213 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:00:41.439191 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-w28dq_11e9383c-4bf6-4c5c-9dec-a8f2b642aff1/console-operator/1.log" Apr 16 19:05:41.456397 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:05:41.456288 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-w28dq_11e9383c-4bf6-4c5c-9dec-a8f2b642aff1/console-operator/1.log" Apr 16 19:05:41.462803 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:05:41.462786 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-w28dq_11e9383c-4bf6-4c5c-9dec-a8f2b642aff1/console-operator/1.log" Apr 16 19:10:41.479649 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:10:41.479537 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-w28dq_11e9383c-4bf6-4c5c-9dec-a8f2b642aff1/console-operator/1.log" Apr 16 19:10:41.487403 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:10:41.487385 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-w28dq_11e9383c-4bf6-4c5c-9dec-a8f2b642aff1/console-operator/1.log" Apr 16 19:13:39.147531 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:13:39.147500 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4rwnw/must-gather-wbvhw"] Apr 16 19:13:39.147990 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:13:39.147833 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8addbcd-a056-46a5-b838-372d06344702" containerName="s3-tls-init-serving" Apr 16 19:13:39.147990 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:13:39.147845 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8addbcd-a056-46a5-b838-372d06344702" containerName="s3-tls-init-serving" Apr 16 19:13:39.147990 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:13:39.147921 2576 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="e8addbcd-a056-46a5-b838-372d06344702" containerName="s3-tls-init-serving" Apr 16 19:13:39.151054 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:13:39.151033 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4rwnw/must-gather-wbvhw" Apr 16 19:13:39.153491 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:13:39.153466 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-4rwnw\"/\"openshift-service-ca.crt\"" Apr 16 19:13:39.154795 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:13:39.154768 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-4rwnw\"/\"default-dockercfg-r4sbn\"" Apr 16 19:13:39.154896 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:13:39.154773 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-4rwnw\"/\"kube-root-ca.crt\"" Apr 16 19:13:39.157051 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:13:39.157032 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4rwnw/must-gather-wbvhw"] Apr 16 19:13:39.258873 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:13:39.258836 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzpkq\" (UniqueName: \"kubernetes.io/projected/a4253a0c-4414-4667-9da4-b2acef4dd62a-kube-api-access-xzpkq\") pod \"must-gather-wbvhw\" (UID: \"a4253a0c-4414-4667-9da4-b2acef4dd62a\") " pod="openshift-must-gather-4rwnw/must-gather-wbvhw" Apr 16 19:13:39.259044 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:13:39.258891 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a4253a0c-4414-4667-9da4-b2acef4dd62a-must-gather-output\") pod \"must-gather-wbvhw\" (UID: \"a4253a0c-4414-4667-9da4-b2acef4dd62a\") " 
pod="openshift-must-gather-4rwnw/must-gather-wbvhw" Apr 16 19:13:39.359937 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:13:39.359910 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xzpkq\" (UniqueName: \"kubernetes.io/projected/a4253a0c-4414-4667-9da4-b2acef4dd62a-kube-api-access-xzpkq\") pod \"must-gather-wbvhw\" (UID: \"a4253a0c-4414-4667-9da4-b2acef4dd62a\") " pod="openshift-must-gather-4rwnw/must-gather-wbvhw" Apr 16 19:13:39.360121 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:13:39.359977 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a4253a0c-4414-4667-9da4-b2acef4dd62a-must-gather-output\") pod \"must-gather-wbvhw\" (UID: \"a4253a0c-4414-4667-9da4-b2acef4dd62a\") " pod="openshift-must-gather-4rwnw/must-gather-wbvhw" Apr 16 19:13:39.360293 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:13:39.360270 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a4253a0c-4414-4667-9da4-b2acef4dd62a-must-gather-output\") pod \"must-gather-wbvhw\" (UID: \"a4253a0c-4414-4667-9da4-b2acef4dd62a\") " pod="openshift-must-gather-4rwnw/must-gather-wbvhw" Apr 16 19:13:39.368238 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:13:39.368217 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzpkq\" (UniqueName: \"kubernetes.io/projected/a4253a0c-4414-4667-9da4-b2acef4dd62a-kube-api-access-xzpkq\") pod \"must-gather-wbvhw\" (UID: \"a4253a0c-4414-4667-9da4-b2acef4dd62a\") " pod="openshift-must-gather-4rwnw/must-gather-wbvhw" Apr 16 19:13:39.471496 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:13:39.471440 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4rwnw/must-gather-wbvhw" Apr 16 19:13:39.586187 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:13:39.586153 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4rwnw/must-gather-wbvhw"] Apr 16 19:13:39.589190 ip-10-0-139-96 kubenswrapper[2576]: W0416 19:13:39.589163 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4253a0c_4414_4667_9da4_b2acef4dd62a.slice/crio-a8e9034a404b7b39d8e1dca968e5f95f5ea9d38fc1d838f262348f20c23a279c WatchSource:0}: Error finding container a8e9034a404b7b39d8e1dca968e5f95f5ea9d38fc1d838f262348f20c23a279c: Status 404 returned error can't find the container with id a8e9034a404b7b39d8e1dca968e5f95f5ea9d38fc1d838f262348f20c23a279c Apr 16 19:13:39.590979 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:13:39.590965 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 19:13:39.703399 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:13:39.703369 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4rwnw/must-gather-wbvhw" event={"ID":"a4253a0c-4414-4667-9da4-b2acef4dd62a","Type":"ContainerStarted","Data":"a8e9034a404b7b39d8e1dca968e5f95f5ea9d38fc1d838f262348f20c23a279c"} Apr 16 19:13:44.722720 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:13:44.722664 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4rwnw/must-gather-wbvhw" event={"ID":"a4253a0c-4414-4667-9da4-b2acef4dd62a","Type":"ContainerStarted","Data":"ee5aab44cf36c1c9e837dc0e6967b63f8f4eb5639293e086e504dc9eff6b8699"} Apr 16 19:13:44.722720 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:13:44.722723 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4rwnw/must-gather-wbvhw" 
event={"ID":"a4253a0c-4414-4667-9da4-b2acef4dd62a","Type":"ContainerStarted","Data":"c2c550979542cc96207ca205dd0dd629c293182826a01d8bd0ccff2329c60b03"} Apr 16 19:13:44.740091 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:13:44.740042 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4rwnw/must-gather-wbvhw" podStartSLOduration=1.623069291 podStartE2EDuration="5.740028892s" podCreationTimestamp="2026-04-16 19:13:39 +0000 UTC" firstStartedPulling="2026-04-16 19:13:39.591085219 +0000 UTC m=+3778.792660469" lastFinishedPulling="2026-04-16 19:13:43.708044819 +0000 UTC m=+3782.909620070" observedRunningTime="2026-04-16 19:13:44.737541649 +0000 UTC m=+3783.939116917" watchObservedRunningTime="2026-04-16 19:13:44.740028892 +0000 UTC m=+3783.941604160" Apr 16 19:14:03.784542 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:03.784508 2576 generic.go:358] "Generic (PLEG): container finished" podID="a4253a0c-4414-4667-9da4-b2acef4dd62a" containerID="c2c550979542cc96207ca205dd0dd629c293182826a01d8bd0ccff2329c60b03" exitCode=0 Apr 16 19:14:03.784978 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:03.784580 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4rwnw/must-gather-wbvhw" event={"ID":"a4253a0c-4414-4667-9da4-b2acef4dd62a","Type":"ContainerDied","Data":"c2c550979542cc96207ca205dd0dd629c293182826a01d8bd0ccff2329c60b03"} Apr 16 19:14:03.784978 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:03.784916 2576 scope.go:117] "RemoveContainer" containerID="c2c550979542cc96207ca205dd0dd629c293182826a01d8bd0ccff2329c60b03" Apr 16 19:14:03.867873 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:03.867841 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4rwnw_must-gather-wbvhw_a4253a0c-4414-4667-9da4-b2acef4dd62a/gather/0.log" Apr 16 19:14:04.492469 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:04.492441 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-p86h9/must-gather-4c885"] Apr 16 19:14:04.494994 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:04.494976 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p86h9/must-gather-4c885" Apr 16 19:14:04.497362 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:04.497338 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-p86h9\"/\"openshift-service-ca.crt\"" Apr 16 19:14:04.498604 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:04.498584 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-p86h9\"/\"kube-root-ca.crt\"" Apr 16 19:14:04.498604 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:04.498596 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-p86h9\"/\"default-dockercfg-j49c7\"" Apr 16 19:14:04.503022 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:04.503003 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p86h9/must-gather-4c885"] Apr 16 19:14:04.588892 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:04.588857 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ae1ba0e7-3a40-4b78-894b-d90fd6c4e91a-must-gather-output\") pod \"must-gather-4c885\" (UID: \"ae1ba0e7-3a40-4b78-894b-d90fd6c4e91a\") " pod="openshift-must-gather-p86h9/must-gather-4c885" Apr 16 19:14:04.589016 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:04.588952 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmgdc\" (UniqueName: \"kubernetes.io/projected/ae1ba0e7-3a40-4b78-894b-d90fd6c4e91a-kube-api-access-zmgdc\") pod \"must-gather-4c885\" (UID: \"ae1ba0e7-3a40-4b78-894b-d90fd6c4e91a\") " pod="openshift-must-gather-p86h9/must-gather-4c885" Apr 16 
19:14:04.689957 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:04.689928 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ae1ba0e7-3a40-4b78-894b-d90fd6c4e91a-must-gather-output\") pod \"must-gather-4c885\" (UID: \"ae1ba0e7-3a40-4b78-894b-d90fd6c4e91a\") " pod="openshift-must-gather-p86h9/must-gather-4c885" Apr 16 19:14:04.690078 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:04.689985 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zmgdc\" (UniqueName: \"kubernetes.io/projected/ae1ba0e7-3a40-4b78-894b-d90fd6c4e91a-kube-api-access-zmgdc\") pod \"must-gather-4c885\" (UID: \"ae1ba0e7-3a40-4b78-894b-d90fd6c4e91a\") " pod="openshift-must-gather-p86h9/must-gather-4c885" Apr 16 19:14:04.690260 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:04.690233 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ae1ba0e7-3a40-4b78-894b-d90fd6c4e91a-must-gather-output\") pod \"must-gather-4c885\" (UID: \"ae1ba0e7-3a40-4b78-894b-d90fd6c4e91a\") " pod="openshift-must-gather-p86h9/must-gather-4c885" Apr 16 19:14:04.698482 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:04.698458 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmgdc\" (UniqueName: \"kubernetes.io/projected/ae1ba0e7-3a40-4b78-894b-d90fd6c4e91a-kube-api-access-zmgdc\") pod \"must-gather-4c885\" (UID: \"ae1ba0e7-3a40-4b78-894b-d90fd6c4e91a\") " pod="openshift-must-gather-p86h9/must-gather-4c885" Apr 16 19:14:04.804300 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:04.804237 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-p86h9/must-gather-4c885"
Apr 16 19:14:04.921188 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:04.921151 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p86h9/must-gather-4c885"]
Apr 16 19:14:04.923465 ip-10-0-139-96 kubenswrapper[2576]: W0416 19:14:04.923437 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae1ba0e7_3a40_4b78_894b_d90fd6c4e91a.slice/crio-2570844bc07efd3a49329393868493743e2587731d8950e5887b2d05bf7c458a WatchSource:0}: Error finding container 2570844bc07efd3a49329393868493743e2587731d8950e5887b2d05bf7c458a: Status 404 returned error can't find the container with id 2570844bc07efd3a49329393868493743e2587731d8950e5887b2d05bf7c458a
Apr 16 19:14:05.792458 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:05.792411 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p86h9/must-gather-4c885" event={"ID":"ae1ba0e7-3a40-4b78-894b-d90fd6c4e91a","Type":"ContainerStarted","Data":"2570844bc07efd3a49329393868493743e2587731d8950e5887b2d05bf7c458a"}
Apr 16 19:14:06.797764 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:06.797726 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p86h9/must-gather-4c885" event={"ID":"ae1ba0e7-3a40-4b78-894b-d90fd6c4e91a","Type":"ContainerStarted","Data":"0ea02ce11cf40062abfe21a7ae2b94ee5132f98b3f0e163aadc2b2d5b65eeb11"}
Apr 16 19:14:06.797764 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:06.797768 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p86h9/must-gather-4c885" event={"ID":"ae1ba0e7-3a40-4b78-894b-d90fd6c4e91a","Type":"ContainerStarted","Data":"f900f10b2ff92627dd0ce1652a4bbc4e495d8c2bf1c454b2c5f9297ff51c889c"}
Apr 16 19:14:06.815348 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:06.815289 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-p86h9/must-gather-4c885" podStartSLOduration=2.019985101 podStartE2EDuration="2.81527001s" podCreationTimestamp="2026-04-16 19:14:04 +0000 UTC" firstStartedPulling="2026-04-16 19:14:04.92523413 +0000 UTC m=+3804.126809379" lastFinishedPulling="2026-04-16 19:14:05.720519039 +0000 UTC m=+3804.922094288" observedRunningTime="2026-04-16 19:14:06.814188295 +0000 UTC m=+3806.015763582" watchObservedRunningTime="2026-04-16 19:14:06.81527001 +0000 UTC m=+3806.016845280"
Apr 16 19:14:07.169515 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:07.169483 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-8c5pl_2446905a-5c73-4161-8c69-995947142aa0/global-pull-secret-syncer/0.log"
Apr 16 19:14:07.306144 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:07.306088 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-d949z_20b76c8e-45ad-41f5-b3f5-1cc2a66d4881/konnectivity-agent/0.log"
Apr 16 19:14:07.364610 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:07.364583 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-96.ec2.internal_ed8c920fb76e0e328ed5f4aa00cb172f/haproxy/0.log"
Apr 16 19:14:09.320538 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:09.320495 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4rwnw/must-gather-wbvhw"]
Apr 16 19:14:09.321103 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:09.320805 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-4rwnw/must-gather-wbvhw" podUID="a4253a0c-4414-4667-9da4-b2acef4dd62a" containerName="copy" containerID="cri-o://ee5aab44cf36c1c9e837dc0e6967b63f8f4eb5639293e086e504dc9eff6b8699" gracePeriod=2
Apr 16 19:14:09.325772 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:09.325736 2576 status_manager.go:895] "Failed to get status for pod" podUID="a4253a0c-4414-4667-9da4-b2acef4dd62a" pod="openshift-must-gather-4rwnw/must-gather-wbvhw" err="pods \"must-gather-wbvhw\" is forbidden: User \"system:node:ip-10-0-139-96.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-4rwnw\": no relationship found between node 'ip-10-0-139-96.ec2.internal' and this object"
Apr 16 19:14:09.332096 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:09.332065 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4rwnw/must-gather-wbvhw"]
Apr 16 19:14:09.686716 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:09.686080 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4rwnw_must-gather-wbvhw_a4253a0c-4414-4667-9da4-b2acef4dd62a/copy/0.log"
Apr 16 19:14:09.686716 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:09.686489 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4rwnw/must-gather-wbvhw"
Apr 16 19:14:09.688986 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:09.688944 2576 status_manager.go:895] "Failed to get status for pod" podUID="a4253a0c-4414-4667-9da4-b2acef4dd62a" pod="openshift-must-gather-4rwnw/must-gather-wbvhw" err="pods \"must-gather-wbvhw\" is forbidden: User \"system:node:ip-10-0-139-96.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-4rwnw\": no relationship found between node 'ip-10-0-139-96.ec2.internal' and this object"
Apr 16 19:14:09.745636 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:09.745590 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzpkq\" (UniqueName: \"kubernetes.io/projected/a4253a0c-4414-4667-9da4-b2acef4dd62a-kube-api-access-xzpkq\") pod \"a4253a0c-4414-4667-9da4-b2acef4dd62a\" (UID: \"a4253a0c-4414-4667-9da4-b2acef4dd62a\") "
Apr 16 19:14:09.746439 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:09.746387 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a4253a0c-4414-4667-9da4-b2acef4dd62a-must-gather-output\") pod \"a4253a0c-4414-4667-9da4-b2acef4dd62a\" (UID: \"a4253a0c-4414-4667-9da4-b2acef4dd62a\") "
Apr 16 19:14:09.749008 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:09.748028 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4253a0c-4414-4667-9da4-b2acef4dd62a-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "a4253a0c-4414-4667-9da4-b2acef4dd62a" (UID: "a4253a0c-4414-4667-9da4-b2acef4dd62a"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 19:14:09.749444 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:09.749414 2576 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a4253a0c-4414-4667-9da4-b2acef4dd62a-must-gather-output\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\""
Apr 16 19:14:09.752154 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:09.752119 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4253a0c-4414-4667-9da4-b2acef4dd62a-kube-api-access-xzpkq" (OuterVolumeSpecName: "kube-api-access-xzpkq") pod "a4253a0c-4414-4667-9da4-b2acef4dd62a" (UID: "a4253a0c-4414-4667-9da4-b2acef4dd62a"). InnerVolumeSpecName "kube-api-access-xzpkq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:14:09.811338 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:09.811301 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4rwnw_must-gather-wbvhw_a4253a0c-4414-4667-9da4-b2acef4dd62a/copy/0.log"
Apr 16 19:14:09.811836 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:09.811807 2576 generic.go:358] "Generic (PLEG): container finished" podID="a4253a0c-4414-4667-9da4-b2acef4dd62a" containerID="ee5aab44cf36c1c9e837dc0e6967b63f8f4eb5639293e086e504dc9eff6b8699" exitCode=143
Apr 16 19:14:09.811947 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:09.811921 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4rwnw/must-gather-wbvhw"
Apr 16 19:14:09.812669 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:09.812137 2576 scope.go:117] "RemoveContainer" containerID="ee5aab44cf36c1c9e837dc0e6967b63f8f4eb5639293e086e504dc9eff6b8699"
Apr 16 19:14:09.816290 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:09.816259 2576 status_manager.go:895] "Failed to get status for pod" podUID="a4253a0c-4414-4667-9da4-b2acef4dd62a" pod="openshift-must-gather-4rwnw/must-gather-wbvhw" err="pods \"must-gather-wbvhw\" is forbidden: User \"system:node:ip-10-0-139-96.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-4rwnw\": no relationship found between node 'ip-10-0-139-96.ec2.internal' and this object"
Apr 16 19:14:09.829228 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:09.829200 2576 status_manager.go:895] "Failed to get status for pod" podUID="a4253a0c-4414-4667-9da4-b2acef4dd62a" pod="openshift-must-gather-4rwnw/must-gather-wbvhw" err="pods \"must-gather-wbvhw\" is forbidden: User \"system:node:ip-10-0-139-96.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-4rwnw\": no relationship found between node 'ip-10-0-139-96.ec2.internal' and this object"
Apr 16 19:14:09.837720 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:09.837678 2576 scope.go:117] "RemoveContainer" containerID="c2c550979542cc96207ca205dd0dd629c293182826a01d8bd0ccff2329c60b03"
Apr 16 19:14:09.854210 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:09.854187 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xzpkq\" (UniqueName: \"kubernetes.io/projected/a4253a0c-4414-4667-9da4-b2acef4dd62a-kube-api-access-xzpkq\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\""
Apr 16 19:14:09.856650 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:09.856636 2576 scope.go:117] "RemoveContainer" containerID="ee5aab44cf36c1c9e837dc0e6967b63f8f4eb5639293e086e504dc9eff6b8699"
Apr 16 19:14:09.857821 ip-10-0-139-96 kubenswrapper[2576]: E0416 19:14:09.857787 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee5aab44cf36c1c9e837dc0e6967b63f8f4eb5639293e086e504dc9eff6b8699\": container with ID starting with ee5aab44cf36c1c9e837dc0e6967b63f8f4eb5639293e086e504dc9eff6b8699 not found: ID does not exist" containerID="ee5aab44cf36c1c9e837dc0e6967b63f8f4eb5639293e086e504dc9eff6b8699"
Apr 16 19:14:09.857999 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:09.857972 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee5aab44cf36c1c9e837dc0e6967b63f8f4eb5639293e086e504dc9eff6b8699"} err="failed to get container status \"ee5aab44cf36c1c9e837dc0e6967b63f8f4eb5639293e086e504dc9eff6b8699\": rpc error: code = NotFound desc = could not find container \"ee5aab44cf36c1c9e837dc0e6967b63f8f4eb5639293e086e504dc9eff6b8699\": container with ID starting with ee5aab44cf36c1c9e837dc0e6967b63f8f4eb5639293e086e504dc9eff6b8699 not found: ID does not exist"
Apr 16 19:14:09.858112 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:09.858099 2576 scope.go:117] "RemoveContainer" containerID="c2c550979542cc96207ca205dd0dd629c293182826a01d8bd0ccff2329c60b03"
Apr 16 19:14:09.858525 ip-10-0-139-96 kubenswrapper[2576]: E0416 19:14:09.858459 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2c550979542cc96207ca205dd0dd629c293182826a01d8bd0ccff2329c60b03\": container with ID starting with c2c550979542cc96207ca205dd0dd629c293182826a01d8bd0ccff2329c60b03 not found: ID does not exist" containerID="c2c550979542cc96207ca205dd0dd629c293182826a01d8bd0ccff2329c60b03"
Apr 16 19:14:09.858525 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:09.858489 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2c550979542cc96207ca205dd0dd629c293182826a01d8bd0ccff2329c60b03"} err="failed to get container status \"c2c550979542cc96207ca205dd0dd629c293182826a01d8bd0ccff2329c60b03\": rpc error: code = NotFound desc = could not find container \"c2c550979542cc96207ca205dd0dd629c293182826a01d8bd0ccff2329c60b03\": container with ID starting with c2c550979542cc96207ca205dd0dd629c293182826a01d8bd0ccff2329c60b03 not found: ID does not exist"
Apr 16 19:14:10.864222 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:10.864192 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a6188869-8c94-4bfd-8639-66d342e6af7d/alertmanager/0.log"
Apr 16 19:14:10.887527 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:10.887485 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a6188869-8c94-4bfd-8639-66d342e6af7d/config-reloader/0.log"
Apr 16 19:14:10.908360 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:10.908266 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a6188869-8c94-4bfd-8639-66d342e6af7d/kube-rbac-proxy-web/0.log"
Apr 16 19:14:10.931630 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:10.931597 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a6188869-8c94-4bfd-8639-66d342e6af7d/kube-rbac-proxy/0.log"
Apr 16 19:14:10.957602 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:10.957528 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a6188869-8c94-4bfd-8639-66d342e6af7d/kube-rbac-proxy-metric/0.log"
Apr 16 19:14:10.980471 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:10.980449 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a6188869-8c94-4bfd-8639-66d342e6af7d/prom-label-proxy/0.log"
Apr 16 19:14:11.003188 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:11.003160 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a6188869-8c94-4bfd-8639-66d342e6af7d/init-config-reloader/0.log"
Apr 16 19:14:11.052484 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:11.052452 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-j6fk5_273ddeef-93ac-489e-893f-a85a3c28bdb6/cluster-monitoring-operator/0.log"
Apr 16 19:14:11.183199 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:11.183164 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-dclv7_f3fc3a09-3d9e-4d57-89b8-557e15fa868c/monitoring-plugin/0.log"
Apr 16 19:14:11.213939 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:11.213852 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-59zm6_50df0bc0-8825-4d04-bc78-cf3f1ee8887e/node-exporter/0.log"
Apr 16 19:14:11.237625 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:11.237600 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-59zm6_50df0bc0-8825-4d04-bc78-cf3f1ee8887e/kube-rbac-proxy/0.log"
Apr 16 19:14:11.263397 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:11.263368 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-59zm6_50df0bc0-8825-4d04-bc78-cf3f1ee8887e/init-textfile/0.log"
Apr 16 19:14:11.329266 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:11.329226 2576 status_manager.go:895] "Failed to get status for pod" podUID="a4253a0c-4414-4667-9da4-b2acef4dd62a" pod="openshift-must-gather-4rwnw/must-gather-wbvhw" err="pods \"must-gather-wbvhw\" is forbidden: User \"system:node:ip-10-0-139-96.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-4rwnw\": no relationship found between node 'ip-10-0-139-96.ec2.internal' and this object"
Apr 16 19:14:11.330864 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:11.330841 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4253a0c-4414-4667-9da4-b2acef4dd62a" path="/var/lib/kubelet/pods/a4253a0c-4414-4667-9da4-b2acef4dd62a/volumes"
Apr 16 19:14:11.537915 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:11.537826 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0a371e60-be7a-4ac0-ae72-3b033c2a5433/prometheus/0.log"
Apr 16 19:14:11.554723 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:11.554648 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0a371e60-be7a-4ac0-ae72-3b033c2a5433/config-reloader/0.log"
Apr 16 19:14:11.574522 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:11.574493 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0a371e60-be7a-4ac0-ae72-3b033c2a5433/thanos-sidecar/0.log"
Apr 16 19:14:11.595811 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:11.595775 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0a371e60-be7a-4ac0-ae72-3b033c2a5433/kube-rbac-proxy-web/0.log"
Apr 16 19:14:11.617060 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:11.617032 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0a371e60-be7a-4ac0-ae72-3b033c2a5433/kube-rbac-proxy/0.log"
Apr 16 19:14:11.639375 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:11.639349 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0a371e60-be7a-4ac0-ae72-3b033c2a5433/kube-rbac-proxy-thanos/0.log"
Apr 16 19:14:11.661791 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:11.661761 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0a371e60-be7a-4ac0-ae72-3b033c2a5433/init-config-reloader/0.log"
Apr 16 19:14:11.692220 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:11.692191 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-mgj2v_bd8e1e3f-f92a-4c3d-a28b-78213622057d/prometheus-operator/0.log"
Apr 16 19:14:11.709620 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:11.709588 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-mgj2v_bd8e1e3f-f92a-4c3d-a28b-78213622057d/kube-rbac-proxy/0.log"
Apr 16 19:14:11.733854 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:11.733828 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-9cb97cd87-hs56f_78aaf263-7020-40a9-b797-efba25fa39de/prometheus-operator-admission-webhook/0.log"
Apr 16 19:14:11.763007 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:11.762976 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-77cd4447c-rxvpr_42657ce0-b347-4e89-84f7-5766710baf5f/telemeter-client/0.log"
Apr 16 19:14:11.782683 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:11.782654 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-77cd4447c-rxvpr_42657ce0-b347-4e89-84f7-5766710baf5f/reload/0.log"
Apr 16 19:14:11.806285 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:11.806203 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-77cd4447c-rxvpr_42657ce0-b347-4e89-84f7-5766710baf5f/kube-rbac-proxy/0.log"
Apr 16 19:14:11.841850 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:11.841819 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-66dc987d6f-rjrtv_26885632-9180-4f43-8bf2-d905b1c6e357/thanos-query/0.log"
Apr 16 19:14:11.863791 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:11.863760 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-66dc987d6f-rjrtv_26885632-9180-4f43-8bf2-d905b1c6e357/kube-rbac-proxy-web/0.log"
Apr 16 19:14:11.884502 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:11.884467 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-66dc987d6f-rjrtv_26885632-9180-4f43-8bf2-d905b1c6e357/kube-rbac-proxy/0.log"
Apr 16 19:14:11.907833 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:11.907797 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-66dc987d6f-rjrtv_26885632-9180-4f43-8bf2-d905b1c6e357/prom-label-proxy/0.log"
Apr 16 19:14:11.936100 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:11.936062 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-66dc987d6f-rjrtv_26885632-9180-4f43-8bf2-d905b1c6e357/kube-rbac-proxy-rules/0.log"
Apr 16 19:14:11.960658 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:11.960620 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-66dc987d6f-rjrtv_26885632-9180-4f43-8bf2-d905b1c6e357/kube-rbac-proxy-metrics/0.log"
Apr 16 19:14:13.494726 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:13.494683 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-w28dq_11e9383c-4bf6-4c5c-9dec-a8f2b642aff1/console-operator/1.log"
Apr 16 19:14:13.500362 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:13.500339 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-w28dq_11e9383c-4bf6-4c5c-9dec-a8f2b642aff1/console-operator/2.log"
Apr 16 19:14:14.076844 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:14.076808 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-p86h9/perf-node-gather-daemonset-q5pqf"]
Apr 16 19:14:14.077330 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:14.077316 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4253a0c-4414-4667-9da4-b2acef4dd62a" containerName="copy"
Apr 16 19:14:14.077373 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:14.077334 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4253a0c-4414-4667-9da4-b2acef4dd62a" containerName="copy"
Apr 16 19:14:14.077373 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:14.077358 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4253a0c-4414-4667-9da4-b2acef4dd62a" containerName="gather"
Apr 16 19:14:14.077373 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:14.077368 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4253a0c-4414-4667-9da4-b2acef4dd62a" containerName="gather"
Apr 16 19:14:14.077469 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:14.077446 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a4253a0c-4414-4667-9da4-b2acef4dd62a" containerName="gather"
Apr 16 19:14:14.077469 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:14.077461 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a4253a0c-4414-4667-9da4-b2acef4dd62a" containerName="copy"
Apr 16 19:14:14.081946 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:14.081923 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-q5pqf"
Apr 16 19:14:14.089171 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:14.089142 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p86h9/perf-node-gather-daemonset-q5pqf"]
Apr 16 19:14:14.196625 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:14.196585 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/678f8c60-0720-4b0e-a9ee-e7f2d363a5d2-proc\") pod \"perf-node-gather-daemonset-q5pqf\" (UID: \"678f8c60-0720-4b0e-a9ee-e7f2d363a5d2\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-q5pqf"
Apr 16 19:14:14.196811 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:14.196636 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/678f8c60-0720-4b0e-a9ee-e7f2d363a5d2-podres\") pod \"perf-node-gather-daemonset-q5pqf\" (UID: \"678f8c60-0720-4b0e-a9ee-e7f2d363a5d2\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-q5pqf"
Apr 16 19:14:14.196811 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:14.196742 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/678f8c60-0720-4b0e-a9ee-e7f2d363a5d2-sys\") pod \"perf-node-gather-daemonset-q5pqf\" (UID: \"678f8c60-0720-4b0e-a9ee-e7f2d363a5d2\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-q5pqf"
Apr 16 19:14:14.196811 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:14.196772 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/678f8c60-0720-4b0e-a9ee-e7f2d363a5d2-lib-modules\") pod \"perf-node-gather-daemonset-q5pqf\" (UID: \"678f8c60-0720-4b0e-a9ee-e7f2d363a5d2\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-q5pqf"
Apr 16 19:14:14.196955 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:14.196860 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zckv\" (UniqueName: \"kubernetes.io/projected/678f8c60-0720-4b0e-a9ee-e7f2d363a5d2-kube-api-access-6zckv\") pod \"perf-node-gather-daemonset-q5pqf\" (UID: \"678f8c60-0720-4b0e-a9ee-e7f2d363a5d2\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-q5pqf"
Apr 16 19:14:14.298198 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:14.298155 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/678f8c60-0720-4b0e-a9ee-e7f2d363a5d2-proc\") pod \"perf-node-gather-daemonset-q5pqf\" (UID: \"678f8c60-0720-4b0e-a9ee-e7f2d363a5d2\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-q5pqf"
Apr 16 19:14:14.298366 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:14.298205 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/678f8c60-0720-4b0e-a9ee-e7f2d363a5d2-podres\") pod \"perf-node-gather-daemonset-q5pqf\" (UID: \"678f8c60-0720-4b0e-a9ee-e7f2d363a5d2\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-q5pqf"
Apr 16 19:14:14.298366 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:14.298301 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/678f8c60-0720-4b0e-a9ee-e7f2d363a5d2-sys\") pod \"perf-node-gather-daemonset-q5pqf\" (UID: \"678f8c60-0720-4b0e-a9ee-e7f2d363a5d2\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-q5pqf"
Apr 16 19:14:14.298366 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:14.298355 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/678f8c60-0720-4b0e-a9ee-e7f2d363a5d2-lib-modules\") pod \"perf-node-gather-daemonset-q5pqf\" (UID: \"678f8c60-0720-4b0e-a9ee-e7f2d363a5d2\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-q5pqf"
Apr 16 19:14:14.298555 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:14.298440 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6zckv\" (UniqueName: \"kubernetes.io/projected/678f8c60-0720-4b0e-a9ee-e7f2d363a5d2-kube-api-access-6zckv\") pod \"perf-node-gather-daemonset-q5pqf\" (UID: \"678f8c60-0720-4b0e-a9ee-e7f2d363a5d2\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-q5pqf"
Apr 16 19:14:14.298854 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:14.298827 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/678f8c60-0720-4b0e-a9ee-e7f2d363a5d2-proc\") pod \"perf-node-gather-daemonset-q5pqf\" (UID: \"678f8c60-0720-4b0e-a9ee-e7f2d363a5d2\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-q5pqf"
Apr 16 19:14:14.298971 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:14.298853 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/678f8c60-0720-4b0e-a9ee-e7f2d363a5d2-podres\") pod \"perf-node-gather-daemonset-q5pqf\" (UID: \"678f8c60-0720-4b0e-a9ee-e7f2d363a5d2\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-q5pqf"
Apr 16 19:14:14.298971 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:14.298838 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/678f8c60-0720-4b0e-a9ee-e7f2d363a5d2-sys\") pod \"perf-node-gather-daemonset-q5pqf\" (UID: \"678f8c60-0720-4b0e-a9ee-e7f2d363a5d2\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-q5pqf"
Apr 16 19:14:14.298971 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:14.298884 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/678f8c60-0720-4b0e-a9ee-e7f2d363a5d2-lib-modules\") pod \"perf-node-gather-daemonset-q5pqf\" (UID: \"678f8c60-0720-4b0e-a9ee-e7f2d363a5d2\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-q5pqf"
Apr 16 19:14:14.306818 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:14.306795 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zckv\" (UniqueName: \"kubernetes.io/projected/678f8c60-0720-4b0e-a9ee-e7f2d363a5d2-kube-api-access-6zckv\") pod \"perf-node-gather-daemonset-q5pqf\" (UID: \"678f8c60-0720-4b0e-a9ee-e7f2d363a5d2\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-q5pqf"
Apr 16 19:14:14.395336 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:14.395312 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-q5pqf"
Apr 16 19:14:14.530474 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:14.530446 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p86h9/perf-node-gather-daemonset-q5pqf"]
Apr 16 19:14:14.850319 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:14.850223 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-q5pqf" event={"ID":"678f8c60-0720-4b0e-a9ee-e7f2d363a5d2","Type":"ContainerStarted","Data":"1986dad76f7fc04b0fc84acf1add05ee31013b515dd35be9cff46e192618ef5f"}
Apr 16 19:14:14.850319 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:14.850279 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-q5pqf"
Apr 16 19:14:14.850319 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:14.850290 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-q5pqf" event={"ID":"678f8c60-0720-4b0e-a9ee-e7f2d363a5d2","Type":"ContainerStarted","Data":"db79ddf03b2572b55257f88003fcba2cd3347f0c38412a3dce37881eb735e5b3"}
Apr 16 19:14:14.866717 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:14.866650 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-q5pqf" podStartSLOduration=0.866632652 podStartE2EDuration="866.632652ms" podCreationTimestamp="2026-04-16 19:14:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:14:14.865106668 +0000 UTC m=+3814.066681948" watchObservedRunningTime="2026-04-16 19:14:14.866632652 +0000 UTC m=+3814.068207921"
Apr 16 19:14:14.937210 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:14.937183 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-5mn52_a54e43de-5a67-45f3-b403-4317caee2eca/dns/0.log"
Apr 16 19:14:14.960530 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:14.960500 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-5mn52_a54e43de-5a67-45f3-b403-4317caee2eca/kube-rbac-proxy/0.log"
Apr 16 19:14:15.091432 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:15.091405 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mb8s7_fb4955c9-d4b5-4e21-a2ca-4d700832a59c/dns-node-resolver/0.log"
Apr 16 19:14:15.571096 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:15.571055 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jmzcb_76ccc7ff-6855-49c3-a0b5-185487ae8516/node-ca/0.log"
Apr 16 19:14:16.532425 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:16.532399 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-clw7f_31b8e533-ba32-44f3-b6db-5c9e368510c6/serve-healthcheck-canary/0.log"
Apr 16 19:14:16.904965 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:16.904936 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-f6kdh_8d34f403-81ae-4142-98c8-5c0168280de0/insights-operator/0.log"
Apr 16 19:14:16.906032 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:16.906009 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-f6kdh_8d34f403-81ae-4142-98c8-5c0168280de0/insights-operator/1.log"
Apr 16 19:14:17.069978 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:17.069951 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-78gqq_c3a8c67b-0001-42d8-bb3f-f86472b945ff/kube-rbac-proxy/0.log"
Apr 16 19:14:17.089599 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:17.089579 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-78gqq_c3a8c67b-0001-42d8-bb3f-f86472b945ff/exporter/0.log"
Apr 16 19:14:17.109662 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:17.109628 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-78gqq_c3a8c67b-0001-42d8-bb3f-f86472b945ff/extractor/0.log"
Apr 16 19:14:19.396912 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:19.396887 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-f7ljs_1bfeb418-2d02-407e-86ed-7f42a2bc6938/s3-init/0.log"
Apr 16 19:14:19.445771 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:19.445745 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-custom-db5j8_72534c43-fc4a-4296-a820-9a02a374f5c0/s3-tls-init-custom/0.log"
Apr 16 19:14:19.467283 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:19.467254 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-serving-fvsj9_e8addbcd-a056-46a5-b838-372d06344702/s3-tls-init-serving/0.log"
Apr 16 19:14:19.536271 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:19.536248 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-serving-7fd5766db9-djz97_092f6923-5390-47c7-b719-0874730015f9/seaweedfs-tls-serving/0.log"
Apr 16 19:14:20.867682 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:20.867652 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-q5pqf"
Apr 16 19:14:23.216662 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:23.216580 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-6ctrp_23b3ce03-bb68-4ac2-a1fe-04780069ad4d/migrator/0.log"
Apr 16 19:14:23.241328 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:23.241297 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-6ctrp_23b3ce03-bb68-4ac2-a1fe-04780069ad4d/graceful-termination/0.log"
Apr 16 19:14:24.652850 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:24.652824 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k5m7r_439613fb-5d3f-4d29-b662-f86a49f8e289/kube-multus-additional-cni-plugins/0.log"
Apr 16 19:14:24.672245 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:24.672221 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k5m7r_439613fb-5d3f-4d29-b662-f86a49f8e289/egress-router-binary-copy/0.log"
Apr 16 19:14:24.703541 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:24.703516 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k5m7r_439613fb-5d3f-4d29-b662-f86a49f8e289/cni-plugins/0.log"
Apr 16 19:14:24.752821 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:24.752800 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k5m7r_439613fb-5d3f-4d29-b662-f86a49f8e289/bond-cni-plugin/0.log"
Apr 16 19:14:24.788754 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:24.788730 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k5m7r_439613fb-5d3f-4d29-b662-f86a49f8e289/routeoverride-cni/0.log"
Apr 16 19:14:24.809018 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:24.808999 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k5m7r_439613fb-5d3f-4d29-b662-f86a49f8e289/whereabouts-cni-bincopy/0.log"
Apr 16 19:14:24.829236 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:24.829213 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k5m7r_439613fb-5d3f-4d29-b662-f86a49f8e289/whereabouts-cni/0.log"
Apr 16 19:14:25.080540 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:25.080458 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tkxn5_82232800-52de-476a-a364-558f49009263/kube-multus/0.log"
Apr 16 19:14:25.195213 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:25.195177 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-fwqx5_22eec1cf-b2b9-495f-9507-ee4b6c6a9204/network-metrics-daemon/0.log"
Apr 16 19:14:25.216820 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:25.216794 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-fwqx5_22eec1cf-b2b9-495f-9507-ee4b6c6a9204/kube-rbac-proxy/0.log"
Apr 16 19:14:25.937299 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:25.937270 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b6sfk_5e9d6b21-2120-44f2-8a4a-d991547263f2/ovn-controller/0.log"
Apr 16 19:14:25.976517 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:25.976494 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b6sfk_5e9d6b21-2120-44f2-8a4a-d991547263f2/ovn-acl-logging/0.log"
Apr 16 19:14:25.998537 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:25.998518 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b6sfk_5e9d6b21-2120-44f2-8a4a-d991547263f2/kube-rbac-proxy-node/0.log"
Apr 16 19:14:26.018182 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:26.018162 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b6sfk_5e9d6b21-2120-44f2-8a4a-d991547263f2/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 19:14:26.037124 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:26.037103 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b6sfk_5e9d6b21-2120-44f2-8a4a-d991547263f2/northd/0.log"
Apr 16 19:14:26.055630 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:26.055615 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b6sfk_5e9d6b21-2120-44f2-8a4a-d991547263f2/nbdb/0.log"
Apr 16 19:14:26.075887 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:26.075868 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b6sfk_5e9d6b21-2120-44f2-8a4a-d991547263f2/sbdb/0.log"
Apr 16 19:14:26.190118 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:26.190059 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b6sfk_5e9d6b21-2120-44f2-8a4a-d991547263f2/ovnkube-controller/0.log"
Apr 16 19:14:27.758526 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:27.758499 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-r2j4b_5ad84774-79a9-4253-9451-f7e900a7cb4d/network-check-target-container/0.log"
Apr 16
19:14:28.619047 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:28.619006 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-22zxr_1b075c15-1f6a-48c0-af94-65cd41bdc367/iptables-alerter/0.log" Apr 16 19:14:29.291843 ip-10-0-139-96 kubenswrapper[2576]: I0416 19:14:29.291820 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-lsm5b_91a81544-645b-4829-be2c-a425f6f14d64/tuned/0.log"